
"Overview"

"Overview"

Algorithmic Harm

In 2020, the financial advisory firm Mackinac Partners sent CCTV footage of a theft, recorded by one of its clients, Shinola, to the Michigan State Police. The Michigan State Police ran the footage through a facial recognition algorithm built by DataWorks Plus, which in turn had worked with Rank One Computing and NEC Corporation on various components of the algorithm. The algorithm returned several potential matches for the face of the person shown committing the theft in Shinola's CCTV footage, and the Michigan State Police sent these potential matches to the Detroit Police Department to investigate further.


The Detroit Police Department, without apparent further investigation, arrested one of these matches, Robert Julian-Borchak Williams. After several hours of detention, during which Williams pointed out that the face in the CCTV footage was not his, the Detroit Police Department realized that the algorithm's proposed match was incorrect.

Williams was detained for 30 hours for a crime he did not commit, prompting The New York Times headline "Wrongfully Accused by an Algorithm" (Hill, 2020).

This event was likely the first known incident in the US of an incorrect match by a facial recognition algorithm leading to wrongful detention.

Scholarship (Buolamwini & Gebru, 2018) and a US government study (Boutin, 2019), both published before Williams' detention, had already established that the majority of facial recognition algorithms evaluated are far more likely to misidentify dark-skinned faces. Williams is darker skinned.

 

Despite this information, Michigan State Police continues to use the same facial recognition algorithm.

 

In February 2023, Detroit resident Porcha Woodruff, who is also darker skinned, was wrongfully detained for a crime she did not commit due to a false positive match suggested by the Michigan State Police's facial recognition algorithm (Democracy Now!, 2023).


The decisions about algorithm design, implementation, and application that led to Williams' and Woodruff's wrongful detentions are algorithmic harms.


References

Boutin, C. (2019). NIST study evaluates effects of race, age, sex on face recognition software. NIST. https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research, 81, 1–15. http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

Democracy Now! (2023, August 9). Meet Porcha Woodruff, Detroit Woman Jailed While 8 Months Pregnant After False AI Facial Recognition. Democracy Now! https://www.democracynow.org/2023/8/9/porcha_woodruff_false_facial_recognition_arrest

Hill, K. (2020, June 24). Wrongfully Accused by an Algorithm. The New York Times. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html
