Facial Recognition and Racial Bias

It is estimated that almost half of American adults – over 117 million people, as of 2016 – have photos within a facial recognition network used by law enforcement. This inclusion occurs without consent, or even awareness, and is enabled by a lack of legislative oversight. More disturbingly, the current implementation of these technologies exhibits significant racial bias, particularly against Black Americans.

– harvard.edu

Within three years, three Black men have had their lives upended by wrongful arrests. Robert Williams, Michael Oliver, and Nijeer Parks were misidentified by facial recognition software, arrested, and held under suspicion of crimes ranging from petty theft to assault of a police officer. For Parks, who was accused of the more serious crimes of assault and eluding the police, the fight to clear his name went on for the better part of a year. Before his case was thrown out of court and his name cleared, Parks spent ten days in jail, all due to hyper-reliance on technology. In the lawsuit later filed against the Woodbridge Police Department, its affiliates, and Idemia, the company behind the facial recognition software, Parks alleged that proper investigative techniques were forgone in favor of faulty technology.

Despite widely published research findings detailing the misidentification of darker-skinned faces by facial recognition technologies, law enforcement's hyper-reliance remains. For BIPOC, and most notably dark-skinned Black women (who are misidentified as often as 33% of the time, far more often than white men), this adds a further layer of concern.
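The core of the research behind these numbers is simple: error rates are disaggregated by demographic group rather than averaged into a single headline accuracy figure, which can hide enormous gaps. Below is a minimal sketch of such an audit, assuming hypothetical labeled evaluation records; the field names and groups are illustrative, not any benchmark's actual schema:

```python
from collections import defaultdict

# Hypothetical evaluation records: each names the demographic subgroup
# and whether the system's top candidate was actually the right person.
records = [
    {"group": "darker-skinned women", "correct": False},
    {"group": "darker-skinned women", "correct": True},
    {"group": "lighter-skinned men", "correct": True},
    # ... many more labeled trials ...
]

def misidentification_rates(trials):
    """Report error rates per subgroup instead of one overall accuracy."""
    totals, errors = defaultdict(int), defaultdict(int)
    for t in trials:
        totals[t["group"]] += 1
        if not t["correct"]:
            errors[t["group"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

for group, rate in misidentification_rates(records).items():
    print(f"{group}: {rate:.1%} misidentified")
```

A single overall accuracy computed from the same data could look excellent while one subgroup bears nearly all of the errors; only the disaggregated view reveals the disparity.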

“Automated systems are not inherently neutral. They reflect the priorities, preferences, and prejudices—the coded gaze—of those who have the power to mold artificial intelligence.”

– Gendershades.org

BIPOC are more highly surveilled by law enforcement agencies, more likely to be arrested, more likely to receive harsher sentences when convicted, and, most daunting of all, more likely to die at the hands of law enforcement officers. Wrongful arrests, even when eventually dismissed, have far-reaching consequences for the personal and professional lives, and the mental health and wellness, of victims and their families.

Technological design should be just, equitable, and relationship-centered; it should be built with, not just for, all users and those impacted by its use. If we are not building for stress use, fringe use, and nefarious use; if we are not intentionally building for the most vulnerable among us and scaling out to close the gaps in literacy, accessibility, and equity; and if we are not actively designing for harm mitigation, then we are making the willful choice to perpetuate harm.
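One concrete instance of designing for harm mitigation is treating a facial recognition match as an investigative lead only, never as grounds for arrest on its own. The sketch below is a hypothetical policy gate, not any vendor's API; every name and threshold in it is an assumption for illustration:

```python
from dataclasses import dataclass

@dataclass
class Match:
    candidate_id: str
    confidence: float             # similarity score from the system, 0.0-1.0
    human_reviewed: bool          # confirmed by a trained examiner?
    corroborating_evidence: bool  # independent evidence beyond the match?

def may_act_on(match: Match, threshold: float = 0.99) -> bool:
    """A face match is a lead, never probable cause by itself."""
    return (
        match.confidence >= threshold
        and match.human_reviewed
        and match.corroborating_evidence
    )

# Even a high-scoring, human-reviewed match fails the gate
# when no independent evidence supports it.
lead = Match("candidate-042", confidence=0.97,
             human_reviewed=True, corroborating_evidence=False)
print(may_act_on(lead))  # False
```

Codifying such a rule makes it impossible, by construction, to repeat the failure alleged in the Parks lawsuit: acting on the match alone.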

Dr. Dede Tetsubayashi's (Deh-deh Teh-tsu-bye-ya-she) expertise is in DEI, product, and business value, and in integrating them into a team's and organization's best practices. She has extensive experience building frameworks and guidelines that integrate product inclusion into the development process, and driving their adoption as an integral part of phased, prioritized roadmaps for teams to execute against. Dede is a member of the Equity Army, run by Annie Jean-Baptiste, a group focused on educating organizations on Product Inclusion. She is also a founding member of Tech Ladies, a group focused on inclusivity in tech, and is working on two new publications: a memoir and a product inclusion guide.
