The Pitfalls of Adversarial Clothing

When I present on panels about equitable and inclusive design, there are two areas I emphasize; as both a social scientist and tech ethicist, these are the areas where we, as humans, have the greatest opportunity to bring about transformative change.

The first and most fundamental tool we have within our arsenal is the call-in. The call-in is the seed from which the best accessible, equitable, and inclusive products and processes take shape. Who am I designing this for? Who am I designing it with? If they are not one and the same, we must go back and begin again. This work is slow, iterative, and intentional, but necessary for radical change. 

The second is more of a guiding principle: “Design for one, extend to many.” This principle, from my friend and mentor Kat Holmes, is another essential tool in designing accessibly and equitably. When we design anything with the most marginalized and underrepresented (deliberately or not) among us at the forefront of our design process, we are in essence extending that ease and accessibility to the masses.

As more and more citizens increase their literacy and awareness around personally identifying data and biometric data collection, it stands to reason that we are all looking for ways to protect ourselves. In the way that capitalism is often both the cause and cure, designers are stepping up to fulfill that need, but at what cost?

I’ve written about hyper-reliance on facial recognition technology and the added layer of harm it causes for the oft-misidentified BIPOC and LGBTQ+ communities, so when adversarial clothing brands started to emerge, my interest was piqued, and as it turns out, I wasn’t alone.

I recently sat with a small group to discuss the rapid growth of facial recognition technologies and, more specifically, the harms inevitably levied upon us by the designers, the consumers, and even so-called adversaries of these technologies. With a focus on accessibility, equity, and harm mitigation, what remains true is that the most marginalized, the most endangered among us are still not being called in and/or considered.

Cap_able boasts the following: “Cap_able brings a wholly new and deeper attitude to the fashion sector, through a human-centric design approach.”

https://www.capable.design

Yet their products, while designed for privacy and protection, actually pose grave danger for a large segment of the human population because, as I can only assume, the products were not designed with those most at risk in mind.

As a Black woman, I have to consider the ramifications of donning adversarial clothing, makeup, or other items designed to obfuscate facial recognition technologies. Would law enforcement perceive my decision to opt out of having my personally identifying information collected as an act of antagonism, non-compliance, or criminality?

BIPOC, LGBTQ+, and other marginalized persons are unduly burdened with the precarious choice of protecting their biometric data or, in the worst-case scenario, saving their lives. This is not a human-centric approach; instead, it is wholly rooted in upholding the harmful status quo of white supremacist, heteronormative, patriarchal systemic oppression.

Dr. Dede Testubayashi’s (Deh-deh Teh-tsu-bye-ya-she) expertise is DEI + product + business value and integrating them into a team’s and organization’s best practices. She has extensive experience building frameworks and guidelines to integrate product inclusion into the development process, and driving adoption as an integral portion of phased and prioritized roadmaps for teams to execute against. Dede is a member of the Equity Army, run by Annie Jean-Baptiste, a group focused on educating organizations on product inclusion. She’s also a founding member of Tech Ladies, a group focused on inclusivity in tech, and is working on two new publications: a memoir and a product inclusion guide.
