When I was a master’s student at MIT, I worked on a number of different art projects that used facial analysis technology. I soon realized that the software I was using had a hard time detecting my face. But after I made one adjustment, the software no longer struggled. That adjustment? I put on a white mask.
Why does this matter?
The social implications of this are rooted in the longstanding marginalization of Black women, a disregard for not only our gender identity, but our very humanity. And the civic implications can be life-threatening.
For example: When it comes to interactions with law enforcement, flawed facial analysis systems pose serious risks for people of color. Innocent people can be misidentified as criminals. This is not hypothetical. The Face-Off report, released by Big Brother Watch UK, highlights false positive match rates of over 90 percent for facial recognition technology deployed by the Metropolitan Police. And as recently as this summer, the ACLU found that Amazon's facial recognition software erroneously matched 28 members of Congress to mugshots of people who had been arrested.
Read the full story at Blavity.