AI Facial Recognition Systems Work the Worst for Black Women


I got curious about computer science when I was nine years old. I was watching PBS and they were interviewing someone from MIT who had created a social robot named Kismet. It had big ears that moved. It could smile. I didn’t know you could do that with machines. So, from when I was little, I had in my mind that I wanted to be a robotics engineer and I was going to MIT.

Eventually, I did reach MIT, but I went to Georgia Tech for my undergraduate degree. I was working to get a robot to play peekaboo because social interactions show some forms of perceived intelligence. It was then that I learned about coded bias: peekaboo doesn’t work when your robot doesn’t see you.

[A few years later] at MIT, when I was creating [a robot] that would say, “Hello, beautiful,” I was struggling to have it detect my face. I tried drawing a face on my hand, and it detected that. I happened to have a white [Halloween] mask in my office. I put it on [and it detected that]. I was literally coding in whiteface. I did another project where you could “paint walls” with your smile. Same issue: either my face wasn’t detected, or when it was, I was labeled male.
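
The detection failures she describes are easy to probe with off-the-shelf tools. As a rough illustration only (this is not the software from the anecdote), the sketch below uses OpenCV’s stock Haar-cascade frontal-face model to check whether any face is found in a given image; the image path is a placeholder.

```python
# Minimal sketch (not the detector from the anecdote): check whether an
# off-the-shelf OpenCV Haar-cascade model finds any face in an image.
import cv2


def detect_faces(image_path: str):
    """Return bounding boxes for any faces found in the image."""
    # Stock frontal-face model that ships with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(f"Could not read {image_path}")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Returns an empty sequence when no face is detected.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)


if __name__ == "__main__":
    boxes = detect_faces("portrait.jpg")  # placeholder path
    print(f"Faces detected: {len(boxes)}")
```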

Cathy O’Neil’s book Weapons of Math Destruction talks about the ways technology can work differently on different groups of people or make certain social conditions worse, which made me feel less alone. [At the time], I was a resident tutor at Harvard and the dining hall workers were on strike. I heard people protesting and wondered, Do I just follow this comfortable path that I’m on or might I take the risk and fight for algorithmic justice?

I changed my research focus and started testing different systems that analyze faces. That became my MIT master’s work, [a project] called Gender Shades. I collected a data set of members of parliament from three African countries and three European countries, and found that the AI systems worked better overall on lighter-skinned faces. Across the board, they worked the worst on people most like me: darker-skinned women.
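
The methodological point behind Gender Shades is disaggregated evaluation: rather than reporting a single overall accuracy number, performance is broken out by intersectional subgroup, such as skin type crossed with gender. The sketch below illustrates that idea only; the data and column names are made up for the example and are not the actual benchmark.

```python
# Sketch of a disaggregated (intersectional) accuracy report, in the spirit of
# a Gender Shades-style audit. Data and column names are illustrative only.
import pandas as pd

# Each row: one benchmark image with its labels and the system's prediction.
results = pd.DataFrame(
    {
        "skin_type": ["lighter", "lighter", "darker", "darker", "darker", "lighter"],
        "gender": ["male", "female", "male", "female", "female", "male"],
        "predicted_gender": ["male", "female", "male", "male", "female", "male"],
    }
)
results["correct"] = results["gender"] == results["predicted_gender"]

# A single overall accuracy figure hides subgroup gaps...
print(f"Overall accuracy: {results['correct'].mean():.0%}")

# ...so report accuracy per intersectional subgroup instead.
by_group = results.groupby(["skin_type", "gender"])["correct"].mean()
print(by_group.sort_values())
```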

Dr. Buolamwini is the face of Olay’s Decode the Bias campaign. (Photography by Naima Green)
