
Amazon Face-Detection Tech Reads Darker-Skinned Women As Men

There have been warnings and doubts about just how accurate facial-detection technology can be. A new study of facial-detection technology created by Amazon shows that it is not so accurate when it comes to recognizing people, especially women with darker skin. It even reads some darker-skinned women as men.



Amazon’s Rekognition service, which is being marketed to law enforcement, often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto. The study points out that the service can be abused and that, because of its error rate, it poses threats to privacy and civil liberties.

Even before the new study, civil rights advocates were calling on Amazon to stop marketing its Rekognition service because of the possibility of discrimination against minorities, particularly when used by law enforcement.

Amazon Face-Detection Tech (Photo by 3Motional Studio from Pexels)

“The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none,” ABC News reported.
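For readers curious how such per-group numbers are tallied, the sketch below shows the general shape of an audit like this one: run the classifier over a benchmark of labeled photos, bucket each prediction by skin type and gender, and compare misclassification rates across the buckets. This is a minimal illustration in Python, not the researchers’ actual code, and the records shown are hypothetical.

```python
from collections import defaultdict

# Hypothetical audit records: each pairs a demographic group with the
# service's predicted gender and the ground-truth label. In the real
# study the benchmark contained many images per group.
results = [
    {"group": "darker_female",  "predicted": "Male",   "actual": "Female"},
    {"group": "darker_female",  "predicted": "Female", "actual": "Female"},
    {"group": "lighter_female", "predicted": "Female", "actual": "Female"},
    {"group": "darker_male",    "predicted": "Male",   "actual": "Male"},
    {"group": "lighter_male",   "predicted": "Male",   "actual": "Male"},
]

errors = defaultdict(int)
totals = defaultdict(int)
for record in results:
    totals[record["group"]] += 1
    if record["predicted"] != record["actual"]:
        errors[record["group"]] += 1

# Per-group error rates are what reveal the kind of disparity the study reports.
for group, total in totals.items():
    rate = 100.0 * errors[group] / total
    print(f"{group}: {rate:.1f}% misclassified ({errors[group]}/{total})")
```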

But Amazon said it is the new study that is faulty. Matt Wood, general manager of artificial intelligence at Amazon’s cloud-computing unit, said in a statement that the study looked at “facial analysis” and not “facial recognition” technology, adding that facial analysis “can spot faces in videos or images and assign generic attributes such as wearing glasses; recognition is a different technique by which an individual face is matched to faces in videos and images.”

Wood said: “It’s not possible to draw a conclusion on the accuracy of facial recognition for any use case – including law enforcement – based on results obtained using facial analysis. The results in the paper also do not use the latest version of Rekognition and do not represent how a customer would use the service today.”
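Wood’s distinction maps onto two different kinds of API calls. As a rough sketch of the difference (the file paths are placeholders, credentials and error handling are omitted, and this is an illustration rather than the study’s methodology), Rekognition’s DetectFaces operation performs facial analysis and returns generic attributes such as a gender estimate, while CompareFaces performs recognition by matching one face against another:

```python
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:  # placeholder image path
    image_bytes = f.read()

# "Facial analysis": DetectFaces assigns generic attributes (glasses,
# age range, and the gender estimate the study audited) to each face found.
analysis = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in analysis["FaceDetails"]:
    gender = face["Gender"]
    print(f"Predicted gender: {gender['Value']} ({gender['Confidence']:.1f}%)")

# "Facial recognition": CompareFaces matches a specific face in a source
# image against the faces in a target image, a separate operation.
with open("suspect.jpg", "rb") as f:  # placeholder image path
    suspect_bytes = f.read()

comparison = client.compare_faces(
    SourceImage={"Bytes": suspect_bytes},
    TargetImage={"Bytes": image_bytes},
    SimilarityThreshold=90,  # only report matches above 90 percent similarity
)
for match in comparison["FaceMatches"]:
    print(f"Match with {match['Similarity']:.1f}% similarity")
```

The study’s gender findings concern the attribute returned by the first kind of call; Wood’s point is that, in Amazon’s view, those results say nothing about the accuracy of the matching operation.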

Amazon noted that it recreated the test used in the study with an updated version of its software and found no misidentifications.

Jacob Snow, an attorney with the American Civil Liberties Union, isn’t buying Amazon’s explanation. Snow told The Verge that Amazon’s response shows that the company isn’t taking the “really grave concerns revealed by this study seriously.”

MIT Media Lab researcher Joy Buolamwini and Inioluwa Deborah Raji of the University of Toronto explained they decided to study Amazon’s technology because the company has marketed it to law enforcement. (Raji is currently a research mentee for artificial intelligence at Google.)

Other companies besides Amazon, such as IBM, have had trouble with their facial recognition technology, and mistakes made by law enforcement using such flawed tools to apprehend and arrest people can have troubling consequences.

“Facial recognition systems have long struggled with higher error rates for women and people of color — error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasn’t shared any data on the issue, if it’s collected data at all. At the same time, it’s already deploying its software in cities across the U.S., its growth driven by one of the largest cloud infrastructures in the world. For anyone worried about algorithmic bias, that’s a scary thought,” The Verge reported.

What some say is most worrisome are false identifications. Typically, when police use facial recognition to search a database for specific suspects, rather than comparing one suspect photo directly against another, white subjects are less likely to generate false matches than Black subjects.
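The database search described here is a one-to-many lookup against a gallery of enrolled faces, which Rekognition’s API exposes as SearchFacesByImage. The sketch below is illustrative only (the collection name, threshold, and file path are hypothetical); the match threshold is where false identifications enter, since lowering it surfaces more, and potentially spurious, candidate matches:

```python
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("probe.jpg", "rb") as f:  # placeholder probe photo
    probe_bytes = f.read()

# One-to-many search: the probe face is compared against every face
# enrolled in a collection (the collection ID here is hypothetical).
response = client.search_faces_by_image(
    CollectionId="example-face-collection",
    Image={"Bytes": probe_bytes},
    FaceMatchThreshold=80,  # lower thresholds return more, possibly false, matches
    MaxFaces=5,
)
for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Candidate {face['FaceId']}: {match['Similarity']:.1f}% similarity")
```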

Even freshman Congresswoman Alexandria Ocasio-Cortez weighed in on the issue and warned about bias in Amazon’s facial detection technology.

She tweeted about the new study: “Algorithms are still made by human beings. And those algorithms are still pegged to basic human assumptions.”