
Facebook Apologizes For ‘Primates’ Label On Video Of Black Men

Photo: Facebook

Facebook has apologized for artificial intelligence (AI) software that labeled Black men as “primates,” but only after the New York Times reported on the social media giant’s racist gaffe.

Facebook’s AI software asked users watching a video featuring Black men if they wanted to see more “videos about primates.” 

In its Sept. 3 apology, Facebook said its AI mislabeling was “clearly an unacceptable error.” The company is investigating the cause to prevent the behavior from happening again, The Verge reported. “As we have said, while we have made improvements to our AI we know it’s not perfect and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”

Facebook has disabled the topic recommendation feature. The video had been online for more than a year, NPR reported.

On June 27, 2020, The Daily Mail uploaded the video. In it, viewers see an encounter between a white man and a group of Black men who were celebrating a birthday. In the clip, the white man allegedly calls 911 to report that he is “being harassed by a bunch of Black men” before cutting to an unrelated video showing police officers arresting a Black tenant at his own home.

Former Facebook employee Darci Groves recently tweeted about the error, sharing a screenshot of the video that captured Facebook’s prompt, “Keep seeing videos about Primates?”

“This ‘keep seeing’ prompt is unacceptable, @Facebook,” Groves tweeted. “And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious.”

PeaceMaker @georgiaartist tweeted, “As a black man, I will not accept apology from Mark Zuckerberg and his company. What they did was intentional! Facebook, along with Instagram, have a reputation of being so racist towards people of color.”

This is just one example of racial bias in AI tools; facial recognition systems have repeatedly been shown to misidentify Black people. And Facebook isn’t the only tech company whose AI tools have committed racist errors. In 2015, Google apologized after its Photos app tagged photos of Black people as “gorillas,” Wired reported.

In April, the U.S. Federal Trade Commission warned that AI tools that have shown “troubling” racial and gender biases may be in violation of consumer protection laws if they’re used for decision-making in credit, housing, or employment, The Verge reported.

Listen to GHOGH with Jamarlin Martin | Episode 74: Jamarlin returns for a new season of the GHOGH podcast to discuss Bitcoin, bubbles, and Biden. He talks about the risk factors for Bitcoin as an investment asset, including origin risk, speculative market structure, and regulatory and environmental concerns. Are broader financial markets in a massive speculative bubble?
