Facebook video recommendations disabled after labelling black men "Primates"

PHOTO: Facebook suggests "Primates" on a video featuring black men. (via Twitter/Darci Groves)

Facebook disabled its topic recommendation feature on videos today after its AI software appeared to mistake black men in a video for monkeys. The video in question, from the British tabloid the Daily Mail, was titled "White man calls cops on black men at marina." But beneath the video, where auto-generated suggestions encourage Facebook users to continue watching, a prompt appeared: "Keep seeing videos about Primates?"

The video was posted in June 2020, and it is unclear whether the prompt was added recently or was only now noticed and publicised; the New York Times reported that users had seen it in recent days.


Former Facebook content design manager Darci Groves spotted the "Primates" prompt and took a screenshot of the offensive suggestion. She tweeted the message in the hope of getting the attention of her former Facebook coworkers, calling it unacceptable.

A Facebook spokesperson quickly released a statement, likewise calling the "Primates" label a "clearly unacceptable error," and said the company had disabled the feature and would analyse the software that generates video category recommendations.


“We apologize to anyone who may have seen these offensive recommendations. We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again.”

While humans are technically a species in the genus Homo, which falls under the order Primates alongside monkeys, apes, tarsiers and lemurs, labelling a black person a monkey or primate has been used for centuries as a racist and inflammatory insult, hence the uproar over Facebook's error.

Facebook uses artificial intelligence and facial recognition software to automatically analyse and categorise videos uploaded to its platform. Civil rights activists have long complained about the implicit bias and inaccuracy of facial recognition software when identifying people who are not Caucasian.


Facebook has previously run into trouble with its automated software in Thailand, when its translation service mistranslated the word "salute" as "slut." Because the post in question was a story about the Royal Family, the error caused such an uproar that Facebook has kept Thai translation completely disabled since June.

SOURCE: Bangkok Post


Neill Fronde

Neill is a journalist from the United States with more than 10 years of broadcasting experience and work in national news and magazine publications. He graduated with a degree in journalism and communications from the University of California and has been living in Thailand since 2014.
