Meta Oversight Board says Facebook’s image auto-censorship tech is broken, harming free speech

The technology is automatically censoring speech.

Meta’s Oversight Board has said the company should improve its automated image takedown system after it mistakenly removed posts featuring a cartoon depicting police violence in Colombia.

The image was added to Meta’s Media Matching Service database. Once an image is added to the database, any post containing it is automatically taken down.

According to the Oversight Board, 215 people appealed the removal of the image, and 98% of them were successful. Despite the successful appeals, Meta did not remove the image from the database.

“By using automated systems to remove content, Media Matching Service banks can amplify the impact of incorrect decisions by individual human reviewers,” the board said. It added that the appeals should have triggered a review of the image and its removal from the database, because one bad decision could result in an image being indefinitely prohibited, despite different conclusions by human reviewers.
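The dynamic the board describes can be illustrated with a simplified sketch. This is not Meta’s actual implementation (real systems use perceptual hashes such as PDQ rather than exact hashes, and all class and method names here are hypothetical); it only shows how one reviewer’s decision to bank an image propagates to every future upload, and how a high appeal-overturn rate could be used to trigger re-review of the bank entry:

```python
import hashlib

class MediaMatchingBank:
    """Hypothetical sketch of a media-matching takedown bank.

    Exact SHA-256 hashing is used for brevity; production systems
    match on perceptual fingerprints that tolerate re-encoding.
    """

    def __init__(self, review_threshold=0.5):
        self.banked = set()     # fingerprints of banned images
        self.appeals = {}       # fingerprint -> (total appeals, overturned)
        self.review_threshold = review_threshold

    @staticmethod
    def fingerprint(image_bytes):
        return hashlib.sha256(image_bytes).hexdigest()

    def ban(self, image_bytes):
        # One human reviewer's decision adds the image to the bank...
        self.banked.add(self.fingerprint(image_bytes))

    def should_remove(self, image_bytes):
        # ...after which every matching upload is removed automatically,
        # amplifying that single decision indefinitely.
        return self.fingerprint(image_bytes) in self.banked

    def record_appeal(self, image_bytes, overturned):
        h = self.fingerprint(image_bytes)
        total, won = self.appeals.get(h, (0, 0))
        self.appeals[h] = (total + 1, won + int(overturned))

    def needs_rereview(self, image_bytes):
        # A high overturn rate on appeal should flag the bank entry
        # for re-review -- the step the board says was missing here.
        h = self.fingerprint(image_bytes)
        total, won = self.appeals.get(h, (0, 0))
        return total > 0 and won / total >= self.review_threshold
```

Feeding the figures from the board’s decision into this sketch (215 appeals, roughly 98% successful) would trip the re-review flag immediately, while the automatic takedowns continue until the entry is actually removed.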
