
Meta Oversight Board says Facebook’s image auto-censorship tech is broken, harming free speech

The technology is automatically censoring speech.

Meta’s Oversight Board said the company should improve its automated image takedown system after it mistakenly removed posts featuring a cartoon depicting police violence in Colombia.

The image was added to Meta’s Media Matching Service database. Once an image is in the database, any new post containing it is automatically taken down.
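
Meta has not published the internals of this system, though it has open-sourced perceptual hashing algorithms (such as PDQ) built for this kind of matching. The sketch below is purely illustrative of the mechanism rather than Meta’s implementation: a plain SHA-256 digest stands in for the perceptual fingerprint, the “bank” is an in-memory set, and the class and method names are invented for this example.

```python
# Illustrative sketch only: real matching banks use perceptual hashes that
# tolerate re-encoding and resizing; here an exact SHA-256 digest stands in.
import hashlib


class MatchingBank:
    def __init__(self):
        self._banked_hashes = set()  # fingerprints of banned images

    def _fingerprint(self, image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash; matches exact bytes only.
        return hashlib.sha256(image_bytes).hexdigest()

    def ban(self, image_bytes: bytes) -> None:
        # A single reviewer decision adds the image to the bank...
        self._banked_hashes.add(self._fingerprint(image_bytes))

    def unban(self, image_bytes: bytes) -> None:
        # ...and, as the board argues, successful appeals should remove it.
        self._banked_hashes.discard(self._fingerprint(image_bytes))

    def should_remove(self, image_bytes: bytes) -> bool:
        # Every new upload is checked against the bank automatically.
        return self._fingerprint(image_bytes) in self._banked_hashes


bank = MatchingBank()
bank.ban(b"...cartoon image bytes...")
print(bank.should_remove(b"...cartoon image bytes..."))  # True: auto-removed
```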

According to the Oversight Board, 215 people appealed the removal of the image, and 98% of them were successful. Despite the successful appeals, Meta did not remove the image from the database.

“By using automated systems to remove content, Media Matching Service banks can amplify the impact of incorrect decisions by individual human reviewers,” the board said. It added that the successful appeals should have triggered a review of the image and its removal from the database, since a single bad decision can leave an image prohibited indefinitely, even when human reviewers later reach different conclusions.

The board recommended more methods of oversight.

“The board is particularly concerned that Meta does not measure the accuracy of Media Matching Service banks for specific content policies,” it notes. “Without this data, which is crucial for improving how these banks work, the company cannot tell whether this technology works more effectively for some community standards than others.”

The Oversight Board also asked Meta to publish error rates for content mistakenly included in the Media Matching Service database.
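
As an illustration of the kind of per-policy measurement the board is asking for, the hypothetical sketch below computes, for each community standard, the share of automated bank removals that were later overturned on appeal. The field names and sample data are invented; the sample simply echoes the 215-appeal, 98%-overturned figure from the Colombia case.

```python
# Hypothetical sketch of per-policy error-rate measurement: for each community
# standard, what share of automated bank removals were overturned on appeal?
from collections import defaultdict


def error_rates_by_policy(removals):
    """removals: iterable of dicts with 'policy' and 'overturned_on_appeal' keys."""
    totals = defaultdict(int)
    overturned = defaultdict(int)
    for r in removals:
        totals[r["policy"]] += 1
        if r["overturned_on_appeal"]:
            overturned[r["policy"]] += 1
    return {policy: overturned[policy] / totals[policy] for policy in totals}


# Invented sample roughly matching the Colombia case: 215 appeals, ~98% upheld.
sample = (
    [{"policy": "violent_graphic_content", "overturned_on_appeal": True}] * 211
    + [{"policy": "violent_graphic_content", "overturned_on_appeal": False}] * 4
)
print(error_rates_by_policy(sample))  # ~0.98 error rate for this policy
```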

The board also considered a case involving Facebook’s policy on extremism, in which the company appeared to fail to distinguish between reporting on extremist groups and supporting them.

The board concluded that Meta made a mistake when it took down a post in Urdu reporting on the Taliban reopening schools for girls and women. Facebook removed the post for violating its rules banning “praise” of extremist groups like the Taliban. The removal was appealed, but the appeal was never reviewed because Facebook has fewer than 50 reviewers who speak Urdu.

The board said the case “may indicate a wider problem” with the rules on dangerous organizations: the rules are unclear to both moderators and users, and the penalties are likewise “unclear” and severe. It recommended that Meta clarify the rules and its definition of “praising” dangerous individuals, and add more reviewers.

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
