Meta’s Nick Clegg Admits Excessive Censorship and High Error Rates in Content Moderation

The tech giant acknowledges high censorship error rates.
Meta’s President of Global Affairs Nick Clegg has admitted that the tech giant “still has too high” content moderation error rates.

This is another way of conceding that censorship is alive and well on Meta’s massive platforms, Facebook and Instagram, as well as Threads.

That’s despite there being something of a shift in the way this issue is treated by Meta, including by CEO Mark Zuckerberg.

Now, in a blog post dedicated to 2024 “global elections,” Clegg touches on the free expression allowed on these platforms, stating that Meta’s choice is to find a “balance” between free speech and “keeping people safe.”

It’s unclear how Meta “keeps people safe,” but free speech is a straightforward concept, and here Clegg offers something of a “mea culpa” by publicly accepting that there are high rates of error, which he says “gets in the way” of free expression.

On top of that, he explains how it gets in the way: “Too often harmless content gets taken down or restricted and too many people get penalized unfairly.”

Any of the many users who fell victim to Meta’s ever-escalating censorship over nearly the past decade could have told him that. Still, speaking about the problem in such clear terms is a marked shift, at least in approach.

How that will translate into action remains to be seen. Clegg is much less direct there, saying that a number of policies have been changed to loosen previously tight restrictions on recommendations and to allow some content that would previously have been banned, and that this work “will continue.”

We learn that Meta allows users “to ask questions or raise concerns about election processes in organic content” – but, it does not allow “claims or speculation about election-related corruption, irregularities, or bias when combined with a signal that content is threatening violence.”

The mention of “a signal” is key here, or rather, how Meta interprets its “signals” and how that could be used as an excuse to block content.

Beyond elections, some of the most egregious, mass-scale censorship seen on Facebook and other Meta platforms concerned COVID-19. To Clegg’s mind, Meta “overdid it a bit.” Free speech supporters would have no problem replying: “No, you actually overdid it a lot.”

“No one during the pandemic knew how the pandemic was going to unfold, so this really is wisdom in hindsight,” Clegg told journalists earlier this week and added:

“But with that hindsight, we feel that we overdid it a bit. We’re acutely aware because users quite rightly raised their voice and complained that we sometimes over-enforce and we make mistakes and we remove or restrict innocuous or innocent content.”
