
UK: Ofcom Appoints “Experts” to Advisory Committee on “Disinformation” Under New Censorship Law

A committee of speech-policing veterans now holds the keys to defining “harm” under one of the UK’s most sweeping internet laws.

Ofcom, the UK regulator tasked with enforcing the sweeping online censorship and age verification law known as the Online Safety Act, has appointed the members of its “Online Information Advisory Committee” (formerly the “Advisory Committee on Disinformation and Misinformation”), which will advise the regulator on “misinformation” and “disinformation.”

Lord Richard Allan, appointed last November as a non-executive director of Ofcom’s Board for a four-year term, now chairs the Committee, which is made up of five members – most of whom have prominent track records as pro-censorship advocates.

One is Jeffrey Howard, a political philosophy professor at University College London (UCL), whose website’s research page lists an upcoming article titled “The Ethics of Social Media: Why Content Moderation is a Moral Duty.”

Howard says the article defends platforms’ “moral responsibility” to “proactively” moderate “wrongfully harmful or dangerous speech” – in other words, a justification for platforms to censor out of a sense of “moral duty.”

Elisabeth Costa, Chief of Innovation and Partnerships at the Behavioural Insights Team (BIT, which started out as the “Nudge Unit”), is another Committee member.

Costa should feel right at home helping enforce the Online Safety Act, given that BIT has close ties to many governments and international organizations that push censorship techniques such as “prebunking.”

As far back as 2019, Costa was co-authoring papers on “the role behavioral science can play – especially for more emergent and uncertain harms like attention capture and erosion of civility.”

Then there are Will Moy, founder and former CEO of Full Fact (a Meta “fact-checker” in the UK), and Mark Scott, a former Politico and New York Times reporter. Before the Online Safety Act was passed, both complained that it needed to be harsher and go further.

Moy wanted real-time monitoring of “misinformation” included in the new rules. Full Fact and the other fact-checkers in Meta’s fact-checking program can have a significant impact on post distribution on Meta’s platforms, with fact-checked posts getting 95% fewer clicks and 38-47% fewer share completions. Full Fact also coordinated a separate initiative, bringing together the UK government, Facebook, Google, and Twitter, that targeted content expressing vaccine skepticism.

Posting in late 2022, Scott criticized what was then the Online Safety Bill (OSB) for failing to provide “access for outsiders to hold platforms to account (that doesn’t exist in the OSB)” and for leaving out “risk assessment and audits to check compliance, international regulatory overlap with the EU’s separate rules and those in Australia & Canada.”

The fifth member of the Committee is Devika Shanker-Grandpierre, who also sits on the panel of the EU’s Knowledge Hub on Prevention of Radicalisation.

Ofcom announced the three-year appointments on April 28, stating that the Committee exists to provide the advice described in section 152(4) of the Online Safety Act.
