
EU Tightens Social Media Censorship Screw With Upcoming Mandatory “Disinformation” Rules

With the Digital Services Act in play, compliance could mean steep costs for platforms to fund and align with EU-endorsed "fact-checking" initiatives.


What started out as the EU’s “voluntary code of practice” concerning “disinformation” – affecting tech/social media companies – is now set to turn into a mandatory code of conduct for the most influential and widely-used ones.

The news was revealed by Paul Gordon, a digital services official at the Irish media regulator, who spoke to journalists in Brussels. The European Commission has yet to confirm that January will be the date when the current code is “formalized” in this way.

The legislation enabling the “transition” is the controversial Digital Services Act (DSA), which critics often refer to as the “EU online censorship law” and whose enforcement began in February of this year.

The “voluntary” code is currently signed by 44 tech companies, and should it become mandatory in January 2025, it will apply to those the EU defines as Very Large Online Platforms (VLOPs): platforms with at least 45 million monthly active users in the 27-nation bloc.

Currently, the number of such platforms is said to be 25.

In its present form, the DSA’s provisions obligate online platforms to carry out “disinformation”-related risk assessments and reveal what measures they are taking to mitigate any risks revealed by these assessments.

But when the code switches from “voluntary” to mandatory, the obligations will expand to include further requirements: demonetizing the dissemination of “disinformation”; “effective cooperation” between platforms, civil society groups, and fact-checkers during elections, once again to address “disinformation”; and “empowering” fact-checkers.

This refers not only to spreading “fact-checking” across the EU member countries but also to making VLOPs finance these groups. This is despite the fact that many of the most prominent “fact-checkers” have been consistently accused of fostering censorship rather than checking content for accuracy in an unbiased manner.

The code was first introduced, in its “voluntary” form, in 2022, with Google, Meta, and TikTok among the prominent signatories. The rules originate from a “strengthened” EU Code of Practice on Disinformation based on the Commission’s Guidance issued in May 2021.

“It is for the signatories to decide which commitments they sign up to and it is their responsibility to ensure the effectiveness of their commitments’ implementation,” the EU said at the time. That was the “voluntary” element; the Commission also noted then that it had not “endorsed” the code.

It appears the EC is now about to “endorse” the code, and then some: there are active preparations to make it mandatory.
