The EU’s hunger for imposing control, whether the target is social media, media in general, or search engines, seems truly insatiable – yet the bloc remains awkward in finding a way to let its “censorship flower” fully blossom.
That’s because the EU is, generally speaking, little more than layer upon layer of mostly unappealing, often plainly unpalatable bureaucratic “cake.”
Case in point: the European Digital Media Observatory (EDMO) is an EU project to back “independent communit[ies] working to combat disinformation.”
It’s fact-checkers coming together for the supposedly greater social good, selfless academics in pursuit of truth, etc. – and lecturing the “unwashed masses” on “media literacy.”
Now, one of the apparent “nodes” in this EDMO “community” is something called the European Fact-Checking Standards Network (EFCSN).
Is the EFCSN, then, a bona fide “voice of European fact-checkers who uphold and promote the highest standards of fact-checking and media literacy” – as it claims – or just another way for the EU to obfuscate its policies, including those bent on narrative control and open censorship?
Let the footnote on the EFCSN site speak for itself: it is “supported by the European Union under the 2020 work program on the financing of Pilot Projects and Preparatory Actions in the field of ‘Communications Networks, Content and Technology’.”
There’s nothing like independence bankrolled by somebody else, is there? So let’s see what EU money can buy here: “50 independent fact-checking organizations in over 30 European countries.”
EFCSN asserts that Big – and smaller – Tech are “not doing enough.” Far from it, in fact – FAR from “fulfilling their promises (…) and do not have effective risk mitigation measures against disinformation in place, as DSA requires.”
The DSA mentioned here is the EU’s Digital Services Act, which came into force last September, covering “Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs)” – i.e., Google, Meta, TikTok, Microsoft, X, Telegram.
DSA requires VLOPs and VLOSEs to enforce “reasonable, proportionate and effective risk-mitigation measures.” What does something “defined” so broadly even mean? Apparently, most of these companies have not (yet) figured it out, either.
But “disinformation” is said to be “one of those risks that have actual or foreseeable negative effects on democratic processes, civic discourse and electoral processes.”
Finally, we’re talking business. Elections.
In a graphic covering compliance, EFCSN shows that what some may call “gun-shy-since-2016” Meta (Facebook and Instagram) is doing “great” – concerning “agreements and fact-checking coverage, integration and use of fact checking, access to information for fact-checkers.”
Google Search also does very well here – but curiously, allegedly, not YouTube. (Probably just another way to put pressure on arguably the most influential platform of them all.)
And then there’s your non-compliant “troublemakers” – X and Telegram.
At present, and according to the report – they don’t seem to care about EFCSN “standards” at all. Not the least bit.