On the very day that European lawmakers intensified their calls for stricter oversight of platforms like Meta and X, the European Commission (EC) denied accusations of censorship made by Meta CEO Mark Zuckerberg. Despite this denial, critics point out that the EU's Digital Services Act (DSA) and past demands to remove content reveal a pattern of censorship embedded in its approach to regulating digital platforms.
Zuckerberg's remarks, which accused Europe of institutionalizing censorship through its regulatory framework, prompted a firm rebuttal from the EC. "We absolutely refute any claims of censorship," stated a Commission spokesperson. The DSA, they argued, does not compel platforms to remove lawful content but focuses on illegal material or content deemed harmful, such as that impacting children or democratic processes.
However, this defense should be met with skepticism. The DSA, passed as a landmark piece of legislation, has been criticized for its potential to stifle free expression under the guise of ensuring safety and security. Zuckerberg expressed concern about Europe's increasingly restrictive digital environment, stating, "Europe has an ever-increasing number of laws institutionalizing censorship and making it difficult to build anything innovative there."
Zuckerberg isn't wrong. The clash between Meta and the EC coincides with Meta's decision to overhaul its content moderation policies in the United States. Zuckerberg announced that the company would abandon its US-based fact-checking programs on platforms like Facebook, Instagram, and Threads, replacing them with a "community notes" system modeled after the approach used by X. This system allows users to add publicly visible notes to posts they consider misleading, provided those notes are deemed helpful by a diverse group of contributors.
When asked about the potential use of similar systems in Europe, the Commission noted that such measures would require risk assessments submitted to the EU executive. While emphasizing flexibility in content moderation approaches, the EC stressed that any system must be effective. "Whatever model a platform chooses needs to be effective, and this is what we're looking at… So we are checking the effectiveness of the measures or content moderation policies adopted and implemented by platforms here in the EU," said an EC spokesperson.
Critics argue that the EU's insistence on evaluating "effectiveness" opens the door for indirect censorship by incentivizing platforms to over-moderate content to avoid penalties. While the EC maintains that it does not dictate specific content moderation practices, its regulatory framework exerts significant pressure on platforms to align with EU standards.
Amid this dispute, European users will, according to the Commission, continue to be subject to content "oversight" conducted by US-based "fact-checkers."