On the very day that European lawmakers intensified their calls for stricter oversight of platforms like Meta and X, the European Commission (EC) denied accusations of censorship made by Meta CEO Mark Zuckerberg. Despite this denial, critics point out that the EU’s Digital Services Act (DSA) and past demands to remove content reveal a pattern of censorship embedded in its approach to regulating digital platforms.
Zuckerberg’s remarks, which accused Europe of institutionalizing censorship through its regulatory framework, prompted a firm rebuttal from the EC. “We absolutely refute any claims of censorship,” stated a Commission spokesperson. The DSA, they argued, does not compel platforms to remove lawful content but focuses on illegal material or content deemed harmful, such as that impacting children or democratic processes.
However, this defense should be met with skepticism. The DSA, passed as a landmark piece of legislation, has been criticized for its potential to stifle free expression under the guise of ensuring safety and security. Zuckerberg expressed concern about Europe’s increasingly restrictive digital environment, stating, “Europe has an ever-increasing number of laws institutionalizing censorship and making it difficult to build anything innovative there.”
Zuckerberg isn’t wrong. The clash between Meta and the EC coincides with Meta’s decision to overhaul its content moderation policies in the United States. Zuckerberg announced that the company would abandon its US-based fact-checking programs on platforms like Facebook, Instagram, and Threads, replacing them with a “community notes” system modeled after the approach used by X. This system allows users to add publicly visible notes to posts they consider misleading, provided those notes are deemed helpful by a diverse group of contributors.
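To make the mechanism concrete, here is a minimal, hypothetical sketch of that visibility rule: a note is shown publicly only once it collects enough “helpful” ratings, and only if those ratings come from more than one viewpoint cluster. All names and thresholds below are invented for illustration; X’s production system relies on a more sophisticated bridging algorithm rather than this simple count.

    from collections import defaultdict

    # Assumed, illustrative thresholds -- not real parameters of X's system.
    MIN_HELPFUL_RATINGS = 5    # minimum volume of "helpful" ratings
    MIN_DISTINCT_CLUSTERS = 2  # proxy for a "diverse group" of contributors

    def note_is_public(ratings):
        """ratings: list of (contributor_cluster, is_helpful) tuples."""
        helpful_by_cluster = defaultdict(int)
        for cluster, is_helpful in ratings:
            if is_helpful:
                helpful_by_cluster[cluster] += 1
        total_helpful = sum(helpful_by_cluster.values())
        # Require both enough ratings and cross-cluster agreement
        # before the note becomes publicly visible.
        return (total_helpful >= MIN_HELPFUL_RATINGS
                and len(helpful_by_cluster) >= MIN_DISTINCT_CLUSTERS)

    # Example: helpful ratings from two distinct clusters clear both bars.
    sample = ([("cluster_a", True)] * 3
              + [("cluster_b", True)] * 2
              + [("cluster_a", False)])
    print(note_is_public(sample))  # True

The key design point, under these assumptions, is that raw popularity is not enough: a note endorsed only by one side of a divide stays hidden, which is what distinguishes this model from simple up-voting.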
When asked about the potential use of similar systems in Europe, the Commission noted that such measures would require risk assessments to be submitted to the EU executive. While emphasizing flexibility in content moderation approaches, the EC stressed that any system must be effective. “Whatever model a platform chooses needs to be effective, and this is what we’re looking at… So we are checking the effectiveness of the measures or content moderation policies adopted and implemented by platforms here in the EU,” said an EC spokesperson.
Critics argue that the EU’s insistence on evaluating “effectiveness” opens the door to indirect censorship by incentivizing platforms to over-moderate content to avoid penalties. While the EC maintains that it does not dictate specific content moderation practices, its regulatory framework exerts significant pressure on platforms to align with EU standards.
Amid this dispute, European users are set to remain subject to content “oversight” by third-party “fact-checkers,” since Meta’s policy change applies only to the US, according to the Commission.