European Commission Official Touts 17 Investigations as Proof the Digital Services Act “Delivers”

The Commission now employs 127 people to police online speech and is hiring 60 more, all without a single courtroom in the loop.


Stand against censorship and surveillance: join Reclaim The Net.

The European Union’s Digital Services Act is a censorship and surveillance law dressed in the language of safety. It gives unelected officials in Brussels the power to decide what hundreds of millions of people are allowed to say online, and it is building the infrastructure to verify their identities before they’re permitted to say it.

But at POLITICO’s AI & Tech Week summit in Brussels this month, Renate Nikolay, the European Commission’s Deputy Director-General at DG CONNECT, celebrated the law’s growing enforcement record. Seventeen ongoing investigations and one non-compliance decision, she told the audience, prove the DSA “delivers.”

What the DSA delivers is pressure. Pressure on platforms to censor more speech, faster, with fewer questions asked. Pressure to open their algorithms and internal systems to government inspection without a court order. And, increasingly, pressure on individual users to prove who they are before they’re allowed to participate in public discourse online.

Nikolay presented these enforcement numbers as proof of success. They are proof of something but not what she thinks.


The DSA’s censorship powers are extensive and largely unchecked. The law requires platforms with more than 45 million monthly EU users to assess and mitigate “systemic risks,” a category that includes risks to “civic discourse,” “electoral processes,” and “public security.”

The Commission decides what counts as a systemic risk. The Commission decides whether a platform’s response is sufficient. And when the Commission decides it isn’t, the Commission opens an investigation, gathers evidence, issues preliminary findings, and imposes fines of up to 6% of global annual revenue. There is no independent prosecutor. No separation between the body that writes the rules and the body that punishes violations. The Commission is a regulator, investigator, and judge.

The law also empowers the Commission to order “interim measures” while investigations are still underway, forcing platforms to change how they operate before anyone has established that they did anything wrong. It can demand access to platform algorithms, require changes to recommender systems, and order increased monitoring of specific keywords or hashtags.

At the same summit, the Commission’s Martin Harris-Hess declared that “2026 is the year of enforcement.” He explained that “when the DSA came to enforce, we had to build capacity, we had to build experience, we had to build understanding of how the platforms work.”

The Commission now has 127 staff working on DSA enforcement, is hiring 60 more, and has launched investigations covering X, TikTok, Meta’s Facebook and Instagram, AliExpress, Temu, Snapchat, and several pornographic platforms. The building phase is over.

The single completed enforcement action, a €120 million fine against X in December 2025, targeted the platform’s blue checkmark system, its advertising repository, and researcher access to data. X has appealed the fine to the General Court of the European Union, arguing prosecutorial bias and due process violations.

The politically dangerous investigation, the one that goes to the heart of what the DSA actually is, remains open. That probe, launched against X in December 2023, examines the platform’s handling of “illegal content” and “information manipulation.”

Neither term has a fixed legal definition under the DSA. The Commission gets to interpret both. “Information manipulation” could mean a coordinated bot campaign. It could also mean a viral post that the Commission finds politically inconvenient. The law does not distinguish between the two because the people who wrote it did not want it to.

“Systemic risk” is the DSA’s most powerful and most dangerous concept. Platforms must assess risks to “civic discourse” and then mitigate them.

The effect is now global, not just European. The US House Judiciary Committee published reports documenting how the Commission used the DSA and earlier informal pressure campaigns to force platforms into changing their worldwide content moderation rules.

Subpoenaed documents showed TikTok rewriting its global community guidelines specifically to “achieve compliance with the Digital Services Act.”

The new rules censor “marginalizing speech,” “coded statements” that “normalize inequitable treatment,” and “misinformation that undermines public trust.” These categories are so vague that almost any political statement could trigger them. And because TikTok applies its guidelines globally, a censorship regime designed in Brussels now determines what users in São Paulo, Lagos, and Los Angeles can post.

The privacy dimension of the DSA is at least as alarming as the censorship dimension, and the two are converging. The Commission is now pushing age verification requirements under the DSA that would require platforms to collect identity data from users before granting access to certain content.

Nikolay herself and enforcement chief Prabhat Agarwal recently held a press conference explaining plans to use verification systems linked to the EU Digital Identity Wallet, a digital ID that EU countries are expected to implement by the end of 2026. The wallet would let users manage their identity, educational qualifications, driver’s licenses, and other personal attributes from a single app. Five member states are already testing the system.

At the summit, Harris-Hess previewed this trajectory when discussing potential social media bans for minors. He said a ban is “legally” possible but cautioned that “ban is not the right word” because the term is “emotionally laden.” He preferred “age-related restrictions to accessing certain services.”

The Commission has developed an entire vocabulary for softening what its powers actually do. Censorship becomes “content moderation.” Surveillance becomes “verification.” A ban becomes “age-related restrictions.” Government control of speech becomes “platform accountability.” The phrases obscure the same underlying reality: the state is deciding who gets to speak, what they’re allowed to say, and whether they must identify themselves before saying it.

Flemish Minister for Brussels and Media Cieltje Van Achter offered the summit a rare moment of candor. The EU is taking steps to enforce the DSA, she said, but “we’re not seeing the result yet on the social media platform.” What she wants is for social media to be safe “in real life, in practice.”

She also noted that if existing age thresholds can’t be enforced, raising them accomplishes nothing. The observation cuts deeper than she may have intended. The Commission is building a massive regulatory apparatus, hiring hundreds of enforcement staff, launching investigation after investigation, and the politicians who wanted the law admit they can’t see the difference it’s making.

The enforcement machine is growing but the problem it claims to solve remains unsolved. That is the pattern with censorship regimes. The apparatus always expands.
