
Leaked Legal Analysis Of EU’s Private Message Snooping Plans Says It Interferes With “Fundamental Rights”

Undermining the EU's self-described commitment to privacy.


There are few things the EU Commission (the EU’s executive arm) would like more than to present the bloc and its institutions as speaking with one voice, particularly on controversial topics such as attempts to destroy encryption.

However, documents leaked from the EU Council Legal Service regarding the legality of a proposal known as “chat control” (formally, Child Sexual Abuse Regulation, CSAR), show that there may be “trouble in paradise.”

As digital rights advocate and European Parliament member (MEP) Patrick Breyer of Germany reports, the Service has warned the Commission that its idea probably runs contrary to the fundamental right to respect for private life – meaning that the European Court of Justice would likely annul it.

Summed up, the “chat control” scheme proposes forcing providers of chat, messaging, phone, and email services to screen all private messages in search of illegal content and then inform the police.

But the problem with this, as the Service has noted, is that it could very easily be interpreted as general, indiscriminate, and permanent surveillance, given that the plan grants “generalized” access to the communications of every citizen, including those the analysis says are “not even remotely connected with child sexual exploitation.”

And given the high likelihood that CSAR’s “detection orders” would be considered a violation of the fundamental right to privacy and confidentiality of correspondence, the EU court is also highly likely to quash “chat control” as indiscriminate surveillance, the Service warns.

The analysis also notes that while the court allows “communications metadata screening” when the justification is national security, the drastic measures proposed in the CSAR would probably not be considered proportionate to their stated purpose.

There’s also the issue of the EU Commission making the dubious claim that the process is somehow “targeted” rather than generalized (it targets everyone, so perhaps that is the sophistry those behind the CSAR chose to go with).

But the Legal Service’s analysis warns that this amounts to a “contradiction” between what the Commission is saying and what the proposal actually spells out.

The Service’s logical suggestion then is to actually target detection orders so that they apply to people “in respect of whom there are reasonable grounds to believe that they are in some way involved in, committing or have committed a child sexual abuse offense.”

Observers have noted that the analysis of the CSAR (whose UK counterpart is the Online Safety Bill) represents serious criticism of similar, encryption-undermining proposals on both sides of the Atlantic.

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
