The European Union has a habit of turning its worst temporary ideas into permanent fixtures. This time it is “Chat Control 1.0,” the 2021 law that lets tech companies scan everyone’s private messages in the name of child protection.
It was supposed to be a stopgap measure, a temporary derogation from privacy protections until proper evidence came in.
Now, if you’ve been following our previous reporting, you’ll know the Council wants to make it permanent, even though the Commission’s own 2025 evaluation report admits it has no evidence the thing actually works.
We obtained a copy of the report for you here.
The report doesn’t even hide the chaos. It confesses to missing data, unproven results, and error rates that would embarrass a basic software experiment.
Yet its conclusion leaps straight from admission to absolution. Section 3 of the report states that “the available data are insufficient to provide a definitive answer” on proportionality, then adds in the very next breath that “there are no indications that the derogation is not proportionate.” In plain language: the Commission can’t prove the policy isn’t violating rights, but since it can’t prove that it is, it will treat it as acceptable. That is bureaucratic logic at its blandest.
The same report admits it can’t even connect the dots between all that scanning and any convictions. Section 2.2.3 states: “It is not currently possible…to establish a clear link between these convictions and the reports submitted by providers.” Germany and Spain didn’t provide usable figures at all.
If hundreds of thousands of reports are generated (708,894 in 2024) and the Commission still cannot point to outcomes, the structure functions more as noise than as support for police work.
Officers face huge piles of irrelevant material while real cases remain unresolved.
The technology is just as unstable as the policy. Yubo reported error rates of 20 percent in 2023 and 13 percent in 2024, and those figures are from before human review. Former Commissioner Johansson and German police officials have said the true rate is far higher.
The Commission also admits that providers “did not use the standard form for reporting” and that Member States sent in “fragmented and incomplete” data.
France received roughly 150,000 reports from NCMEC, yet it can’t say clearly what happened to most of them. The EU built a continent-wide surveillance scheme, but can’t get the participants to follow basic reporting rules. In practice, the structure leaves US platforms acting as informal police while the EU attempts to assemble a puzzle with missing pieces.
Turning this temporary measure into a permanent rule would be negligent. The system has high error rates, no proven results, and oversight that barely functions. The Regulation fails the basic legal tests of necessity and proportionality it was supposed to meet.
The Council’s plan exposes what Chat Control has become: a symbolic gesture that lets policymakers claim progress while the machinery underneath produces confusion, false positives, and unanswered questions.
The only thing the EU has measured with certainty is the distance between its stated goals and the system it built.