Lawmakers in the United Kingdom are proposing amendments to the Children’s Wellbeing and Schools Bill that would require nearly all smartphones and tablets to include built-in, unremovable surveillance software.
The proposal appears under a section titled “Action to promote the well-being of children by combating child sexual abuse material (CSAM).”
We obtained a copy of the proposed amendments for you here.
The amendment text specifies that any “relevant device supplied for use in the UK must have installed tamper-proof system software which is highly effective at preventing the recording, transmitting (by any means, including livestreaming) and viewing of CSAM using that device.”
It further defines “relevant devices” as “smartphones or tablet computers which are either internet-connectable products or network-connectable products for the purposes of section 5 of the Product Security and Telecommunications Infrastructure Act 2022.”
Under this clause, manufacturers, importers, and distributors would be legally required to ensure that every internet-connected phone or tablet they sell in the UK meets this “CSAM requirement.”
Enforcement would occur “as if the CSAM requirement was a security requirement for the purposes of Part 1 of the Product Security and Telecommunications Infrastructure Act 2022.”
In practical terms, the only way for such software to “prevent the recording, transmitting (by any means, including livestreaming) and viewing of CSAM” would be for devices to continuously scan and analyze all photos, videos, and livestreams handled by the device.
That process would have to take place directly on users’ phones and tablets, examining both personal and encrypted material to determine whether any of it might be considered illegal content. Although the measure is presented as a child-safety protection, its operation would create a system of constant client-side scanning.
This means the software would inspect private communications, media, and files on personal devices without the user’s consent.
Such a mechanism would undermine end-to-end encryption and normalize pre-emptive surveillance built directly into consumer hardware.
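The matching step such software would have to perform can be sketched in a few lines. This is a deliberately simplified illustration, not the actual mechanism the amendment would mandate: real client-side scanning systems (such as Microsoft's PhotoDNA or Apple's shelved NeuralHash design) use perceptual hashes that survive resizing and re-encoding, whereas the exact-match digest below only shows where in the pipeline the check sits, namely on the device, before any encryption happens.

```python
import hashlib

# Hypothetical blocklist of known-bad media digests. A real deployment
# would use perceptual hashes distributed by a central authority; an
# exact SHA-256 match is used here purely for illustration.
BLOCKLIST = {
    hashlib.sha256(b"known-illegal-sample").hexdigest(),
}

def scan_before_send(media_bytes: bytes) -> bool:
    """Return True if the media may be sent, False if it is flagged.

    In a client-side scanning design this check runs on the device
    itself, before encryption -- which is why it sidesteps
    end-to-end encryption rather than breaking the cipher.
    """
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest not in BLOCKLIST

print(scan_before_send(b"holiday photo"))         # ordinary content passes
print(scan_before_send(b"known-illegal-sample"))  # blocklisted content is flagged
```

The design point this sketch makes is the one that matters for the encryption debate: because the inspection occurs before the message is encrypted, the protection end-to-end encryption promises is hollowed out without any cipher being touched.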
The latest figures from German law enforcement offer a clear warning about the risks of expanding this type of surveillance: in 2024, nearly half of all CSAM scanning tips received by Germany were errors.
According to the Federal Criminal Police Office (BKA), 99,375 of the 205,728 reports forwarded by the US-based National Center for Missing and Exploited Children (NCMEC) were not criminally relevant, an error rate of 48.3 percent, up from 90,950 false positives the year before.
Many of these reports originate from private companies such as Meta, Microsoft, and Google, which voluntarily scan user communications and forward suspected material to NCMEC under the current “Chat Control 1.0” framework, a system that is neither mandatory nor applied to end-to-end encrypted services.
Such a high error rate means that users are having their legal and private photos and videos falsely flagged and sent to authorities, a massive invasion of privacy.
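The figures above can be checked directly; the following uses only the numbers the BKA reported:

```python
# Figures reported by the BKA for NCMEC reports forwarded to Germany.
total_reports_2024 = 205_728
not_relevant_2024 = 99_375   # reports with no criminal relevance
not_relevant_2023 = 90_950

error_rate = not_relevant_2024 / total_reports_2024
print(f"2024 error rate: {error_rate:.1%}")          # 48.3%

increase = not_relevant_2024 - not_relevant_2023
print(f"Rise in false positives year over year: {increase:,}")  # 8,425
```

In other words, roughly one in every two automated reports forwarded to German authorities in 2024 concerned material that was not criminally relevant, and the absolute number of such false positives grew by more than eight thousand in a single year.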
Other parts of the same bill introduce additional “age assurance” obligations. On pages 19 and 20, the section titled “Action to prohibit the provision of VPN services to children in the United Kingdom” would compel VPN providers to apply “age assurance, which is highly effective at correctly determining whether or not that person is a child.”
On page 21, another amendment titled “Action to promote the well-being of children in relation to social media” would require “all regulated user-to-user services to use highly-effective age assurance measures to prevent children under the age of 16 from becoming or being users.”
Together, these amendments establish a framework in which device-level scanning and strict age verification become legal obligations.
While described as efforts to "promote the well-being of children," they would, in effect, turn personal smartphones and tablets into permanent monitoring systems and reduce the privacy of digital life to a conditional privilege.
The proposal represents one of the most sweeping assaults on digital privacy ever introduced in a democratic country.
Unlike the European Union’s controversial “Chat Control” initiative, which has faced strong resistance for proposing the scanning of private communications by online services, the UK plan goes a step further.
The EU proposal focused on scanning content as it passed through communication platforms. The UK’s version would build surveillance directly into the operating system of personal devices themselves.
Every photo taken, every video saved, every image viewed could be silently analyzed by software running beyond the user's control.
The bill would turn every connected device into a government-mandated inspection terminal.
Even though it is presented as a measure to protect children, the scope of what it enables is staggering. Once a legal foundation for on-device scanning exists, the definition of what must be scanned can easily expand.
A system designed to detect child abuse imagery today could be repurposed to search for other material tomorrow. The architecture for continuous surveillance would already be in place.
The United Kingdom is seeing a steady erosion of civil liberties as surveillance and speech policing expand at the same time.
People are being arrested over online posts and private messages under loosely applied communications laws, while police are rolling out live facial recognition systems that scan the public without consent and rely on error-prone biometric data.
When this is combined with proposals for device-level content scanning and mandatory age verification, the result is a climate in which privacy, anonymity, and free expression are increasingly treated as risks to be managed rather than rights to be protected.