Britain’s policing system, we are told, is broken. And on Monday, the home secretary, Shabana Mahmood, announced that the fix would arrive in the form of algorithms, facial recognition vans, and a large cheque made out to the future.
The government plans to spend £140m ($191m) on artificial intelligence and related technology, with the promise that it will free up six million police hours a year, the equivalent of 3,000 officers.
It is being billed as the biggest overhaul of policing in England and Wales in 200 years, aimed at dragging a creaking system into the modern world.
The ambition is serious. The implications are too.
The plan is for AI software that will analyze CCTV, doorbell, and mobile phone footage, detect deepfakes, carry out digital forensics, and handle administrative tasks such as form filling, redaction, and transcription. Mahmood’s argument is that criminals are getting smarter, while parts of the police service are stuck with tools that belong to another era.
She put it plainly: “Criminals are operating in increasingly sophisticated ways. However, some police forces are still fighting crime with analogue methods.”
And she promised results: “We will roll out state-of-the-art tech to get more officers on the streets and put rapists and murderers behind bars.”
There is logic here. Few people would argue that trained officers should be buried in paperwork. Technology can help with that. The concern is what else comes with it.
Live facial recognition is being expanded aggressively. The number of police vans equipped with the technology will increase fivefold, from ten to fifty, operating across the country. These systems scan faces in public spaces and compare them to watch lists of wanted individuals.
This is a form of mass surveillance, and when automated systems get things wrong, the consequences fall on real people.
That was true for Shaun Thompson, an anti-knife crime campaigner, who was wrongly flagged by the Metropolitan Police’s facial recognition technology.
Earlier this month, Mahmood struck a tone that explicitly embraced the logic of the panopticon rather than warning against it.
She argued that the knowledge of being observed can itself deter crime, describing visibility and certainty as powerful tools in modern policing.
In her account, technology that makes offenders feel watched is not a threat to public life but a feature of a safer one, provided it is deployed by the state rather than left to chance or private actors.
Mahmood said: “When the future arrives, there are always doubters. 100 years ago, fingerprinting was decried as curtailing our civil liberties. But today, we could not imagine policing without it.
“I have no doubt the same will prove true of facial recognition technology in the years to come.”
Facial recognition is not the same as fingerprinting because it operates at a distance and without active participation. Fingerprints are taken after arrest or with consent, in controlled settings, and they sit quietly in databases until a specific investigative need arises.
Facial recognition works in public spaces, scanning faces automatically as people go about their lives, many of whom are not suspected of any wrongdoing, and checking them against databases.