
Britain’s AI Policing Plan Turns Toward Predictive Surveillance and a Pre-Crime Future

When the state’s gaze never blinks, innocence becomes a temporary status.

[Image: Home Secretary Shabana Mahmood, wearing a pendant and microphone, against a circular tunnel of lights.]


Let me take you on a tour of Britain’s future. It’s 2030: there are more surveillance cameras than people, your toaster is reporting your breakfast habits to the Home Office, and police officers are no longer investigating crimes so much as predicting them.

This is Pre-Crime UK, where the weight of the law falls on innocent people whom an algorithm suspects are about to commit a crime.

With a proposal that would make Orwell blush, the British police are testing a hundred new AI systems to figure out which ones can best guess who’s going to commit a crime.

That’s right: guess. Not catch, not prove. Guess. Based on data, assumptions, and probably your internet search history from 2011.

Behind this algorithmic escapade is Home Secretary Shabana Mahmood, who has apparently spent the last few years reading prison blueprints and dystopian fiction, not as a warning about authoritarian surveillance, but as aspiration.

In a jaw-dropping interview with former Prime Minister and Digital ID peddler Tony Blair, she said, with her whole chest: “When I was in justice, my ultimate vision for that part of the criminal justice system was to achieve, by means of AI and technology, what Jeremy Bentham tried to do with his Panopticon. That is that the eyes of the state can be on you at all times.”

Now, for those not fluent in 18th-century authoritarian architecture, the Panopticon is a prison design where a single guard can watch every inmate, but the inmates never know when they’re being watched. It’s not so much “law and order” as it is “paranoia with plumbing.”

Enter Andy Marsh, the head of the College of Policing and the man now pitching Britain’s very own Minority Report.

According to the Telegraph, he’s proposing a new system that uses predictive analytics to identify and target the top 1,000 most dangerous men in the country. They’re calling it the “V1000 Plan,” which sounds less like a policing strategy and more like a discontinued vacuum cleaner.

“We know the data and case histories tell us that, unfortunately, it’s far from uncommon for these individuals to move from one female victim to another,” said Sir Andy, with the tone of a man about to launch an app.

“So what we want to do is use these predictive tools to take the battle to those individuals…the police are coming after them, and we’re going to lock them up.”

I mean, sure, great headline. Go after predators. But once you start using data models to tell you who might commit a crime, you’re not fighting criminals anymore. You’re fighting probability.

The government, always eager to blow millions on a glorified spreadsheet, is chucking £4 million ($5.39M) at a project to build an “interactive AI-driven map” that will pinpoint where crime might happen. Not where it has happened. Where it might.

It will reportedly predict knife crimes and spot antisocial behavior before it kicks off.

But don’t worry, says the government. This isn’t about watching everyone.

A “source” clarified: “This doesn’t mean watching people who are non-criminals—but she [Mahmood] feels like, if you commit a crime, you sacrifice the right to the kind of liberty the rest of us enjoy.”

That’s not very comforting coming from a government that locks people up over tweets.

Meanwhile, over in Manchester, they’re trying out “AI assistants” for officers dealing with domestic violence.

These robo-cop co-pilots can tell officers what to say, how to file reports, and whether to pursue an order. It’s less “serve and protect” and more “Ask Jeeves.”

The sales pitch goes like this: “If you were to spend 24 hours on the shoulder of a sergeant currently, you would be disappointed at the amount of time that the sergeant spends checking and not patrolling, leading and protecting.”

That’s probably true. But is the solution really to strap Siri to their epaulettes and hope for the best?

Still, Mahmood remains upbeat: “AI is an incredibly powerful tool that can and should be used by our police forces,” she told MPs, before adding that it needs to be accurate.

Tell that to Shaun Thompson, an anti-knife crime campaigner, not a criminal, who found himself on the receiving end of the Metropolitan Police’s all-seeing robo-eye. One minute he’s walking near London Bridge, probably thinking about lunch or how to fix society; the next he’s being yanked aside because the police’s shiny new facial recognition system decided he looked like a wanted man.

He wasn’t. He had done nothing wrong. But the system said otherwise, so naturally, the officers followed orders from their algorithm overlord and detained him.

Thompson was only released after proving who he was, presumably with some documents and a great deal of disbelief. Later, he summed it up perfectly: he was treated as “guilty until proven innocent.”
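
The arithmetic behind stops like his is worth a moment. Even a matcher that is right 99.9 percent of the time buries officers in false alarms once it scans a crowd, because almost everyone scanned is innocent. Here is a back-of-the-envelope sketch, with every number a hypothetical illustration rather than an official Met figure:

```python
# Back-of-the-envelope base-rate arithmetic for live facial recognition.
# Every number below is a hypothetical illustration, not an official figure.

innocent_scanned = 100_000   # innocent passers-by scanned in a deployment
false_match_rate = 0.001     # 0.1% of innocent faces wrongly flagged
suspects_present = 1         # actual watchlisted person in the crowd
hit_rate = 0.9               # chance the system spots a real suspect

false_alarms = innocent_scanned * false_match_rate  # ~100 wrongful flags
true_hits = suspects_present * hit_rate             # ~0.9 genuine flags

# How often an alert actually points at a real suspect:
precision = true_hits / (true_hits + false_alarms)

print(f"Innocent people flagged: {false_alarms:.0f}")
print(f"Chance a given alert is a real suspect: {precision:.1%}")
```

On those assumptions, roughly a hundred Shaun Thompsons get flagged for every genuine suspect, so any single alert is almost certainly pointing at the wrong person. That’s the base-rate problem, and no amount of headline “accuracy” makes it go away.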

Mahmood’s upcoming white paper will apparently include guidelines for AI usage. I’m sure all those future wrongful arrests will be much more palatable when they come with a printed PDF.

Here’s the actual problem. Once you normalize the idea that police can monitor everyone, predict crimes, and act preemptively, there’s no clean way back. You’ve turned suspicion into policy. You’ve built a justice system on guesswork. And no amount of shiny dashboards or facial recognition cameras is going to fix the rot at the core.

This isn’t about catching criminals. It’s about control. About making everyone feel watched. That was the true intention of the Panopticon. And that isn’t safety; it’s turning the country into one big prison.

