“Nothing to Fear” Is Back: The UK High Court Clears Way for Police Facial Recognition

The policy that turns every Oxford Street shopper into a biometric template just got the judicial nod its architects were waiting for.


Stand against censorship and surveillance: join Reclaim The Net.

The High Court in London has decided that the Metropolitan Police can carry on pointing cameras at faces in London, turning each one into a bundle of biometric data and checking it against a watchlist, and that this is all perfectly fine, thank you very much, and definitely not a thing that should bother anyone who isn’t a murderer.

And here’s the best bit. Within hours of the ruling, Policing Minister Sarah Jones had popped up to announce that this technology, now officially blessed by two judges, will be rolled out across the entire country.

The case was brought by a youth worker called Shaun Thompson and Silkie Carlo of Big Brother Watch, and it was dismissed this week by Lord Justice Holgate and Mrs Justice Farbey, who found the Met’s live facial recognition policy entirely compatible with the European Convention on Human Rights.

You don’t need a law degree to see that what this technology does to an ordinary Tuesday in central London feels rather a long way from any reasonable definition of a free society.


Let’s explain what it actually does, because the government would rather nobody thought about it too hard. You walk down Oxford Street. A camera sees you. Faster than you can blink, your face is converted into a biometric template, which Thompson’s barrister, Dan Squires KC, told the court is “similar to a DNA profile,” and that template is compared against a police watchlist. If you match, you get stopped. If you don’t, the Met assures us the data is deleted. But the processing is the intrusion, and the minister seems utterly incapable of grasping that distinction.
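To make that distinction concrete, the sequence described above can be sketched in a few lines of code. This is a purely illustrative mock-up, assuming nothing about the Met’s actual software: the function names, the hash-based “template,” and the watchlist structure are all invented for the sketch. The point it demonstrates is that the template is extracted and compared for every passer-by, match or no match, before any deletion happens.

```python
import hashlib

def extract_template(face_image: bytes) -> str:
    # Stand-in for a biometric template extractor. Real systems produce
    # a feature vector from a face image; a hash serves the illustration.
    return hashlib.sha256(face_image).hexdigest()

def scan_passerby(face_image: bytes, watchlist: set[str]) -> bool:
    # Step 1: EVERY face that passes the camera is converted into a
    # template. This happens before any match decision is made, which
    # is precisely the intrusion at issue.
    template = extract_template(face_image)
    # Step 2: the template is compared against the watchlist.
    if template in watchlist:
        return True  # match: the person is stopped
    # Step 3: non-matches are deleted, but only after the processing
    # has already taken place.
    del template
    return False

# Hypothetical watchlist containing one wanted person's template.
watchlist = {extract_template(b"wanted-person")}
print(scan_passerby(b"wanted-person", watchlist))     # matched and stopped
print(scan_passerby(b"ordinary-shopper", watchlist))  # processed, then deleted
```

Whether the deletion at step 3 actually occurs is exactly the part the public is asked to take on trust.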

The scale of this is genuinely absurd. The Met deployed the cameras 231 times last year and ran roughly four million faces through the system. A single afternoon at Oxford Circus in December processed more than 50,000 people in four and a half hours. Squires warned the court that proposed permanent installations would make it “impossible” for Londoners to move freely without their biometric data being routinely captured and processed, and looking at those numbers, it’s rather hard to see how he’s wrong.
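The reported figures are easy to sanity-check with back-of-the-envelope arithmetic, using only the numbers quoted above:

```python
# Throughput implied by the reported Oxford Circus deployment.
faces_scanned = 50_000          # people processed in one deployment (reported)
hours = 4.5
per_minute = faces_scanned / (hours * 60)
print(round(per_minute))        # roughly 185 faces processed every minute

# Average scale implied by the reported annual figures.
annual_faces = 4_000_000        # faces run through the system last year (reported)
deployments = 231               # deployments last year (reported)
print(round(annual_faces / deployments))  # roughly 17,000 faces per deployment
```

Around three faces a second, for four and a half hours straight, at a single junction.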

Now, a reasonable person might ask what happens when the computer gets it wrong, because computers, as anyone who has ever owned one will tell you, are wrong with some regularity.

Ask Thompson. On 23 February 2024, he was walking near London Bridge, minding his own business and committing the heinous crime of existing in public, when the Met’s cameras decided he looked sufficiently like someone on the watchlist to warrant a full stop.

Officers questioned him, demanded he prove who he was, and threatened him with arrest when he wouldn’t hand over his fingerprints. The court’s own summary notes he left the encounter distressed, angry, and frightened it would happen again. He had, and this cannot be stressed enough, done absolutely nothing wrong.

Thompson put it rather well himself. “I’ve considered the court’s judgment today and decided to appeal it to protect Londoners from facial recognition being used for mass surveillance and leading to situations like mine, where I was misidentified, detained and threatened with arrest. No one should be treated like a criminal due to a computer error.”

He’s called the technology “like stop and search on steroids,” which is actually a generous description, since stop and search at least requires a human being to form a suspicion before ruining your afternoon.

The judges, though, weren’t having any of it. They ruled that the Met’s policy “provides the claimants with an adequate indication of the circumstances in which LFR will be used and enables them to foresee, to a degree that is reasonable in the circumstances,” when they might be scanned, and that Thompson and Carlo’s human rights “have not been breached.” The deployment regime, they said, is “a far cry from the ‘hunch’ or ‘professional intuition’ of an individual officer,” because it confines operations to defined use cases and requires a proportionality assessment before each outing.

The idea that an ordinary human being on the way to buy a sandwich can “foresee” whether they are about to be biometrically processed because the policy document exists and is theoretically readable somewhere on a government website is the sort of reasoning that would get you laughed out of any pub in the country.

The policy allows deployments in crime hotspots, at major events, near critical infrastructure, and anywhere intelligence suggests a wanted person might show up, which covers a rather generous slice of London on any given day. Foreseeability here does not mean you know your face is being processed. It means someone once typed it into a Word document.

It gets better. Freedom of Information disclosures reported alongside the judgment reveal that the Met has no system whatsoever for identifying complaints that specifically concern facial recognition.

Working out how many people have complained about being wrongly scanned or wrongly matched requires someone to manually trawl through tens of thousands of records. A system that processes four million faces a year cannot readily tell you how many of those four million people are furious about it, which is not the mark of a tightly controlled administrative tool so much as the mark of a system whose harms are genuinely uncountable. And harms that cannot be counted are harms that cannot easily be challenged, which, one rather suspects, is the point.

Carlo, whose witness statement the court largely excluded on the basis that she had “adopted the role of an advocate” (the horror, an advocate advocating), was commendably undeterred. “This is a disappointing judgment, but the fight against live facial recognition mass surveillance is far from over. There has never been a more important time to stand up for the public’s rights against dystopian surveillance tech that turns us into walking ID cards and treats us like a nation of suspects.”

Met Commissioner Sir Mark Rowley, naturally, was delighted. He called the ruling a “significant and important victory for public safety” and declared, with the straight face of a man who has never once questioned his own power, “The question is no longer whether we should use live facial recognition, it’s why we would choose not to.”

Why would we choose not to? The head of the Metropolitan Police genuinely thinks the burden of proof now sits with the public to explain why their biometric data shouldn’t be vacuumed up on the way to work. In a free country, the state is meant to justify watching its citizens, and not the other way round. A tool that processes the faces of everyone in front of a lens, regardless of whether they are suspected of anything at all, inverts the entire premise of policing by consent.

Then, God help us, we come to Sarah Jones. “I welcome today’s ruling because there can be no true liberty when people live in fear of crime in their communities,” she said. “Live facial recognition only locates specifically wanted people. Law abiding citizens have nothing to fear. This technology puts dangerous rapists and murderers behind bars, and I question any group who call that uncivil. We are rolling out facial recognition across the country with record investment to keep communities safe.”

“Nothing to fear.” There it is. The phrase that has been used to justify practically every expansion of state power for about a hundred years, a close cousin of “if you’ve got nothing to hide, you’ve got nothing to fear,” and probably the most un-British sentiment printed in the papers all month.

Rights exist precisely so that nobody has to audition their innocence before the state every time they step out for a pint of milk. A minister who thinks only wrongdoers should worry about being watched has already decided that being watched is the natural order of things, and that everything else is just tidying up.

Her claim that the system “only locates specifically wanted people” is also simply, factually, wrong. Every face that passes a camera is processed, every template is compared, and the non-matches are deleted only afterwards, which is rather like someone reading your diary before putting it back on the shelf and insisting they never read it because they’ve forgotten the interesting bits. The intrusion is the processing. The deletion happens later, if at all, and the public is simply meant to take the Met’s word for it.

Here is the wider point, which the court was never asked to decide. Nobody, at any stage of this, has been forced to explain why a democratic society should tolerate the routine biometric scanning of its own population as a baseline condition of leaving the house.

The judges ruled on whether the policy was clear enough, not on whether the policy should exist, and that bigger question remains wide open. Thompson is appealing, and good for him, because the government is clearly hoping the matter is settled while the public is still working out what has actually happened.

So there we are. London’s faces are now data, the Home Office would quite like yours as well, and the official government position, delivered with a reassuring smile, is that anyone who objects to any of this is probably a rapist or a murderer.

The people most fond of the phrase “nothing to hide, nothing to fear” are, without exception, the ones doing the watching. The ones being watched tend to have a rather different view, assuming anyone bothers to ask them, which, on the evidence of this week, nobody intends to.
