Fight censorship and surveillance. Reclaim your digital freedom.

Get news updates, features, and alternative tech explorations to defend your digital rights.

British Transport Police Launch Facial Recognition Trials in London Stations

Britain’s railways are turning into laboratories for algorithmic policing, where every commuter’s face becomes just another line of data in the state’s growing experiment with control.

Some people, when they want to improve public transport safety, hire more staff, fix the lighting, or maybe even try being on time.

The British Transport Police, however, have gone full Black Mirror, deciding the best way to protect you from crime on your morning commute is by pointing cameras at your face and feeding your biometric soul into a machine.

Yes, for many Britons, facial recognition is coming to a railway station near them. Smile. Or don’t. It makes no difference. The algorithm will be watching anyway.

In the coming weeks, British Transport Police (BTP) will be trialling Live Facial Recognition (LFR) tech in London stations. It’s being sold as a six-month pilot program, which in government-speak usually means it will last somewhere between forever and the heat death of the universe.

The idea is to deploy these cameras in “key transport hubs,” which is bureaucratic code for: “places you’re likely to be standing around long enough for a camera to decide whether or not you look criminal.”

BTP assures us that the system is “intelligence-led,” which doesn’t mean they’ll be targeting shady characters with crowbars, but rather that the cameras will be feeding your face into a watchlist generated from police data systems.

They’re looking for criminals and missing people, they say. But here’s how it works: if your face doesn’t match anyone on the list, it gets deleted immediately. Allegedly. If it does match, an officer gets a ping, stares at a screen, and decides whether you’re a knife-wielding fugitive or just a man who looks like one.

And you have to love the quaint touch of QR codes, and signs stuck up around the station letting you know that, yes, your biometric identity is being scanned in real time.

Chief Superintendent Chris Casey would like you to know that “we’re absolutely committed to using LFR ethically and in line with privacy safeguards.”

The deployments, we’re told, will come with “internal governance” and even “external engagement with ethics and independent advisory groups.”

As Matthew Feeney from Big Brother Watch put it, without even a hint of sarcasm, which is admirable under the circumstances, “subjecting law-abiding passengers to mass biometric surveillance is a disproportionate and disturbing response.”

He’s right. Because this isn’t targeted policing. It’s dragnet surveillance.

Feeney continues: “Facial recognition technology remains unregulated in the UK and police forces are writing their own facial recognition rules.”

Which is a bit like letting the fox draw up security protocols for the henhouse. Except the fox has facial recognition, and the hens can’t opt out.

Let’s be honest. The police love gadgets. But there’s a difference between using technology to make policing smarter and using it to make policing easier by turning humans into data points.

This is a technology that, if misused (and let’s be honest, when has that not happened?), can turn a routine station visit into a Kafkaesque nightmare.

And just when you thought it couldn’t get worse, it turns out this isn’t some quirky BTP one-off. It’s part of a national push. The government is now drawing up official guidance to help police decide when and where to aim their surveillance lasers.

Policing minister Sarah Jones proudly announced it during the Labour Party conference, calling live facial recognition “a really good tool.” Like a hammer, one assumes, if the problem is everyone’s face.

The Home Office has already splashed cash across seven more regions. Greater Manchester, West Yorkshire, Surrey, Sussex, Bedfordshire, Thames Valley, and Hampshire are all next in line for the big biometric bingo.

In London, the Met’s watchlists have more than doubled since 2020. Tens of thousands of people scanned every single day. And still no specific law governing any of it. Police forces are writing their own rulebooks while Parliament takes a long nap in the corner.

As we’ve previously reported, of course, the system’s already gone wrong. Shaun Thompson, a volunteer working to keep kids out of gangs, was wrongly flagged and stopped outside London Bridge.

Despite showing ID and explaining himself, he was threatened with arrest. Now he’s suing.

Because if the machine can’t tell a youth mentor from a fugitive, maybe it’s not the public that needs to be scrutinized. Maybe it’s the tech and the people pushing it.

If you’re tired of censorship and surveillance, join Reclaim The Net.