The steady spread of facial recognition technology onto Britain’s streets is drawing alarm from those who see it as a step toward mass surveillance, even as police leaders celebrate it as a powerful new weapon against crime.
Live Facial Recognition (LFR) is a system that scans people’s faces in public spaces and compares them in real time against police watchlists of wanted individuals.
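In outline, such systems reduce each detected face to a numerical embedding and raise an alert when that embedding sits close enough to one on the watchlist. A minimal sketch of the matching step in Python, with hypothetical names and synthetic vectors standing in for a real face-recognition model (the Met’s actual pipeline is not public):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return watchlist entries whose embeddings are close enough to the
    detected face. The threshold controls the trade-off between missed
    matches and false alerts on innocent passersby."""
    return [
        name
        for name, listed_embedding in watchlist.items()
        if cosine_similarity(face_embedding, listed_embedding) >= threshold
    ]

# Demo with synthetic 128-dimensional embeddings; a real deployment would
# produce these with a face-recognition model, not random numbers.
rng = np.random.default_rng(0)
watchlist = {f"subject_{i}": rng.normal(size=128) for i in range(1000)}
passerby = rng.normal(size=128)
print(match_against_watchlist(passerby, watchlist))
```

Everything in the debate below turns on how that threshold is set: loosen it and more wanted people are caught, tighten it and fewer passersby are wrongly flagged.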
Civil liberties groups warn it normalizes biometric monitoring of ordinary citizens, while the Metropolitan Police insist it is already producing results.
Britain’s senior police leadership is promoting these biometric and artificial intelligence systems as central to the future of policing, with Commissioner Sir Mark Rowley arguing that such tools are already transforming the way the Met operates.
Speaking to the TechUK trade association, Rowley described LFR as a “game-changing tool” and pointed to more than 700 arrests linked to its use so far this year.
Camera vans stationed on public streets scan passersby to flag people wanted for serious crimes or in breach of license conditions.
Rowley highlighted a recent deployment at the Notting Hill Carnival, where he joined officers using LFR.
“Every officer I spoke to was energized by the potential,” he told The Sun. According to the commissioner, the weekend brought 61 arrests, including individuals sought in cases of serious violence and offenses against women and girls.
Rowley claimed that the technology played “a critical role” in making the carnival safer.
Beyond facial recognition, Rowley spoke of expanding the Met’s reliance on drones. “From searching for missing people, to arriving quickly at serious traffic incidents, or replacing the expensive and noisy helicopter at large public events,” he said, “done well, drones will be another tool to help officers make faster, more informed decisions on the ground.”
The commissioner also promoted the V100 program, which draws on data analysis to focus resources on those considered the highest risk to women. He said this initiative has already led to the conviction of more than 160 offenders he described as “the most prolific and predatory” in London.
Artificial intelligence is being tested in other areas too, particularly to review CCTV footage.
Rowley noted the labor involved in manually tracing suspects through crowded areas. “Take Oxford Street, with 27 junctions—a trawl to identify a suspect’s route can take two days,” he explained.
“Now imagine telling AI to find clips of a male wearing a red baseball cap between X and Y hours, and getting results in hours. That’s game-changing.”
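What Rowley describes is, in effect, attribute-based video retrieval: software tags each detection in the footage with attributes such as clothing and time, and investigators query the tags rather than watching the clips. A toy illustration in Python, with invented clip metadata since the Met’s tooling has not been published:

```python
from datetime import datetime

# Hypothetical per-clip metadata that an AI pipeline might emit after
# scanning CCTV footage: where, when, and what attributes were detected.
clips = [
    {"camera": "junction_03", "time": datetime(2025, 9, 1, 14, 5),
     "attributes": {"male", "red baseball cap"}},
    {"camera": "junction_11", "time": datetime(2025, 9, 1, 14, 22),
     "attributes": {"female", "blue coat"}},
    {"camera": "junction_19", "time": datetime(2025, 9, 1, 15, 40),
     "attributes": {"male", "red baseball cap", "backpack"}},
]

def find_clips(clips, wanted_attributes, start, end):
    """Return clips within [start, end] whose detections include all of
    the requested attributes, ordered by time to trace a route."""
    return sorted(
        (c for c in clips
         if wanted_attributes <= c["attributes"] and start <= c["time"] <= end),
        key=lambda c: c["time"],
    )

# "Find clips of a male wearing a red baseball cap between X and Y hours."
for clip in find_clips(clips, {"male", "red baseball cap"},
                       datetime(2025, 9, 1, 14, 0), datetime(2025, 9, 1, 16, 0)):
    print(clip["camera"], clip["time"])
```

The two-day trawl Rowley mentions collapses into a single query, which is exactly why investigators find the approach attractive, and why critics worry about what else such an index could be asked.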
While the Met portrays these systems as advances in crime prevention, their deployment raises questions about surveillance creeping deeper into everyday life.
Expansions in facial recognition, drone monitoring, and algorithmic analysis are often introduced as matters of efficiency and safety, but they risk building an infrastructure of constant observation where privacy rights are gradually eroded.
Shaun Thompson’s case has already been cited by campaigners as evidence of the risks that come with rolling out facial recognition on public streets.
He was mistakenly identified by the technology, stopped, and treated as though he were a wanted suspect before the error was realized.
Incidents like this highlight the danger of false matches and the lack of safeguards around biometric surveillance.
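The scale of deployment is what turns rare errors into routine ones. A back-of-the-envelope sketch in Python, using assumed figures rather than any published Met statistics, shows how a seemingly small false-match rate still produces a steady stream of wrongly flagged passersby:

```python
# Base-rate illustration; every number here is an assumption for the
# sake of the example, not a published Met figure.
faces_scanned = 50_000        # faces passing the camera in a day
false_match_rate = 1 / 1_000  # chance an innocent face triggers an alert
true_subjects = 10            # watchlist subjects actually in the crowd
hit_rate = 0.9                # chance a real subject is correctly flagged

false_alerts = (faces_scanned - true_subjects) * false_match_rate
true_alerts = true_subjects * hit_rate

# Of all alerts, what fraction point at someone genuinely wanted?
precision = true_alerts / (true_alerts + false_alerts)
print(f"false alerts: {false_alerts:.0f}, true alerts: {true_alerts:.0f}")
print(f"chance a flagged person is actually wanted: {precision:.1%}")
```

Under these assumed numbers, fewer than one in six alerts points at someone genuinely wanted; the rest are people like Thompson.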
For ordinary people, the impact is clear: even if you have done nothing wrong, you can still find yourself pulled into a system that treats you as guilty first and asks questions later.