
Cardiff Man Wrongly Accused of Theft After Facial Recognition Error Triggers Privacy Complaint

A shopping trip turned public shaming exposes the hidden dangers of algorithmic policing.



A Cardiff man has filed a formal complaint with the Information Commissioner’s Office (ICO) after facial recognition software at a retail store wrongly flagged him as a thief.

The case is now drawing wider attention to the unchecked spread of biometric surveillance in everyday retail environments.

On 29 April 2025, Byron Long, 66, arrived at the B&M outlet in Cardiff Bay Retail Park expecting an ordinary shopping trip.

Instead, he was approached by staff and told he was barred from the premises. In front of other customers, he was accused of stealing £75 ($101) worth of goods during a visit earlier that month.

That accusation was entirely false. During the visit in question, on 9 April, Long had bought a single item, a £7 ($9.50) packet of cat treats, and paid for it in full. He later obtained CCTV footage showing himself at the checkout in a Red Bull Formula 1 jacket, clearly completing the purchase.

“It was a horrible experience, and I haven’t been back to the store since. The incident has had a very serious impact on my mental health, which is very fragile anyway, and I am now very anxious whenever I go shopping,” Long said, as reported by Nation Cymru.

The misidentification came from Facewatch, a private firm contracted by retailers to run facial recognition scans on customers. Images from Long’s previous visit were processed and matched to a database of alleged offenders. That match triggered the alert that led B&M staff to accuse him.

B&M later acknowledged the error, issuing a written apology and stating: “Our B&M store and security teams have a duty of care to all our customers and to our company, and this includes challenging people that they believe are potentially shoplifting. This is an extremely difficult task, and sadly we don’t always get it right; your case would be one of these instances… We can confirm your data has been removed from Facewatch.”

They also offered a £25 ($34) voucher as compensation, an offer Long flatly rejected.

Facewatch responded to the incident by suspending the user who had submitted the incorrect data. Michele Bond, the company’s Head of Incident Review and Data Protection Enquiries, said: “Facewatch Incident data is submitted by authorized users, who must confirm the accuracy of the information provided. Once the error was identified, the user responsible was immediately suspended from using the Facewatch system.”

Long has since taken the matter to Big Brother Watch, a civil liberties group focused on privacy and surveillance. The organization has now submitted a complaint to the ICO on his behalf.

Jasleen Chaggar, Legal and Policy Officer at Big Brother Watch, raised an alarm over how facial recognition tools are being deployed in commercial spaces without sufficient oversight.

In the complaint, she wrote: “I am writing on behalf of Mr Byron Long to complain about infringements of his data rights by Facewatch and B&M Retail Limited (‘B&M’) on 9th April and 29th April 2025. I also seek to bring to the Commissioner’s attention wider concerns about how B&M and Facewatch implement facial recognition technology, in particular the failure of both Facewatch and B&M to adhere to their own policies and the ICO’s guidance on the use of this technology.”

The wrongful accusation of Byron Long at a B&M store in Cardiff, triggered by inaccurate facial recognition data from private surveillance firm Facewatch, is not an isolated incident.

Shaun Thompson, a youth mentor and community volunteer in London, is preparing to take the Metropolitan Police to court after facial recognition cameras falsely identified him as a wanted man.

Despite showing multiple forms of ID, Thompson was detained outside London Bridge station, threatened with arrest, and pressured to give fingerprints. His case, which the High Court has cleared to proceed, underscores the growing misuse of facial recognition across both public and private sectors.
