UK lawmakers call for police to stop using face recognition tech

The House of Commons Science and Technology Committee has said that police should stop using AI-powered face-recognition technology, and that there should be no further trials until a relevant legal framework is in place.

The technology raises several concerns about bias and accuracy.

“It is unclear whether police forces are unaware of the requirement to review custody images every six years, or if they are simply ‘struggling to comply’,” said the committee’s report.

“What is clear, however, is that they have not been afforded any earmarked resources to assist with the manual review and weeding process.”

The MPs warned that innocent people’s photos could be unlawfully included in the face-recognition databases that authorities use in public spaces to stop and arrest suspects.

The report underlined that similar concerns had been raised a year earlier, yet the Home Office had made little progress since. It also noted the Scottish Executive’s order for an independent review into the use and storage of biometric data.

One week before the report, Home Secretary Sajid Javid said he backed police trials of face-recognition systems, while acknowledging that permanent deployment would require a proper legislative basis.

Earlier in the month, Information Commissioner Elizabeth Denham stated that the use of this technology by the police raises “significant privacy and data protection issues” and might even conflict with current data protection laws.

Surveillance Camera Commissioner Tony Porter strongly criticized London’s Metropolitan Police trials: “We are heading towards a dystopian society where people aren’t trusted, where they are logged and their data signatures are tracked.”

The Home Office also noted that there is widespread public support for using face-recognition technology to identify potential terrorists and perpetrators of serious violent crime.

“The government believes that there is a legal framework for the use of live facial recognition technology, although that is being challenged in the courts and we would not want to pre-empt the outcome of this case,” said a Home Office spokesman.

“However, we support an open debate about this, including how we can reduce the privacy impact on the public.”

It also revealed that authorities in Kent and the West Midlands plan to test software that retrospectively analyzes CCTV recordings in an attempt to identify missing and vulnerable people.
