
The US government is using images of citizens without their consent, including dead people and exploited minors, to test facial recognition tech

People are being captured at their most vulnerable moments, and their images are being used without their knowledge or consent.


The National Institute of Standards and Technology (NIST), a part of the US Department of Commerce, is using images of immigrants, deceased people, and abused children to test facial recognition software. The very agency the US government has appointed to oversee facial recognition has turned out to be among the worst offenders.

NIST maintains the Facial Recognition Verification Testing (FRVT) program, a benchmark that software companies use to compare their own facial recognition technology against a common standard.

In some cases, entrants are even offered a $25,000 reward if their facial recognition software scores highly on accuracy and precision in the program. Technology that scores well also attracts more interest from government departments, usually in the form of contracts.

To run the program's tests, Slate reports, NIST needs millions of images. These were reportedly sourced from photographs of children exploited in child pornography, US visa applicants from Mexico and other countries, people who had been arrested, and even the deceased.

The Department of Homeland Security supplies additional photographs of travelers boarding aircraft in the US. Even individuals booked merely on suspicion of criminal activity have their images shared.

All of this came to light through documents and materials obtained via Freedom of Information Act requests. With NIST's dataset containing millions of non-consensually collected pictures, almost anyone could end up as testing material. People are being captured at their most vulnerable moments, and their images are being used without their knowledge or consent.

The privacy violation is not the only issue: training and testing facial recognition applications on such skewed data sets presents a problem of its own. On top of this, a recent executive order has made NIST the lead of regulatory efforts surrounding facial recognition and artificial intelligence. When the organization tasked with overseeing this controversial technology has itself been collecting data without consent, the outlook for privacy is not encouraging.

When Slate reached out to NIST for comment, it received the following statement from Jennifer Huergo, director of media relations:

“The data used in the FRVT program is collected by other government agencies per their respective missions. In one case, at the Department of Homeland Security (DHS), NIST’s testing program was used to evaluate facial recognition algorithm capabilities for potential use in DHS child exploitation investigations. The facial data used for this work is kept at DHS and none of that data has ever been transferred to NIST. NIST has used datasets from other agencies in accordance with Human Subject Protection review and applicable regulations.”



