Students are facing facial recognition watchlists in Texas schools

Students' movements and actions are recorded.

After a mass shooting in 2018, the Santa Fe Independent School District in Texas decided to try facial recognition as an added layer of security.

While the system was installed in response to a tragedy, the privacy concerns are hard to ignore, as is the fact that facial recognition is not reliably accurate at identifying people.

The Markup obtained documents detailing how the facial recognition system the school district installed works. In a 2019 test run, more than 5,000 student photos were uploaded to the system, and according to the tech company, the results were impressive.

“Overall, we had over 164,000 detections the last 7 days running the pilot. We were able to detect students on multiple cameras and even detected one student 1100 times!” Taylor May, then a regional sales manager for AnyVision, said in an email to the school’s administrators. To the company, those numbers were a selling point; to anyone concerned about privacy, they are alarming, because they show just how thoroughly facial recognition technology erodes anonymity.

Among the documents The Markup obtained was a 2019 user guide for the software, called “Better Tomorrow.” The software is marketed as a watchlist-based facial recognition system, meaning it is only supposed to flag people it has been told to identify (faces that have been placed on a watchlist).

But the user guide makes clear that the program logs every face that appears on camera, not only the faces of people on a watchlist. That explains how a single student could be detected more than a thousand times in seven days.

By default, the program keeps all logged faces for 30 days. Using its reverse image search feature, an operator can upload a photo of someone and find out whether that person was caught on camera during that period.

Using the system, the school district helped police identify a suspected drug dealer believed to be a high school student.

The software does have privacy features, including a “Privacy Mode,” which ignores all faces not on a watchlist, and a “GDPR Mode,” which blurs the faces of people not on a watchlist in downloads and video playback. The school district declined to say whether it has enabled either feature.

“We do not activate these modes by default but we do educate our customers about them,” AnyVision’s chief marketing officer, Dean Nicolls, said in an email. “Their decision to activate or not activate is largely based on their particular use case, industry, geography, and the prevailing privacy regulations.”
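To make the distinction concrete, here is a minimal, purely hypothetical sketch of how a watchlist-based system that nonetheless logs every detected face might be structured. None of the names, classes, or parameters below come from AnyVision's software; the 30-day retention window and the privacy-mode switch are modeled on the behavior described in the user guide, but the code is an illustration under those assumptions, not the vendor's implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # assumed default retention window, per the user guide


@dataclass
class Detection:
    face_embedding: tuple   # placeholder for a face descriptor from a recognition model
    camera_id: str
    timestamp: datetime


@dataclass
class WatchlistSystem:
    watchlist: set = field(default_factory=set)   # embeddings the operator has enrolled
    privacy_mode: bool = False                    # if True, ignore faces not on the watchlist
    log: list = field(default_factory=list)       # detections kept for RETENTION days

    def process_frame(self, detections: list) -> list:
        """Return watchlist matches; optionally log every face seen."""
        now = datetime.utcnow()
        matches = []
        for det in detections:
            # Real systems compare embeddings by similarity; exact equality is a simplification.
            on_watchlist = det.face_embedding in self.watchlist
            if on_watchlist:
                matches.append(det)
            # The key point: unless privacy mode is on, *all* faces are stored,
            # not just the ones the system was asked to find.
            if on_watchlist or not self.privacy_mode:
                self.log.append(det)
        # Drop entries older than the retention window.
        self.log = [d for d in self.log if now - d.timestamp <= RETENTION]
        return matches

    def reverse_search(self, embedding: tuple) -> list:
        """Find every logged sighting of a face within the retention window."""
        return [d for d in self.log if d.face_embedding == embedding]
```

The point of the sketch is the branch in process_frame: a strictly watchlist-only system would store nothing for unknown faces, while a system behaving as the user guide describes logs everything by default and relies on an opt-in privacy setting to narrow what it keeps.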

Privacy is one of the major concerns with facial recognition technology, whose use keeps expanding. The technology empowers governments and private companies to constantly monitor and track people’s movements, and when it is deployed in schools, even minors’ movements can be easily tracked.

About a month ago, the European Data Protection Supervisor and the European Data Protection Board recommended banning the use of facial recognition technology in public spaces, arguing that “deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places.”

Another major concern with the technology is its accuracy, especially when identifying people who are not white. A December 2019 study by the National Institute of Standards and Technology found that most facial recognition systems return more false positives for non-white faces than for white faces.

According to James Grassmuck, a member of the Santa Fe Independent School District board, there have so far been no misidentifications or privacy complaints.

“They’re not using the information to go through and invade people’s privacy on a daily basis,” Grassmuck said. “It’s another layer in our security, and after what we’ve been through, we’ll take every layer of security we can get.”

“The mission creep issue is a real concern when you initially build out a system to find that one person who’s been suspended and is incredibly dangerous, and all of a sudden you’ve enrolled all student photos and can track them wherever they go,” Clare Garvie, a senior associate at the Georgetown University Law Center’s Center on Privacy & Technology, said. “You’ve built a system that’s essentially like putting an ankle monitor on all your kids.”
