
Police caught using online spy tool to plot “pre-crimes”


Tech startup Voyager Labs helps law enforcement agencies use what you post on social media, and who you interact with, to predict whether you have committed or "plan to" commit a crime. It is one of a growing number of companies claiming that social media analysis can help predict and solve crimes, a business that raises serious questions about privacy.

The non-profit Brennan Center obtained documents through freedom of information requests revealing that the strategies Voyager uses may violate First Amendment protections. For instance, the software treats posts about Islam and social media usernames expressing Arab pride as signs of a potential inclination toward extremism. The same techniques could be used to target any group.

Additionally, according to the documents obtained by The Guardian, the company uses questionable processes to access data on social media, and even enables law enforcement officers to infiltrate groups and private accounts using fake personas.

The company was founded nine years ago and has offices around the world, including in New York, Washington DC, and Israel. It is one of a growing number of tech firms exploring social media analytics for use in law enforcement; others include Media Sonar, Palantir, PredPol, and Geofeedia.

The technologies provided by these firms are attractive to law enforcement because they promise to automate and expedite the process of preventing crime. The documents obtained by the Brennan Center show that the LAPD has been trialing Voyager Labs software since 2019. The department has also worked with, or considered working with, other such companies.

According to experts, such software is a privacy nightmare for the public and potentially illegal, as it criminalizes otherwise legal behavior, such as associating with certain people.

The documents revealed that Voyager uses a “guilty-by-association” model. The Guardian’s coverage of the story explained:
“Voyager software hoovers up all the public information available on a person or topic – including posts, connections and even emojis – analyzes and indexes it and then, in some cases, cross-references it with non-public information.

“Internal documents show the technology creates a topography of a person’s entire social media existence, specifically looking at users’ posts as well as their connections, and how strong each of those relationships are.

“The software visualizes how a person’s direct connections are connected to each other, where all of those connections work, and any “indirect connections” (people with at least four mutual friends). Voyager also detects any indirect connections between a subject and other people the customer has previously searched for.”
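The "indirect connections" rule quoted above (people who share at least four mutual friends with a subject) is, at its core, a simple graph computation. The sketch below illustrates the mechanic; the names, data, and threshold are hypothetical assumptions for illustration, not Voyager's actual implementation.

```python
# Illustrative friendship graph: user -> set of direct connections.
# All names are hypothetical; the "4 mutual friends" threshold comes
# from the reporting quoted above.
friends = {
    "subject": {"a", "b", "c", "d", "e"},
    "x": {"a", "b", "c", "d"},   # shares 4 friends with subject
    "y": {"a", "b"},             # shares only 2
}

def indirect_connections(graph, person, threshold=4):
    """Find people not directly connected to `person` who share
    at least `threshold` mutual friends with them."""
    direct = graph[person]
    result = []
    for other, their_friends in graph.items():
        if other == person or other in direct:
            continue  # skip the subject and their direct connections
        mutual = direct & their_friends
        if len(mutual) >= threshold:
            result.append(other)
    return result

print(indirect_connections(friends, "subject"))  # ['x']
```

Even in this toy form, the pattern shows why critics call the model "guilt by association": a person is flagged purely by who their friends' friends are, not by anything they did.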

Meredith Broussard, a data journalism professor at New York University and author of “Artificial Unintelligence: How Computers Misunderstand the World,” likened Voyager’s systems to those used for online ad targeting.

Online ad targeting systems group people into “affinity groups” based on shared interests.

“So instead of grouping people into buckets like ‘pet owners’, what Voyager seems to be doing is putting people into ‘buckets’ of likely criminals,” Broussard explained.

She added: “It’s a ‘guilt by association’ system.”
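Broussard's ad-targeting analogy can be sketched as a toy bucketing routine. The interest labels and users below are hypothetical, chosen only to show the grouping mechanic she describes; in real ad targeting, interests would be inferred from browsing and posting behavior.

```python
from collections import defaultdict

# Hypothetical users and their declared interests.
users = {
    "alice": {"dogs", "hiking"},
    "bob": {"dogs", "cars"},
    "carol": {"hiking", "cooking"},
}

def affinity_groups(users):
    """Group users into buckets keyed by each shared interest."""
    buckets = defaultdict(set)
    for user, interests in users.items():
        for interest in interests:
            buckets[interest].add(user)
    return dict(buckets)

groups = affinity_groups(users)
print(sorted(groups["dogs"]))    # ['alice', 'bob']
print(sorted(groups["hiking"]))  # ['alice', 'carol']
```

The concern Broussard raises is that the same mechanic, with "likely criminal" as the bucket label, turns a marketing technique into a policing one.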

Voyager’s software supplements the publicly available data with information it acquires through warrants and subpoenas and what it calls an “active persona.”

The company obtains data such as a subject’s private text messages and location through warrants and subpoenas secured by law enforcement agencies.

John Hamasaki, a criminal defense lawyer and member of the police commission in San Francisco, said: “The degree to which private information is being seized, purportedly lawfully under search warrants, is just way over-broad.”

He added that the fact that the police can now analyze the data through AI technology provided by companies such as Voyager raises civil liberties and privacy concerns.

The documents do not contain many details on the so-called premium “active persona” service. The company states that clients can use “avatars” for the purposes of collecting and analyzing “information that is otherwise inaccessible” on several networks.

Voyager claims the service can be used to access encrypted information on Telegram, and a 2019 roadmap showed that it was planning to roll out the “active persona” feature on WhatsApp and Instagram.
