British police forces have begun acquiring AI software from a US tech company that merges sensitive personal data, such as race, health, political views, religious beliefs, sexuality, and union membership, into a unified intelligence platform.
An internal memo from Bedfordshire Police, obtained through a freedom of information request, reveals plans to roll out the “Nectar” system beyond its pilot stage.
Developed in partnership with Palantir Technologies, Nectar draws together approximately 80 data streams, from traffic cameras to intelligence files, into a single platform. Its stated aim is to generate in-depth profiles of suspects and to support investigations involving victims, witnesses, and vulnerable groups, including minors.
The 34-page briefing shows that police leaders hope to extend the software’s deployment from Bedfordshire and the Eastern Region Serious Organised Crime Unit to a national scale, Liberty reported. It asserts the system could enhance crime prevention and protect at-risk individuals more effectively.
This move forms part of a broader government initiative to apply artificial intelligence across public services, including health and defense, often through private-sector partnerships such as this one.
However, the deployment of Nectar, which accesses eleven “special category” data types, has raised alarms among privacy advocates and some lawmakers. These categories include race, sexual orientation, political opinions, and trade union membership.
While Palantir and Bedfordshire Police emphasize that Nectar only uses information already held in existing law enforcement databases and remains inaccessible to non-police personnel, concerns are mounting over potential misuse, including data being retained without proper deletion processes and the risk that innocent individuals could be flagged by algorithms designed to identify criminal networks.
Former Shadow Home Secretary David Davis voiced alarm to The i Paper, calling for parliamentary scrutiny and warning that “zero oversight” might lead to the police “appropriating the powers they want.”
Liberty and other campaigners have also questioned whether Nectar effectively constitutes a mass surveillance tool, capable of assembling detailed “360-degree” profiles on individuals.
In response, a Bedfordshire Police spokesperson stated the initiative is an “explorative exercise” focused on lawfully sourced, securely handled data.
They argue the system accelerates case processing and supports interventions in cases of abuse or exploitation, especially those involving children. Palantir added that within the first eight days of deployment, Nectar helped identify over 120 young people potentially at risk and helped process Clare’s Law disclosure applications.
Palantir, which built Nectar using its Foundry data platform, insists its software does not introduce predictive policing or racial profiling and does not add data beyond what police already collect. The firm maintains that its role is confined to data organization, not decision-making.
Still, experts express deep unease.
Although national rollout has not yet been authorized, the Home Office confirms that results from the pilot will inform future decisions. With private-sector AI tools embedded more deeply into policing, questions about oversight, transparency, data deletion, and individual rights loom ever larger.