
British Police Test AI System to Profile Individuals Using Sensitive Data From 80 Sources

A powerful new police tool quietly turns everyday data into full-spectrum portraits of people's lives.



British police forces have begun acquiring AI software from a US tech company that merges sensitive personal data, such as race, health, political views, religious beliefs, sexuality, and union membership, into a unified intelligence platform.

An internal memo from Bedfordshire Police, obtained through a freedom of information request, reveals plans to roll out the “Nectar” system beyond its pilot stage.


Developed in partnership with Palantir Technologies, Nectar draws together approximately 80 data streams, from traffic cameras to intelligence files, into a single platform. Its stated aim is to generate in-depth profiles of suspects and to support investigations involving victims, witnesses, and vulnerable groups, including minors.

The 34-page briefing shows police leadership hopes to extend the software’s deployment from Bedfordshire and the Eastern Region Serious Organised Crime Unit to a national scale, Liberty reported. It asserts the system could enhance crime prevention efforts and better protect at-risk individuals.

Screenshot: Bedfordshire Police’s Data Protection Impact Assessment (DPIA) for the Palantir Foundry platform (Nectar), describing the project’s goal of supporting multiple police units and eventually operating nationally to protect vulnerable people by preventing, detecting, and investigating crime. The document lists the special category data in scope, including race, ethnic origin, political opinions, philosophical beliefs, religion, trade union membership, genetic data, biometric data, health, sex life, and sexual orientation, and names the data subjects involved: persons suspected or convicted of criminal offences, victims, witnesses, children or vulnerable individuals, and employees.

This move forms part of a broader governmental initiative to apply artificial intelligence across public services, including health and defense, often through private-sector partnerships such as this one.

However, the deployment of Nectar, which accesses eleven “special category” data types, has raised alarms among privacy advocates and some lawmakers. These categories include race, sexual orientation, political opinions, and trade union membership.

While Palantir and Bedfordshire Police emphasize that Nectar only uses information already held in existing law enforcement databases and remains inaccessible to non-police personnel, concerns are mounting over potential misuse, including data being retained without proper deletion processes and the risk that innocent individuals could be flagged by algorithms designed to identify criminal networks.

Screenshot: DPIA checklist of the special category data to be used in the proposal, with Race, Ethnic origin, Political opinions, Sex life, Religion, Trade union membership, Genetic data, Biometric data, Sexual orientation, and Health selected, and Philosophical beliefs and None left unticked.

Former Shadow Home Secretary David Davis voiced alarm to the i Paper, calling for parliamentary scrutiny and warning that “zero oversight” could lead to the police “appropriating the powers they want.”

Liberty and other campaigners have also questioned whether Nectar effectively constitutes a mass surveillance tool, capable of assembling detailed “360-degree” profiles on individuals.

In response, a Bedfordshire Police spokesperson stated the initiative is an “explorative exercise” focused on lawfully sourced, securely handled data.

They argue the system accelerates case processing and supports interventions in abuse or exploitation, especially among children. Palantir added that within the first eight days of deployment, Nectar helped identify over 120 young people potentially at risk and facilitated the application of Clare’s Law notifications.

Palantir, which built Nectar using its Foundry data platform, insists its software does not introduce predictive policing or racial profiling and does not add data beyond what police already collect. The firm maintains that its role is confined to data organization, not decision-making.

Still, experts express deep unease.

Although national rollout has not yet been authorized, the Home Office confirms that results from the pilot will inform future decisions. With private-sector AI tools embedded more deeply into policing, questions about oversight, transparency, data deletion, and individual rights loom ever larger.
