Fight censorship and surveillance. Reclaim your digital freedom.

Get news updates, features, and alternative tech explorations to defend your digital rights.

German States Expand Police Powers to Train AI Surveillance Systems with Personal Data

Germany’s new wave of police laws turns surveillance training into a state-sanctioned data experiment where privacy becomes optional.


Several German states are preparing to widen police powers by allowing personal data to be used in the training of surveillance technologies.

North Rhine-Westphalia and Baden-Württemberg are introducing legislative changes that would let police feed identifiable information such as names and facial images into commercial AI systems.

Both drafts permit this even when anonymization or pseudonymization is bypassed because the police consider it “impossible” or achievable only with “disproportionate effort.”

Hamburg adopted similar rules earlier this year, and its example appears to have encouraged other regions to follow. These developments together mark a clear move toward normalizing the use of personal information as fuel for surveillance algorithms.

The chain reaction began in Bavaria, where police in early 2024 tested Palantir’s surveillance software with real personal data.

The experiment drew objections from the state’s data protection authority, but still served as a model for others.

Hamburg used the same idea in January 2025 to amend its laws, granting permission to train “learning IT systems” on data from bystanders. Now Baden-Württemberg and North Rhine-Westphalia plan to adopt nearly identical language.

In North Rhine-Westphalia, police would be allowed to upload clear identifiers such as names or faces into commercial systems like Palantir’s and to refine behavioral or facial recognition programs with real, unaltered data.

Bettina Gayk, the state’s data protection officer, warned that the proposed regulation raises “significant constitutional concerns.”

She argued that using data from people listed as victims or complainants was excessive and added that “products from commercial providers are improved with the help of state-collected and stored data,” which she found unacceptable.

The state government has embedded this expansion of surveillance powers into a broader revision of the Police Act, a change initially required by the Federal Constitutional Court.

The court had previously ruled that long-term video monitoring under the existing law violated the Basic Law.

Instead of narrowing these powers, the new draft introduces a clause allowing police to “develop, review, change or train IT products” with personal data.

This wording effectively enables continued use of Palantir’s data analysis platform while avoiding the constitutional limits the court demanded.

Across North Rhine-Westphalia, Baden-Württemberg, and Hamburg, the outcome will be similar: personal data can be used for training as soon as anonymization is judged to be disproportionately difficult, with the assessment left to police discretion.

Gayk has urged that the use of non-anonymized data be prohibited entirely, warning that the exceptions are written so broadly that “they will ultimately not lead to any restrictions in practice.”

Baden-Württemberg’s green-black coalition plans to pass its bill this week.

It would permit police to train and test software such as Palantir’s, for which the state has already paid more than €25 million ($29 million), using citizens’ personal data regardless of whether those citizens are suspects.

The proposed law states that the data may be processed “for the development, testing, and validation” of information technology systems.

Tobias Keber, the state’s data protection officer, has called for explicit checks to ensure that anonymization is truly impossible before allowing identifiable data to be used.

He also argued that his office should be involved early in the process to prevent abuse.

Hamburg has already implemented a system allowing its police to train AI models on real names and facial data and to share that information with third parties when anonymization is considered too burdensome.

The change aligns with the city’s broader effort to automate video monitoring.

A pilot project launched in 2023 used AI to scan public cameras for “atypical behavior” such as fights or clusters of onlookers.

Interior Senator Andy Grote stated, “Experience to date shows that thanks to the software, we become aware of dangerous situations very early on and can intervene immediately.”

Despite that claim, the test generated only one criminal case.

Hamburg’s data protection authority accepts that video surveillance using real data could be legally possible but warns that the law’s vague language, especially around “disproportionate effort,” leaves far too much room for interpretation. It also cautions against growing dependence on commercial surveillance providers.

Across these states, Germany’s long-standing privacy principles are being quietly rewritten.

The police are gaining the ability to feed real personal data into private AI systems, turning ordinary citizens into raw material for technology development.

If you’re tired of censorship and surveillance, join Reclaim The Net.
