There have been attempts to use predictive algorithms all over the place: from football scouting to TV streaming services, to law enforcement.
And once again, an effort to deploy machine learning to predict, or solve, a real-world problem has failed – much like the algorithms used for online moderation that turn into censorship.
This time, it happened at the Los Angeles Police Department (LAPD), which admitted after eight years of using the PredPol software tool that it was not really of much use.
PredPol – produced by a company of the same name, at a cost of $60,000 apiece – was supposed to help the city's police do their job better by using data on past crimes to predict future ones, and thus reduce crime.
And although the use of the software had spread to other cities in the US after LAPD's “revolutionary adoption” of the tech in 2010 – there's not much evidence that the tool has worked as expected.
A spokesperson for the Palo Alto police said PredPol was ineffective and didn't actually help them solve crimes. Some of the other police departments that adopted the software have since dropped it.
But could this failure to perform really be the only, or even the main, problem with PredPol and the idea behind it?
In a post on the subject, Zero Hedge quotes LA Times and RT reports and takes issue with the police treating the lack of effectiveness as the only problem here.
This approach leaves out completely the privacy-undermining, surveillance state implications of using this type of technology, which the blog equated to Orwellian-style tracking, monitoring, and thought-crime policing.
Both original reports mention that rights groups have voiced concerns about law enforcement agencies turning to this kind of solution. The LA Times notes that privacy and liberty advocates are particularly worried it may result in “heavier policing of black and Latino communities.”
PredPol is designed to take ten years' worth of crime data, and then “feed it into the algorithm by type of crime, date, time, and location, in order to predict the next 12 hours,” RT said.
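PredPol's actual model is proprietary, so the following is purely an illustrative sketch, not the company's method: one naive way to turn historical crime records (type, date, time, location) into a short-horizon "hot spot" prediction is to rank map grid cells by a recency-weighted count of past incidents. All names and parameters here (the grid cells, the half-life, the scoring) are hypothetical.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def predict_hotspots(crimes, now, half_life_days=30.0, top_n=3):
    """Rank grid cells by recency-weighted crime counts.

    crimes: list of (cell_id, crime_type, timestamp) tuples.
    Returns the top_n cell ids flagged as hot spots for the
    next patrol window (e.g. 12 hours).
    """
    scores = defaultdict(float)
    for cell, _crime_type, ts in crimes:
        age_days = (now - ts).total_seconds() / 86400.0
        # Exponentially decay each incident's weight with age,
        # so recent crimes dominate the score.
        scores[cell] += 0.5 ** (age_days / half_life_days)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Toy history: recent incidents in cell_17, an old one in cell_04.
now = datetime(2020, 4, 21, 8, 0)
history = [
    ("cell_17", "burglary", now - timedelta(days=1)),
    ("cell_17", "theft",    now - timedelta(days=3)),
    ("cell_04", "burglary", now - timedelta(days=200)),
    ("cell_09", "assault",  now - timedelta(days=10)),
]
print(predict_hotspots(history, now, top_n=2))
# → ['cell_17', 'cell_09']
```

The critics' point stands regardless of the specific math: whatever the model, it can only redistribute patrols toward places that were already heavily policed in the historical data, which is exactly the feedback loop the rights groups warn about.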
PredPol, the company, meanwhile defended its product and blamed the police officers who used it for its failure to produce any useful “crime predicting” data.