
Algorithms are being used to help determine if kids should be taken from their parents

Concerning.


The limitations of today's algorithms in accurately handling context and nuance are evident even to a casual observer, most notably in the often blundering automated censorship on social networks.

Notwithstanding these limitations of the technology itself, and without even taking into account the intent of those behind it, social workers in the US have started relying on predictive algorithms to decide when to investigate parents suspected of child neglect.

The stakes are higher here than having a Facebook post deleted: these parents can eventually end up having their children removed. The algorithm, now in use in several US states and spreading, is described by the AP as "opaque," all the while providing social workers with statistical calculations on which to base their actions.

Other concerns raised about algorithms like the one used in Pennsylvania's Allegheny County, the subject of a Carnegie Mellon University study, include reliability, an entirely expected issue, as well as the effect of "hardening racial disparities in the child welfare system."

And while such tools are already in use or under consideration in California, Colorado, and Oregon, the problems that emerged from using algorithms in such sensitive scenarios have led authorities in Illinois to drop the idea of implementing them.

The university study, which focused on Allegheny County, concluded that social workers disagree with one in three of the risk scores the system churns out, while another piece of research found that the number of black children flagged for mandatory neglect investigation, compared to their white peers, is "disproportionate," and that this manifests as a pattern.

Allegheny's authorities, however, dismissed the study as hypothetical, saying at the same time that social workers have the final say in making decisions, regardless of the tool's output.

For the moment, the forms of neglect investigated with the help of the algorithm-powered tool are the quality of housing and a child's hygiene, but not sexual or other forms of physical abuse.

Given the types of issues explored when deciding whether to investigate, it is not surprising that the majority of the data collected concerns poor families.

“Families and their attorneys can never be sure of the algorithm’s role in their lives either because they aren’t allowed to know the scores,” reports the AP.
