Use of algorithms for child protection puts us on path to injustice
The use of predictive analytics to determine whether children should be taken away from their parents presents frightening implications for families and our society (“Computers may spot abuse risks,” Page A1, Oct. 7). As with predictive analytics for policing, such a policy runs the risk of creating a feedback loop of injustice, disproportionately ensnaring the poor and parents of color. Across the nation, these parents face stricter scrutiny from government agencies, and are much more likely than their wealthier and whiter neighbors to face charges of so-called neglect for things such as letting their 10-year-old children play outside unsupervised.
Running these data through algorithms will not produce more equitable outcomes for children, and it risks harming them by removing them from their parents without just cause and placing them in potentially worse foster care situations.
Computer algorithms are no substitute for well-trained counselors with sufficient resources to match demand for services.