Tuesday, May 03, 2022

An algorithm used by various U.S. jurisdictions that screens for child neglect raises concerns. Really!

What matters more? How many children can be saved? It is a basic fact that most children cannot protect themselves from prolonged neglect. They depend on adults to report it or to intervene on their behalf.

Congratulations to the AP (the non-profit Associated Press news agency) for such an extensive, comprehensive article on this subject!

Again, so-called "racial disparities" are used to demonize a possibly useful decision-support tool! Of course, such tools should be used with caution, and humans need to be in the loop. However, most of the concerns stressed by the article below can be handled easily, I suspect. I bet these cited comparisons with white children are flawed, as is so often the case!

What if there are peculiar reasons why black American children are reported more often than white children? Instead of reaching for the always convenient and easy demagoguery of systemic racism and the like, perhaps the reasons should be researched!

"... From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.
According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. ...
Child welfare officials in Allegheny County ... say the cutting-edge tool – which is capturing attention around the country – uses data to support agency workers as they try to protect children from neglect. That nuanced term can include everything from inadequate housing to poor hygiene, but is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.
“Workers, whoever they are, shouldn’t be asked to make, in a given year, 14, 15, 16,000 of these kinds of decisions with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in implementing the predictive child welfare algorithm. ...
Incidents of potential neglect are reported to Allegheny County’s child protection hotline. The reports go through a screening process where the algorithm calculates the child’s potential risk and assigns a score. Social workers then use their discretion to decide whether to investigate.
The Allegheny Family Screening Tool is specifically designed to predict the risk that a child will be placed in foster care in the two years after they are investigated. Using a trove of detailed personal data collected from birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets, the algorithm calculates a risk score of 1 to 20: The higher the number, the greater the risk. ..."
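
To make the quoted screening workflow concrete, here is a minimal, purely hypothetical Python sketch of the general technique the article describes: weighted counts drawn from government records mapped onto a clamped 1-to-20 score, a cutoff that triggers a "mandatory" screen-in, and the kind of per-group flag-rate comparison the Carnegie Mellon researchers performed. Every feature name, weight, and threshold below is invented for illustration; the actual Allegheny Family Screening Tool's model, inputs, and cutoff are not published in this form.

```python
# Hypothetical sketch of a screening tool like the one described above.
# All feature names, weights, and thresholds are invented for illustration;
# they are NOT the Allegheny Family Screening Tool's actual model.

from dataclasses import dataclass

@dataclass
class Referral:
    """One hotline report, with features drawn from government records."""
    prior_referrals: int      # earlier neglect reports on file (assumed feature)
    medicaid_flags: int       # e.g., behavioral-health claims (assumed feature)
    jail_probation_hits: int  # parental justice-system contacts (assumed feature)
    group: str                # demographic group label, for the disparity check

def risk_score(r: Referral) -> int:
    """Map record counts to a 1-20 risk score (invented weights)."""
    raw = 2 * r.prior_referrals + r.medicaid_flags + 3 * r.jail_probation_hits
    return max(1, min(20, 1 + raw))  # clamp onto the 1-to-20 scale

MANDATORY_THRESHOLD = 17  # hypothetical cutoff for a "mandatory" screen-in

def screened_in(r: Referral) -> bool:
    return risk_score(r) >= MANDATORY_THRESHOLD

def screen_in_rates(referrals: list[Referral]) -> dict[str, float]:
    """Share of referrals flagged per group, in the spirit of the CMU analysis."""
    totals: dict[str, int] = {}
    flagged: dict[str, int] = {}
    for r in referrals:
        totals[r.group] = totals.get(r.group, 0) + 1
        flagged[r.group] = flagged.get(r.group, 0) + int(screened_in(r))
    return {g: flagged[g] / totals[g] for g in totals}

if __name__ == "__main__":
    reports = [
        Referral(prior_referrals=4, medicaid_flags=3, jail_probation_hits=2, group="A"),
        Referral(prior_referrals=0, medicaid_flags=1, jail_probation_hits=0, group="B"),
    ]
    for r in reports:
        print(r.group, risk_score(r), screened_in(r))
    print(screen_in_rates(reports))
```

Even if such a score were equally accurate for every group, the screen-in rates it produces can still diverge whenever the underlying records (prior referrals, Medicaid claims, justice-system contacts) are themselves unevenly distributed across groups; that is precisely what a rate comparison like screen_in_rates surfaces.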

An algorithm that screens for child neglect raises concerns | AP News
