Friday, June 10, 2022

Oregon Child Welfare Officials Drop AI Tool That Screens for Child Neglect, Citing Bias

Skin color bias has become a very convenient and frequent excuse!

Disparate or disproportionate impact is an ambiguous, if not controversial, notion when applied to skin color!

What matters more, the skin color of a child or whether a child is potentially neglected?
Are there no humans in the loop? Can humans in the loop not handle these issues? Of course, they can!

Is it impossible that different family structures, behaviors, and relations exist or are observable among people of different backgrounds? If they exist, is it possible that some have more positive effects on child welfare while others have negative ones?

"From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time. ..."

The Batch: Top 100 AI Startups, Inside the Mind of DALL·E 2, Child Welfare Officials Drop AI, Fresh Images From Cellular Automata
