DOJ investigates AI tool allegedly biased against families with disabilities

Since 2016, social workers in one Pennsylvania county have relied on an algorithm to help them determine which child welfare calls warrant further investigation. Now, the Justice Department is reportedly investigating the controversial family screening tool over concerns that its use could violate the Americans with Disabilities Act by discriminating against families with disabilities, including families with mental health problems, the Associated Press reported.

Three sources, who spoke to the AP anonymously because they were breaking confidentiality agreements with the Justice Department, confirmed that civil rights lawyers have fielded complaints since last fall and have grown increasingly concerned about alleged biases built into the Allegheny County Family Screening Tool. While the full scope of the Justice Department’s alleged investigation is currently unknown, the Civil Rights Division is apparently interested in learning more about how use of the data-driven tool could harden historic systemic biases against people with disabilities.

The county describes its predictive risk modeling tool as a preferred resource for reducing human error for social workers, who benefit from the algorithm’s rapid analysis of “hundreds of data elements for each person involved in a child abuse allegation.” It includes “data points linked to disabilities in children, parents and other members of local households,” Allegheny County told the AP. These data points contribute to an overall risk score that helps determine whether a child should be removed from their home.
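The county has not published the model’s internals, but the mechanism it describes, many data elements weighted into a single score, can be illustrated in the abstract. The Python sketch below is purely hypothetical: every field name, weight, and scaling choice is invented for illustration and none comes from the actual tool.

```python
# Hypothetical sketch only: Allegheny County has not published its model's
# internals. These field names and weights are invented to illustrate the
# general shape of a predictive risk model that folds many per-person data
# elements into one score.

HYPOTHETICAL_WEIGHTS = {
    "prior_referrals": 2.5,          # invented weight
    "household_size": 0.3,           # invented weight
    "public_benefits_history": 1.1,  # invented weight
}

def screening_score(data_elements: dict[str, float]) -> float:
    """Combine normalized data elements (0-1) into one risk value."""
    raw = sum(HYPOTHETICAL_WEIGHTS.get(name, 0.0) * value
              for name, value in data_elements.items())
    # The real tool reportedly outputs scores on a 1-20 scale; the
    # clamping and scaling here are purely illustrative.
    return max(1.0, min(20.0, 1.0 + raw * 4.0))

print(screening_score({"prior_referrals": 0.8, "household_size": 0.5}))
```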

Although the county told the AP that social workers can override the tool’s recommendations and that the algorithm has been updated “several times” to remove disability-related data points, critics worry that the screening tool could still automate discrimination. This is especially troubling because the Pennsylvania algorithm has inspired similar tools used in California and Colorado, the AP reported. Oregon stopped using its family screening tool because of similar concerns that its algorithm could exacerbate racial biases in its child welfare data.

The Justice Department has yet to comment on its alleged interest in the tool, but the AP reported that the department’s scrutiny could potentially turn a moral argument against using child protection algorithms into a legal one.

A University of Minnesota child welfare and disability expert, Traci LaLiberte, told the AP that it is unusual for the Justice Department to get involved in child welfare issues. “It really has to rise to the level of pretty significant concern to dedicate time and get involved,” LaLiberte told the AP.

Ars could not immediately reach developers of the algorithm or the Allegheny County Department of Human Services for comment, but a county spokesman, Mark Bertolet, told the AP that the agency was not aware of the Justice Department’s interest in its screening tool.

Problems in predicting child maltreatment

On its website, Allegheny County said the family screening tool was developed in 2016 to “enhance our decision-making process for child welfare call screening with the singular goal of improving child safety.” That year, the county reported that before it adopted the algorithm, human error led child protective services to investigate 48 percent of the lowest-risk cases while overlooking 27 percent of the highest-risk cases. A 2016 external ethics review supported the county’s use of the algorithm as an “inevitably imperfect” but comparatively more accurate and more transparent method of assessing risk than relying on clinical judgment alone.

“We reasoned that by using technology to collect and weigh all available relevant information, we could improve the basis for these critical decisions and reduce variability in staff decision-making,” the county said on its website, pledging to continue refining the model as more analysis of the tool was carried out.

Although the county told the AP that risk scores alone never trigger investigations, the county’s website still says that “when the score is at the highest level, reaching the ‘mandatory screen in’ threshold, the allegations in a call must be investigated.” Because disability-related data points contribute to this score, critics suggest that families with disabilities are more likely to be targeted for investigation.
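The critics’ concern follows mechanically from that rule: if a score at or above some threshold forces an investigation, then any data point that raises a family’s score also raises the chance that the family crosses that line, regardless of a social worker’s discretion over lower-scored calls. A minimal hypothetical sketch; the county’s actual threshold is not public, and the value of 18 on the reported 1-20 scale is invented:

```python
# Hypothetical illustration of a "mandatory screen-in" rule. The actual
# threshold is not public; 18 is an invented value on the tool's
# reported 1-20 scale.

MANDATORY_SCREEN_IN_THRESHOLD = 18.0

def must_investigate(score: float) -> bool:
    """True when the score forces an investigation regardless of a
    social worker's discretion to override lower-scored calls."""
    return score >= MANDATORY_SCREEN_IN_THRESHOLD

base_score = 17.5                   # hypothetical score without one data point
with_extra_point = base_score + 1.0 # hypothetical contribution of that point
print(must_investigate(base_score))        # False: discretion still applies
print(must_investigate(with_extra_point))  # True: screen-in becomes mandatory
```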

The same year the family screening tool was introduced, the Christopher & Dana Reeve Foundation and the National Council on Disability published a toolkit to help parents with disabilities know their rights when fighting child welfare cases in the courts.

“For many of the 4.1 million parents with disabilities in the United States, the courts have decided that they are not good parents simply because they have a disability,” the organization wrote in the toolkit’s introduction. “In fact, as of 2016, 35 states still said that if you had a disability, you could lose your right to be a parent, even if you didn’t hurt or ignore your child.”

Allegheny County told the AP that “it should come as no surprise that parents with disabilities … may also need additional supports and services.” However, neither the county’s ethical analysis nor its FAQ directly discusses how the tool could disadvantage these families.

Ars could not reach LaLiberte for further comment, but she told the AP that her research has shown that parents with disabilities are already disproportionately targeted by child welfare agencies. She suggested that incorporating disability-related data points into the algorithm is inappropriate because it leads social workers to consider “characteristics that people cannot change,” rather than solely assessing problematic behaviors.
