Predictive analytics in child welfare raise concerns

A contentious debate on the value of using predictive analytics within the child welfare system is unfolding between proponents who say the technology removes biases and helps protect kids, and opponents who say it has the power to amplify racial and income inequalities.

The conversation on predictive analytics, the use of large data sets and algorithms to forecast the likelihood of future events, is important because the decisions child welfare agencies make based on this technology ultimately determine whether families are broken up and children are placed in foster care.

The child welfare system has often been criticized for the harm it causes communities of color. Studies show that families of color are disproportionately represented in the child welfare system and that Black children are the most likely to be investigated by child protective services. For many advocates, the child welfare system is the epitome of structural racism.

Michael Lindsey, executive director of the NYU McSilver Institute for Poverty Policy and Research at NYU Silver School of Social Work, takes a measured approach, calling predictive analytics a “double-edged sword.” The institute recently hosted a webinar on predictive analytics featuring a range of speakers and opinions.

“They can be transformational at helping us to identify people whose health and well-being are at risk,” Lindsey said. “Yet they also can perpetuate, or even magnify, structural inequities by relying on data sets and assumptions that already reflect societal biases.”

Some are not only vehemently opposed to the use of predictive analytics; they say the entire child welfare system should be abolished. To them, predictive analytics is just the latest example in a long history of the state separating and harming families of color.

In the other camp, proponents argue children are regularly harmed or placed in harm’s way by family and need protection. The current system relies on outdated technology where decisions are made by people who are susceptible to bias. With predictive analytics, they say, bias can be mitigated and the scarce resources of the child welfare agencies can be used more efficiently. 

“[Predictive analytics] is not intended to be punitive,” said Emily Putnam-Hornstein, Director of Policy Practice at the UNC-Chapel Hill School of Social Work and co-director of the Children’s Data Network. “There are children in our communities who need protection.”

Putnam-Hornstein has spent much of her career using data to research the child welfare system and has helped create predictive analytic models. She says data can be used to improve decisions in the child welfare system and remove bias. 

“The goal here is to improve the system,” Putnam-Hornstein said during the webinar, adding that agencies use out-of-date “rudimentary risk assessment tools” that are 20 years old or more.

Khadijah Abdurahman, a research fellow at Columbia University and the AI Now Institute at UCLA, disagreed with Putnam-Hornstein, pointing out that some of the data predictive analytics relies on comes from law enforcement.

“It uses data from the police about families,” Abdurahman said. “It doesn’t use data about the police abusing and murdering Black children.”

Abdurahman added that predictive analytics removes decision-making from frontline caseworkers who are often from the communities impacted by the child welfare system.   

The American Civil Liberties Union has even waded into the argument, releasing a report in September titled “Family Surveillance by Algorithm: The Rapidly Spreading Tools Few Have Heard Of.” The report detailed the spread of predictive analytics across the country and called for independent evaluations of the technology to ensure it isn’t exacerbating racial inequalities, breaking up families, or causing children to unnecessarily enter the foster care system.

According to the ACLU, child welfare agencies in at least 11 states currently use predictive analytics, and agencies in more than half of all states have considered it. The report added that several states adopted such tools but have since discontinued using them.

New York City, Oregon, and Allegheny County, Pa., have been using predictive analytics for several years. Los Angeles, which has been exploring its use for seven years, launched a predictive analytics tool this year to help sort through cases and flag high-risk ones for review.

Child welfare agencies can apply predictive analytics in various ways; screening is among the most common uses.

For example, ACLU attorney and author of the report Anjana Samant said the tool in Allegheny County “is used to help decide whether a call or complaint to a child welfare agency or hotline should be dismissed outright or followed up on by the agency.” 

Samant added that the tool, known as the Allegheny Family Screening Tool, uses data pulled from a host of sources on risk factors such as past substance use, criminal activity, or prior encounters with child welfare to predict the odds that a child will be removed from the home and placed in the child welfare system within two years of the complaint.
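The county’s model itself is not reproduced here. As a rough, hypothetical illustration of how a screening score of this general kind can be built, the sketch below trains a simple logistic-regression classifier on synthetic referral records, using invented feature names such as prior_referrals and substance_use_flag, and outputs a predicted probability that a complaint ends in removal within two years. The features, data, and numbers are illustrative assumptions and do not describe the actual Allegheny Family Screening Tool.

```python
# Illustrative sketch only: a toy "screening score" model on synthetic data.
# Feature names, data, and outputs are invented for illustration and do not
# reflect the real Allegheny Family Screening Tool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic referral records: each row stands for one hotline call/complaint.
# Columns: prior_referrals, substance_use_flag, prior_criminal_record,
#          prior_child_welfare_contact
n = 1000
X = np.column_stack([
    rng.poisson(1.0, n),        # prior_referrals
    rng.integers(0, 2, n),      # substance_use_flag
    rng.integers(0, 2, n),      # prior_criminal_record
    rng.integers(0, 2, n),      # prior_child_welfare_contact
])

# Synthetic outcome: 1 if the child was removed from the home within two
# years of the complaint, 0 otherwise (generated at random here).
y = rng.integers(0, 2, n)

# Fit a simple classifier that maps risk factors to a removal probability.
model = LogisticRegression().fit(X, y)

# Score a new complaint: the model returns the predicted odds of removal.
new_case = np.array([[3, 1, 0, 1]])  # hypothetical referral
prob = model.predict_proba(new_case)[0, 1]
print(f"Predicted probability of removal within two years: {prob:.2f}")
```

In a deployment like the one Samant describes, a probability of this sort would be one input among many for the screener deciding whether a call is dismissed or followed up on.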

The ACLU has identified other common uses, including flagging “hot spots,” or geographic areas deemed at greatest risk of child maltreatment; reviewing open cases; and predicting reentry into the child welfare system.

Abdurahman, who identifies as a biracial Black American, said the technology not only targets communities of color; the entire child welfare system, she argued, should be abolished.

“Just like they were separating us at the auction block; just like they were separating us in the slave ship; just like the way they are separating us on Rikers Island is the way that they are separating us through machine learning and artificial intelligence,” said Abdurahman. 
