Jeffrey Reynolds

Can Big Data Save Little Kids? Don't Bet On It.


In a case that’s garnered national attention, ex-NYPD cop Michael Valva and his former fiancée, Angela Pollina, will face a judge and jury at the end of this month.


The two are accused of murdering Thomas Valva, the officer’s autistic 8-year-old son, in January 2020, by forcing him to sleep on the freezing garage floor of their Center Moriches home. They’re also accused of abusing Thomas’ 10-year-old brother, Anthony.


Valva and Pollina have both denied the charges. But they aren’t the only ones on trial here.

Thomas Valva’s preventable death turned a harsh spotlight on Suffolk County’s Department of Social Services. The child’s grieving mother and teachers said they repeatedly phoned in maltreatment reports and begged – in vain – for Suffolk Child Protective Services caseworkers to help.


While child-abuse records are highly confidential, the case lifted the curtain on the inner workings of CPS. In response, Suffolk’s bipartisan CPS Transformation Act mandated several reforms, including strict caseload standards.


A recent Newsday report, however, found that during 2021, almost 60 percent of the county’s CPS workers had average caseloads exceeding the 12-case-per-month cap set by the 2020 law. Some workers handled up to 26 cases at once, according to the report.


Suffolk officials blamed COVID-related staff shortages and said they’re working to reduce caseloads. But hiring in this job market is tough and new staff must be properly trained.

Meanwhile, Suffolk gets nearly 9,000 reports of child abuse and neglect each year. Nassau averages closer to 7,000.


The stakes in these cases are high – missing signs of abuse or neglect could end with a child’s death – and even the most experienced caseworkers make mistakes. Flagging a family, on the other hand, subjects it to scrutiny that can lead to the traumatic removal of children, termination of parental rights and horrific foster-care placements.


That’s why some child-protection agencies around the country have turned to algorithm-assisted decision-making. Allegheny County, Pa., led the way in 2016, developing a computer program to stratify family risk and help social workers decide which allegations of abuse and neglect should be investigated most quickly and thoroughly.


It sounds good. But new research from Carnegie Mellon University found that Allegheny’s Family Screening Tool flagged 68 percent of the Black children it assessed for potential neglect investigations, compared with only 50 percent of White children.


The study also revealed that trained social workers disagreed with the computerized risk scores a whopping one-third of the time. And when they overrode the system – using augmented, rather than automated, decision-making – the racial disparity dropped to 7 percent.
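To make the “augmented versus automated” distinction concrete, here is a minimal, purely hypothetical sketch – not Allegheny’s actual tool, whose variables and formulas are secret, and with every feature, weight and threshold invented for illustration: a model turns a referral into a risk score, and a trained caseworker can accept or override that score before a case is screened in for investigation.

```python
# A minimal, purely hypothetical sketch of "augmented" (human-in-the-loop)
# screening versus fully automated screening. This is NOT Allegheny County's
# Family Screening Tool -- its variables and formulas are secret. Every
# feature, weight, and threshold below is invented for illustration only.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Referral:
    prior_reports: int        # hypothetical: earlier hotline reports on the family
    benefit_touchpoints: int  # hypothetical: contacts with public-benefit systems
    prior_cps_contacts: int   # hypothetical: previous CPS involvement


def model_score(r: Referral) -> float:
    """Toy risk score in [0, 1]; real tools use far richer models and data."""
    raw = (0.4 * min(r.prior_reports, 5) / 5
           + 0.3 * min(r.benefit_touchpoints, 10) / 10
           + 0.3 * min(r.prior_cps_contacts, 5) / 5)
    return round(raw, 2)


def screen_in(r: Referral, worker_override: Optional[bool] = None,
              threshold: float = 0.5) -> bool:
    """Automated mode follows the score alone; augmented mode lets a trained
    caseworker accept or reverse the algorithmic recommendation."""
    algorithmic = model_score(r) >= threshold
    return algorithmic if worker_override is None else worker_override


# Example: the model flags the family, but a caseworker screens it out.
referral = Referral(prior_reports=3, benefit_touchpoints=8, prior_cps_contacts=1)
print(model_score(referral))                        # 0.54
print(screen_in(referral))                          # True  (automated)
print(screen_in(referral, worker_override=False))   # False (augmented)
```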


That’s still not great, but to put things in context: more than half of all Black children in California are subjected to a CPS investigation based purely on human judgment – a pattern consistently repeated around the nation, with Black children often investigated at twice the rate of White children.


The data variables and formulas Allegheny uses are kept secret, though they certainly include information from sister government agencies. Families who receive food, money, housing, healthcare, drug and alcohol counseling and/or mental-health services from government agencies will have a bigger electronic footprint than those who leverage second mortgages to finance ritzy drug rehab in Malibu.


Public Citizen, a Washington-based nonprofit consumer-advocacy group, says “black box” algorithms with undisclosed data sources and decision-making rules can exacerbate economic, social and racial injustice.


But isn’t data, by its very nature, sterile and unbiased?


Critics say that financial metrics, criminal-justice intel, health records and the like cannot be race-neutral, because they draw on data shaped by generations of discrimination – bias that invariably gets baked into the equation. Others point out that algorithms can also carry the biases of their builders, developers and users.


Nassau County Department of Social Services Commissioner Nancy Nunziata said her CPS staff doesn’t use an algorithm because of the potential for disproportionate impact, nor does she believe there is any way to replace human intervention with technology.


Suffolk County Department of Social Services Commissioner Frances Pierre didn’t respond to inquiries about whether her department uses artificial intelligence to predict risk.


The upcoming trial will likely get into that – and raise even more questions about how Pierre’s staff, and everyone who interacts with endangered kids, can do a better job of preventing tragedy.


There are no easy answers. But we owe it to Thomas Valva to keep asking these questions.


A version of this article was first published by InnovateLI.


