Longterm Wiki

Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say

web

This ProPublica article covers the mathematical impossibility of simultaneously satisfying multiple fairness criteria in predictive algorithms — a foundational insight for AI fairness and a key real-world case study in algorithmic harm relevant to AI governance and deployment discussions.

Metadata

Importance: 72/100 · news article · news

Summary

ProPublica reports on independent research from multiple top universities showing that racial bias in recidivism prediction algorithms like COMPAS is mathematically unavoidable given base rate differences between groups. Four independent research groups found that it is impossible to simultaneously satisfy multiple common fairness criteria, meaning any such algorithm will inevitably disadvantage Black defendants in some measurable way.

Key Points

  • Four independent research groups (Stanford, Cornell, Harvard, CMU, UChicago, Google) all concluded that simultaneous fairness across multiple criteria is mathematically impossible when base rates differ between groups (the underlying constraint is sketched as an identity after this list).
  • COMPAS was found to mislabel Black defendants as high-risk at twice the rate of white defendants, while white low-risk defendants reoffended more often than Black counterparts with the same scores.
  • Northpointe defended COMPAS by noting equal accuracy rates (~60%) across races, but researchers showed equal accuracy and equal error distribution are mutually exclusive fairness goals.
  • This work formalized what became known as the 'impossibility of fairness' — a fundamental constraint in algorithmic fairness with broad implications beyond criminal justice.
  • Researchers suggested modest formula revisions could reduce unfair categorization of Black defendants without significantly reducing predictive accuracy.
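
For intuition about why these criteria collide, the constraint can be written as a single confusion-matrix identity. The formulation below follows Chouldechova's framing (one of the research groups discussed) rather than anything stated verbatim in the article, with p a group's base rate of reoffense, PPV the positive predictive value, and FPR/FNR the false positive and false negative rates:

```latex
% From the confusion matrix: TP = p(1-FNR), FP = (1-p)FPR, so
%   PPV = p(1-FNR) / ( p(1-FNR) + (1-p)FPR ).
% Rearranging gives an identity tying the error rates to the base rate p:
\[
  \mathrm{PPV} = \frac{p\,(1-\mathrm{FNR})}{p\,(1-\mathrm{FNR}) + (1-p)\,\mathrm{FPR}}
  \quad\Longrightarrow\quad
  \mathrm{FPR} = \frac{p}{1-p}\cdot\frac{1-\mathrm{PPV}}{\mathrm{PPV}}\cdot\bigl(1-\mathrm{FNR}\bigr)
\]
% If two groups share the same PPV (calibration) and the same FNR but have
% different base rates p, the identity forces their FPRs to differ.
```

If two groups are scored with equal PPV and equal FNR but have different base rates, the identity forces their false positive rates apart — which is the kind of disparity ProPublica measured.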

Cited by 1 page

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 14 KB

[**Series: Machine Bias: Investigating Algorithmic Injustice**](https://www.propublica.org/series/machine-bias)


The racial bias that ProPublica found in a formula used by courts and parole boards to forecast future criminal behavior arises inevitably from the test’s design, according to new research.

The findings were described in scholarly papers published or circulated over the past several months. Taken together, they represent the most far-reaching critique to date of the fairness of algorithms that seek to provide an objective measure of the likelihood a defendant will commit further crimes.

Increasingly, criminal justice officials are using similar risk prediction equations to inform their decisions about bail, sentencing and early release.

The researchers found that the formula, and others like it, have been written in a way that guarantees black defendants will be inaccurately identified as future criminals more often than their white counterparts.

The studies, by four groups of scholars working independently, suggest the possibility that the widely used algorithms could be revised to reduce the number of blacks who were unfairly categorized without sacrificing the ability to predict future crimes.

The author of one of the papers said that her ongoing research suggests that this result could be achieved through a modest change in the workings of the formula ProPublica studied, which is known as COMPAS.

An article published earlier this year by ProPublica focused attention on possible racial biases in the COMPAS algorithm. We collected the COMPAS scores for more than 10,000 people arrested for crimes in Florida's Broward County and checked to see how many were charged with further crimes within two years.

## [Machine Bias](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)

There’s software used across the country to predict future criminals. And it’s biased against blacks. [Read the story.](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)

When we looked at the people who did not go on to be arrested for new crimes but were dubbed higher risk by the formula, we found a racial disparity. The data showed that black defendants were twice as likely as white defendants to be incorrectly labeled as higher risk. Conversely, white defendants labeled low risk were far more likely to end up being charged with new offenses than blacks with comparably low COMPAS risk scores.
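
ProPublica released its Broward County data publicly, and a minimal sketch of the group-wise error-rate comparison described above might look like the following. The file name, the column names (race, decile_score, two_year_recid), and the "higher risk" cutoff at a decile score of 5 are illustrative assumptions, not details taken from this article.

```python
import pandas as pd

# Load a table of defendants with their COMPAS decile scores and a flag for
# whether they were charged with a new crime within two years (assumed schema).
df = pd.read_csv("compas-scores-two-years.csv")

# Treat decile scores of 5 and above as "higher risk" (assumed cutoff).
df["predicted_high_risk"] = df["decile_score"] >= 5

for race in ["African-American", "Caucasian"]:
    group = df[df["race"] == race]
    # False positive rate: labeled higher risk but not charged with a new crime.
    no_recid = group[group["two_year_recid"] == 0]
    fpr = no_recid["predicted_high_risk"].mean()
    # False negative rate: labeled lower risk but charged with a new crime.
    recid = group[group["two_year_recid"] == 1]
    fnr = (~recid["predicted_high_risk"]).mean()
    print(f"{race}: FPR={fpr:.2f}, FNR={fnr:.2f}")
```

If the assumptions match ProPublica's published analysis, the computed false positive rate for black defendants should come out roughly twice that for white defendants, with the false negative pattern running the other way.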

Northpointe, the company that sells COMPAS, said in response that the test was racially neutral. To support that assertion, company officials pointed to another o

... (truncated, 14 KB total)
Resource ID: 3bd4b29e4c338882 | Stable ID: MDg0YjFmMT