Longterm Wiki

Obermeyer et al. (2019)

paper

Authors

Z. Obermeyer · B. W. Powers · C. Vogeli · S. Mullainathan

Credibility Rating

5/5
Gold (5)

Gold standard. Rigorous peer review, high editorial standards, and strong institutional reputation.

Rating inherited from publication venue: Science

Empirical study demonstrating racial bias in a widely-deployed clinical risk prediction algorithm, illustrating how AI systems can perpetuate societal inequities and the importance of algorithmic fairness auditing in high-stakes domains.

Paper Details

Citations
4,889
184 influential
Year
2019

Metadata

journal article · primary source

Abstract

Editor's summary: Racial bias in health algorithms. The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer et al. find evidence of racial bias in one widely used algorithm: Black patients assigned the same level of risk by the algorithm are sicker than White patients (see the Perspective by Benjamin). The authors estimate that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care. Science, this issue p. 447; see also p. 421.

Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: at a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses. Remedying this disparity would increase the percentage of Black patients receiving additional help from 17.7% to 46.5%. The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for Black patients than for White patients. Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise. We suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts.

Summary

Obermeyer et al. (2019) demonstrate significant racial bias in a widely used commercial health algorithm that affects millions of patients. The bias arises because the algorithm uses health care costs as a proxy for health needs, but due to unequal access to care, less money is spent on Black patients with equivalent health needs. This causes the algorithm to systematically underestimate illness severity in Black patients—at the same risk score, Black patients are considerably sicker than White patients. The authors show that reformulating the algorithm to directly predict health needs rather than costs eliminates this racial bias and would increase the percentage of Black patients identified for additional care from 17.7% to 46.5%.
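The proxy mechanism the paper describes can be sketched with a toy simulation (synthetic data, not the paper's dataset or model; the group labels "A"/"B", the 0.7 access factor, and all parameters are illustrative assumptions): when the same underlying need generates lower observed spending for one group, a selector ranking patients by cost will under-select that group, and the members who do clear the cost threshold will be sicker than their counterparts at the same score.

```python
import random

random.seed(0)

# Each synthetic patient has a true "need"; unequal access (assumed factor 0.7
# for group B) means the same need produces lower observed cost for group B.
def simulate_patient(group):
    need = random.uniform(0, 10)                  # true health need
    access = 1.0 if group == "A" else 0.7         # assumed access disparity
    cost = need * access + random.gauss(0, 0.5)   # observed spending
    return {"group": group, "need": need, "cost": cost}

patients = [simulate_patient(g) for g in ("A", "B") for _ in range(5000)]

# Mimic a cost-trained risk score: rank by cost, flag the top quartile for
# extra care.
patients.sort(key=lambda p: p["cost"], reverse=True)
selected = patients[: len(patients) // 4]

# Compare the selected groups: group B is under-represented among those
# flagged, and the group-B patients who are flagged are sicker on average.
avg_need = {
    g: sum(p["need"] for p in selected if p["group"] == g)
       / max(1, sum(1 for p in selected if p["group"] == g))
    for g in ("A", "B")
}
share_b = sum(1 for p in selected if p["group"] == "B") / len(selected)
print(f"share of group B among selected: {share_b:.2f}")
print(f"avg need of selected, by group: {avg_need}")
```

Despite identical need distributions in both groups, the cost-ranked cut admits far fewer group-B patients, and those admitted have higher average need — the qualitative pattern the paper reports at a given risk score.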

Cited by 2 pages

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 55 KB

## Racial bias in health algorithms

The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer _et al._ find evidence of racial bias in one widely used algorithm, such that Black patients assigned the same level of risk by the algorithm are sicker than White patients (see the Perspective by Benjamin). The authors estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care.

_Science_, this issue p. [447](https://doi.org/10.1126/science.aax2342); see also p. [421](https://doi.org/10.1126/science.aaz3873)

## Abstract

Health systems rely on commercial prediction algorithms to identify and help patients with complex health needs. We show that a widely used algorithm, typical of this industry-wide approach and affecting millions of patients, exhibits significant racial bias: At a given risk score, Black patients are considerably sicker than White patients, as evidenced by signs of uncontrolled illnesses. Remedying this disparity would increase the percentage of Black patients receiving additional help from 17.7 to 46.5%. The bias arises because the algorithm predicts health care costs rather than illness, but unequal access to care means that we spend less money caring for Black patients than for White patients. Thus, despite health care cost appearing to be an effective proxy for health by some measures of predictive accuracy, large racial biases arise. We suggest that the choice of convenient, seemingly effective proxies for ground truth can be an important source of algorithmic bias in many contexts.


## Supplementary Material

### Summary

Materials and Methods

Figs. S1 to S5

Tables S1 to S4

References ( [_46_](https://www.science.org/doi/10.1126/science.aax2342#R46)– [_51_](https://www.science.org/doi/10.1126/science.aax2342#R51))

### Resources

File: aax2342\_obermeyer\_sm.pdf

... (truncated, 55 KB total)
Resource ID: a2107d9d789b8124 | Stable ID: ZDE3ZTI3MD