Longterm Wiki

2025 systematic review in npj Digital Medicine

paper

Authors

Ivan Vecchio · Lucas Mifsud · Sofia Castro e Almeida · Johannes Passecker

Credibility Rating

5/5
Gold (5)

Gold standard. Rigorous peer review, high editorial standards, and strong institutional reputation.

Rating inherited from publication venue: Nature

Systematic review examining algorithmic bias in 690 clinical decision instruments, identifying critical sources of bias including demographic skew, geographic concentration, and outcome definition issues—directly relevant to AI safety concerns around fairness and bias in deployed healthcare systems.

Paper Details

Citations
0
Year
2025
Methodology
peer-reviewed
Categories
npj Digital Medicine

Metadata

journal article · primary source

Summary

This 2025 systematic review in npj Digital Medicine examines algorithmic bias in clinical decision instruments (CDIs) across 690 tools used in healthcare. The authors identify four potential sources of bias: skewed participant demographics (73% White, 55% male), geographically concentrated investigator teams (52% North America, 31% Europe), limited use of race/ethnicity as predictor variables (1.9%), and outcome definitions that may introduce socioeconomic bias (26% involve follow-up). The study highlights an equity dilemma where CDIs standardize best practices but may perpetuate existing healthcare disparities, recommending that developers and clinicians carefully consider these bias sources during instrument development and implementation.

Cited by 1 page

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 47 KB
Potential for Algorithmic Bias in Clinical Decision Instrument Development | npj Digital Medicine
Subjects

Diagnosis
Outcomes research
Predictive medicine
Clinical trial design

 Abstract

Clinical decision instruments (CDIs) face an equity dilemma. They reduce disparities in patient care through data-driven standardization of best practices. However, this standardization may perpetuate bias and inequality within healthcare systems. We perform a quantitative, systematic review to characterize four potential sources of bias in the development of 690 CDIs. We find evidence for potential algorithmic bias in CDI development through various analyses: self-reported participant demographics are skewed—e.g. 73% of participants are White, 55% are male; investigator teams are geographically skewed—e.g. 52% in North America, 31% in Europe; CDIs use predictor variables that may be prone to bias—e.g. 1.9% (13/690) of CDIs use Race and Ethnicity; outcome definitions may introduce bias—e.g. 26% (177/690) of CDIs involve follow-up, which may skew representation based on socioeconomic status. As CDIs become increasingly prominent in medicine, we recommend that these factors are considered during development and clearly conveyed to clinicians.
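The abstract's percentages all use the full sample of 690 CDIs as the denominator. A minimal sketch checking that the reported counts and percentages agree, assuming the counts (13 and 177) from the abstract; note 177/690 is 25.7%, which the abstract rounds to 26%:

```python
# Sanity-check the proportions reported in the abstract.
# Counts are taken from the paper; the denominator is assumed to be
# the full sample of 690 CDIs for every figure.
TOTAL_CDIS = 690

reported = {
    "use Race and Ethnicity as a predictor": (13, 1.9),
    "involve follow-up in the outcome definition": (177, 26.0),
}

for label, (count, claimed_pct) in reported.items():
    pct = 100 * count / TOTAL_CDIS
    # The paper reports percentages rounded to the shown precision.
    print(f"{count}/{TOTAL_CDIS} CDIs {label}: {pct:.1f}% (reported {claimed_pct}%)")
```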

 

 
 
 

 
 
 
 
 
 Introduction

 Clinical decision i

... (truncated, 47 KB total)
Resource ID: e206c777c8b622b5 | Stable ID: MDBmMDBhYW