Longterm Wiki

NIST News: There's More to AI Bias Than Biased Data

government

Credibility Rating

5/5 (Gold)

Gold standard. Rigorous peer review, high editorial standards, and strong institutional reputation.

Rating inherited from publication venue: NIST

This NIST news article summarizes a key government report relevant to AI fairness and governance; useful for understanding official US standards-body thinking on bias as a sociotechnical rather than purely technical problem.

Metadata

Importance: 52/100 · press release · news

Summary

A NIST report argues that AI bias cannot be addressed through technical fixes alone, as it also stems from human and systemic biases embedded in societal structures, institutional contexts, and deployment practices. The publication 'Towards a Standard for Identifying and Managing Bias in Artificial Intelligence' broadens the scope of bias identification beyond training data to include social and organizational factors. This work contributes to NIST's broader AI trustworthiness and responsible AI development agenda.

Key Points

  • AI bias originates from multiple sources beyond biased training data, including human cognitive biases and systemic societal biases.
  • Technical solutions alone are insufficient; bias management must account for institutional and deployment contexts.
  • NIST recommends a wider, multi-layered framework for identifying and managing bias across the AI lifecycle.
  • The report supports NIST's broader mission to develop standards for trustworthy and responsible AI systems.
  • Recognizing sociotechnical dimensions of bias is essential for effective AI governance and fairness.

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| NIST and AI Safety | Organization | 63.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 9 KB



https://www.nist.gov/news-events/news/2022/03/theres-more-ai-bias-biased-data-nist-report-highlights



# There’s More to AI Bias Than Biased Data, NIST Report Highlights

### Rooting out bias in artificial intelligence will require addressing human and systemic biases as well.

March 16, 2022


![An iceberg is shown, labeled with technical biases above the water's surface and with human biases and systemic biases underwater. ](https://www.nist.gov/sites/default/files/images/2022/03/14/22ITL003_risk-ai-final.jpg)

Bias in AI systems is often seen as a technical problem, but the NIST report acknowledges that a great deal of AI bias stems from human biases and systemic, institutional biases as well.

Credit: N. Hanacek/NIST



As a step toward improving our ability to identify and manage the harmful effects of bias in artificial intelligence (AI) systems, researchers at the National Institute of Standards and Technology (NIST) recommend widening the scope of where we look for the source of these biases — beyond the machine learning processes and data used to train AI software to the 

... (truncated, 9 KB total)
Resource ID: e621922bd1b767c1 | Stable ID: MDY4ZmU3M2