Longterm Wiki

NIH PMC: Hallucination Terminology

paper

Credibility Rating

4/5
High (4)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: PubMed Central

A terminological critique relevant to AI safety researchers concerned with precise characterization of LLM failure modes; useful background for discussions on evaluation standards and avoiding anthropomorphization of AI systems.

Metadata

Importance: 38/100 · journal article · commentary

Summary

This editorial argues that the term 'hallucination' is an imprecise and misleading metaphor when applied to false outputs from AI language models, since AI systems lack the sensory perception that defines clinical hallucinations. The authors contend that borrowing medical terminology obscures the computational mechanisms behind AI errors and call for more precise, technically accurate vocabulary to describe how and why models produce unjustified or false outputs.

Key Points

  • The term 'hallucination' is a medical/psychiatric concept tied to sensory perception, making it a poor metaphor for AI systems that lack any perceptual apparatus.
  • Using imprecise borrowed terminology risks conflating distinct phenomena, creating confusion among researchers, clinicians, and the public.
  • More precise terminology would better characterize the underlying computational mechanisms causing AI models to generate false or unjustified outputs.
  • The framing of AI errors matters for AI safety and evaluation: vague terminology can obscure root causes and hinder mitigation efforts.
  • This debate reflects broader concerns about anthropomorphizing AI systems in ways that may mislead understanding of their capabilities and failure modes.

Cited by 1 page

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 16 KB
False Responses From Artificial Intelligence Models Are Not Hallucinations - PMC
Editorial. Schizophr Bull. 2023 May 23;49(5):1105–1107. doi: 10.1093/schbul/sbad068

False Responses From Artificial Intelligence Models Are Not Hallucinations

Søren Dinesen Østergaard (1, 2, ✉) and Kristoffer Laigaard Nielbo (3)

1. Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
2. Department of Affective Disorders, Aarhus University Hospital - Psychiatry, Aarhus, Denmark
3. Department of Culture and Society, Center for Humanities Computing, Aarhus University, Aarhus, Denmark

✉ To whom correspondence should be addressed: Søren D. Østergaard, Department of Affective Disorders, Aarhus University Hospital – Psychiatry, Palle Juul-Jensens Boulevard 175, 8200 Aarhus N. Tel: 45 61282753. E-mail: soeoes@rm.dk

Collection date: 2023 Sep.

© The Author(s) 2023. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com. This article is published and distributed under the terms of the Oxford University Press Standard Journals Publication Model (https://academic.oup.com/pages/standard-publication-reuse-rights).

PMCID: PMC
... (truncated, 16 KB total)
Resource ID: fdbf8855dee63d77 | Stable ID: NTExYzBlMG