Longterm Wiki

Reducing Hallucinations in AI-Generated Wiki Content - Footnote 5

Partial · 90% confidence

1 evidence check

Last checked: 4/3/2026

The claim cites hallucination rates of 2–18% for RAG-based systems with reliable information sources, but the source reports specific rates only for the CIS and Google-based chatbots in this study, not a general range for all RAG-based systems. The claim also gives 19–40% for conventional approaches, whereas the source states approximately 40%.

Evidence — 1 source, 1 check

Partial · 90% · Haiku 4.5 · 4/3/2026
Found: Research demonstrates that RAG-based systems with reliable information sources achieve hallucination rates of 2-18% compared to 39% for conventional models without retrieval grounding. In medical appl…

Note: See the summary above.

Debug info

Record type: citation

Record ID: page:reducing-hallucinations:fn5