Reducing Hallucinations in AI-Generated Wiki Content - Footnote 5
1 evidence check
Last checked: 4/3/2026
The claim gives a hallucination rate of 2-18% for RAG-based systems with reliable information sources, but the source reports specific rates only for the CIS and Google-based chatbots in this study, not a general range for all RAG-based systems. The claim also gives a rate of 19-40% for conventional approaches, whereas the source states only approximately 40%.
Evidence — 1 source, 1 check