Longterm Wiki

Reducing Hallucinations in AI-Generated Wiki Content - Footnote 17

Partial · 90% confidence

1 evidence check

Last checked: 4/3/2026

The claim states 97.9% factual accuracy in human conversations, but the source specifies that this figure applies specifically to conversations about recent topics. The claim also refers to a "multi-stage verification process"; the source does not use this term, though it does describe grounding responses on Wikipedia and retaining only grounded facts.
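The grounding step the source describes — keep only facts supported by retrieved Wikipedia passages, discard the rest — can be sketched as a simple filter. Everything below (the word-overlap heuristic, the function names, the threshold) is an illustrative assumption, not WikiChat's actual implementation.

```python
def is_grounded(fact: str, passages: list[str], threshold: float = 0.6) -> bool:
    """Crude support check: fraction of the fact's words found in one passage."""
    words = set(fact.lower().split())
    for passage in passages:
        passage_words = set(passage.lower().split())
        if words and len(words & passage_words) / len(words) >= threshold:
            return True
    return False

def retain_grounded_facts(facts: list[str], passages: list[str]) -> list[str]:
    """Keep only candidate facts that pass the grounding check."""
    return [f for f in facts if is_grounded(f, passages)]

passages = ["The Eiffel Tower is a wrought-iron lattice tower in Paris, France."]
facts = [
    "The Eiffel Tower is in Paris",        # supported by the passage
    "The Eiffel Tower was built in 1999",  # not supported
]
print(retain_grounded_facts(facts, passages))
# → ['The Eiffel Tower is in Paris']
```

A real system would replace the word-overlap heuristic with a retrieval step plus an entailment or LLM-based support check, but the filtering shape — generate candidate facts, verify each against sources, retain only the grounded ones — is the process the note paraphrases.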

Evidence — 1 source, 1 check

Partial · 90% · Haiku 4.5 · 4/3/2026
Found: Developed by Stanford's Human-Centered AI Institute and published in 2023, WikiChat achieves 97.9% factual accuracy in human conversations by implementing a multi-stage verification process:


Debug info

Record type: citation

Record ID: page:reducing-hallucinations:fn17