All Source Checks
Confirmed · 99% confidence
1 evidence check
Last checked: 3/31/2026
The source's abstract explicitly states that GPT-3 has '175 billion parameters', matching the claim's parameter count of 175B exactly. The paper was submitted on 28 May 2020 with a final revision on 22 Jul 2020, which aligns with the claimed timeframe of 2020-06.
Evidence — 1 source, 1 check
arxiv.org/abs/2005.14165 (1 check)
Confirmed · 99% · primary · Haiku 4.5 · 3/31/2026
Found: GPT-3, an autoregressive language model with 175 billion parameters
Record type: fact
Record ID: f_ZencK2XFDA