Longterm Wiki
All Source Checks
Fact

OpenAI — Model Parameters: 175 billion

GPT-3

Confirmed — 99% confidence

1 evidence check

Last checked: 3/31/2026

The source text explicitly states in the abstract that GPT-3 has "175 billion parameters". The paper was submitted on 28 May 2020, with a final revision on 22 Jul 2020, which aligns with the claimed timeframe of 2020-06. The claim's parameter count of 175B matches the source's statement of "175 billion parameters" exactly.

Evidence — 1 source, 1 check

Confirmed · 99% · primary · Haiku 4.5 · 3/31/2026
Found: GPT-3, an autoregressive language model with 175 billion parameters


Debug info

Record type: fact

Record ID: f_ZencK2XFDA

Source Check: GPT-3 | Longterm Wiki