Longterm Wiki · All Source Checks · Citation

AI Timelines - Footnote 18

Partial · 85% confidence

1 evidence check

Last checked: 4/3/2026

The claim states that training compute increased from 2010 to 2024, but the source does not specify that time frame; it says only that training compute is expanding at roughly 4x per year. The claim also hedges where the source is firmer: it says Epoch AI analysis suggests training runs of 2×10²⁹ FLOP "may be feasible" by 2030, whereas the source states such runs "will likely be feasible" by 2030.

Evidence — 1 source, 1 check

Partial · 85% · Haiku 4.5 · 4/3/2026
Found: Training compute for state-of-the-art models increased by a factor of 4–5× per year from 2010 to 2024. <EntityLink id="epoch-ai">Epoch AI</EntityLink> analysis suggests training runs of 2×10²⁹ FLOP may be feasible by 2030.


Debug info

Record type: citation

Record ID: page:ai-timelines:fn18