Longterm Wiki

Epoch AI Brief, October 2025 (https://epochai.substack.com/p/the-epoch-ai-brief-october-2025)

blog

Credibility Rating

2/5 (Mixed)

Mixed quality. Some useful content but inconsistent editorial standards. Claims should be verified.

Rating inherited from publication venue: Substack

A monthly research digest from Epoch AI, a nonprofit focused on empirical AI trends; useful for tracking near-term developments in compute, benchmarking methodology, and frontier model capabilities relevant to AI safety forecasting.

Metadata

Importance: 42/100 | blog post | news

Summary

The October 2025 Epoch AI Brief summarizes Epoch AI's research findings on decentralized training feasibility (10 GW runs across distributed sites), the launch of the Epoch Capabilities Index (ECI) as a unified benchmark-aggregation metric, FrontierMath evaluations of leading LLMs, and analysis of OpenAI revenue trends.

Key Points

  • Decentralized training across geographically distributed sites spanning thousands of kilometers is technically feasible for 10 GW runs and could ease power bottlenecks.
  • Epoch Capabilities Index (ECI) launched as a unified metric aggregating dozens of AI benchmarks to track long-term capability progress and reduce benchmark saturation effects.
  • FrontierMath evaluations of leading LLMs provide updated data on frontier model mathematical reasoning capabilities.
  • OpenAI revenue growth analysis offers context for understanding the commercial trajectory of leading AI labs.
  • The brief represents Epoch AI's ongoing effort to provide empirical, data-driven monitoring of AI progress for researchers and policymakers.

Cited by 1 page

Page | Type | Quality
Capability-Alignment Race Model | Analysis | 62.0

Cached Content Preview

HTTP 200 | Fetched Feb 23, 2026 | 10 KB
The Epoch AI Brief - October 2025

 Report on decentralized training, new Epoch Capabilities Index for tracking AI progress, FrontierMath evaluations of leading models, revenue insights on OpenAI, and hiring for two open positions.

Epoch AI & various writers, Oct 31, 2025

Hi! In this edition of the Epoch AI brief:

We published a report analyzing whether decentralized training could help solve power bottlenecks, and found that 10 GW training runs across thousands of kilometers are feasible.

We launched the Epoch Capabilities Index (ECI), a new unified metric that combines scores from dozens of AI benchmarks into a single “general capability” scale to track long-term AI progress trends (see the illustrative aggregation sketch after this list).

We’ve published four new Data Insights covering open-weight vs SotA models on the ECI, OpenAI’s rapid revenue growth, how OpenAI spends its compute, and steady AI capability improvements.

We’ve published four new Gradient Updates, including analysis of FrontierMath difficulty bounds, OpenAI’s unprecedented revenue projections, and potential deployment of digital workers.

We benchmarked leading LLMs in their most compute-intensive settings on FrontierMath and released interviews with the benchmark’s contributors.

 We’re hiring a new Researcher for our Data Team to accelerate our work studying the future of AI, and a Lead Editor to help communicate our work. 
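The brief does not spell out ECI's statistical model, so as a loose illustration of the general idea of benchmark aggregation (explicitly not ECI's actual method), here is a minimal sketch that z-scores each benchmark across models and averages a model's z-scores into one number. The model names, benchmark names, and scores are all hypothetical:

```python
# Minimal sketch of aggregating heterogeneous benchmark scores into a
# single capability scale. Illustrative only; NOT the ECI methodology.
import math

# Hypothetical scores: model -> {benchmark: accuracy}. Benchmarks may
# be missing for some models, as on real leaderboards.
scores = {
    "model_a": {"bench_1": 0.82, "bench_2": 0.41, "bench_3": 0.67},
    "model_b": {"bench_1": 0.75, "bench_2": 0.38},
    "model_c": {"bench_1": 0.91, "bench_2": 0.55, "bench_3": 0.80},
}

def zscore_index(scores):
    # Per-benchmark mean/std, computed over the models that ran it.
    benches = {b for s in scores.values() for b in s}
    stats = {}
    for b in benches:
        vals = [s[b] for s in scores.values() if b in s]
        mean = sum(vals) / len(vals)
        std = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals)) or 1.0
        stats[b] = (mean, std)
    # A model's index is its mean z-score over the benchmarks it ran.
    return {
        m: sum((v - stats[b][0]) / stats[b][1] for b, v in s.items()) / len(s)
        for m, s in scores.items()
    }

print(zscore_index(scores))
```

A naive average like this still inherits each benchmark's saturation: once frontier models all score near a benchmark's ceiling, it stops separating them, which is part of the motivation the brief gives for a unified long-term index.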

Publications & Announcements

 Could decentralized training solve AI’s power problem? 

Conventional wisdom in AI holds that large-scale pretraining needs to happen in massive, contiguous datacenter campuses. But is this true? Our research suggests that conducting 10 GW training runs across 23 sites, linked by a network spanning 4,800 km, is feasible and could help alleviate power bottlenecks.

 While this approach requires substantial network bandwidth—over 25x that of the highest-capacity transatlantic fiber cable for training a model with 72 trillion parameters—the incremental cost is manageable at an estimated 0.5% of datacenter construction costs.
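To make the shape of such a bandwidth estimate concrete, here is a back-of-envelope sketch. Only the 72-trillion-parameter figure comes from the report; the precision, step time, and communication-overlap fraction are illustrative assumptions, and the output is not Epoch's calculation:

```python
# Back-of-envelope cross-site bandwidth for synchronous distributed
# training. All numbers except the 72T parameter count are assumptions.

N_PARAMS = 72e12       # model size cited in the report
BYTES_PER_VALUE = 2    # assume gradients exchanged at 16-bit precision
STEP_TIME_S = 30.0     # assumed optimizer-step time (hypothetical)
COMM_FRACTION = 0.5    # assume communication can use half of each step

# Simplest proxy: each step, roughly one full gradient copy must cross
# the inter-site links. Real all-reduce traffic depends heavily on the
# parallelism scheme, so treat this as order-of-magnitude only.
bits_per_step = N_PARAMS * BYTES_PER_VALUE * 8
window_s = STEP_TIME_S * COMM_FRACTION

required_tbps = bits_per_step / window_s / 1e12
print(f"~{required_tbps:.0f} Tbit/s of cross-site bandwidth")  # ~77 Tbit/s
```

The report's headline figure (more than 25x a top transatlantic cable) follows from its own parallelism and timing assumptions, which this toy does not reproduce; the point is only that required bandwidth scales linearly with model size and inversely with the time available to synchronize each step.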

The bottom line is that conducting large decentralized training runs is perfectly possible without a large increase in either training time or budget. However, distributed clusters have many downsides, and we expect that AI companies will prefer to scale AI campuses as much as they can, resorting to distributed clusters only to go beyond the scale that utilities are willing to provide through the grid.

Find detailed analysis, calculations, and sources in the full article.

... (truncated, 10 KB total)
Resource ID: f23e98169a4b7257 | Stable ID: ZWZkZDk3Yz