Longterm Wiki

Credibility Rating

High (4/5)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: Epoch AI

Relevant to AI governance and compute governance discussions; provides empirical grounding for debates about resource constraints on frontier AI development and the feasibility of compute-based regulatory thresholds.

Metadata

Importance: 62/100 · blog post · analysis

Summary

Epoch AI analyzes the rapidly growing electricity demands of training frontier AI models, examining trends in power consumption, infrastructure constraints, and implications for AI development trajectories. The analysis quantifies how compute scaling translates into energy requirements and identifies key bottlenecks in power availability that may shape the pace of AI progress.

Key Points

  • Frontier AI training runs are consuming increasing amounts of electricity, with the largest current runs exceeding 100 MW and projections reaching multiple gigawatts by 2030.
  • Power infrastructure availability is emerging as a key constraint on AI scaling, potentially limiting how quickly compute can be expanded.
  • The analysis traces historical trends in training power consumption and projects future requirements under continued scaling.
  • Energy demand growth from AI may strain grid infrastructure and influence decisions about data center siting and energy sourcing.
  • Power constraints could affect the competitive landscape for AI development, favoring actors with greater access to energy infrastructure.

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| AI Capability Threshold Model | Analysis | 72.0 |

Cached Content Preview

HTTP 200 · Fetched Feb 26, 2026 · 14 KB



Source: [How much power will frontier AI training demand in 2030?](https://epoch.ai/blog/power-demands-of-frontier-ai-training) (Epoch AI [blog](https://epoch.ai/blog))

# How much power will frontier AI training demand in 2030?

The power required to train the largest frontier models is growing by more than 2x per year, and is on trend to reach multiple gigawatts by 2030.

![](https://epoch.ai/assets/images/posts/2025/power-demands-of-frontier-ai-training/projected-power-growth.png)



### Published

Aug 11, 2025

### Authors

Josh You,
David Owen

### Resources

[Source Code](https://colab.research.google.com/drive/1uF9jAWP_lnNIe7nd5_0hjgpFzE2_jKDw?usp=sharing) · [Paper](https://www.epri.com/research/products/000000003002033669)

The electrical power required to train individual frontier AI models has been growing rapidly over time, driven by the growth in total training compute and the size of training clusters. Previously, we found that the power required to train a frontier model has been [more than doubling](https://epoch.ai/data-insights/power-usage-trend) every year. If trends continue, how high could these power demands become?

In a new [white paper](https://www.epri.com/research/products/000000003002033669), “Scaling Intelligence: The Exponential Growth of AI’s Power Needs”, written in collaboration with [EPRI](https://www.epri.com/), we analyze the factors driving power growth for frontier training, and forecast this growth out to 2030. We conclude that the largest individual frontier training runs in 2030 **will likely draw 4-16 gigawatts (GW) of power, or enough to power millions of US homes**.

## Forecasting power demands using model training compute

Power demands for frontier training runs have historically grown at a rate of 2.2x per year, with the largest runs now exceeding 100 MW. This has primarily been driven by frontier training compute, which has been growing at [4-5x](https://epoch.ai/blog/training-compute-of-frontier-ai-models-grows-by-4-5x-per-year) per year.
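The trend above can be sketched as a simple exponential extrapolation. This is a minimal illustration, not Epoch's model: it assumes the 2.2x/year growth rate stated in the post and a hypothetical ~100 MW baseline for the largest 2025 training run (the post says the largest runs now exceed 100 MW).

```python
# Sketch: extrapolate frontier-training power demand under the post's
# stated 2.2x/year growth rate. The 100 MW / 2025 baseline is an
# illustrative assumption, not a figure from the paper.
GROWTH_PER_YEAR = 2.2   # historical growth rate reported in the post
BASE_MW = 100.0         # assumed largest training run today (~100+ MW)
BASE_YEAR = 2025

def projected_power_mw(year: int) -> float:
    """Projected power (MW) of the largest frontier training run in `year`."""
    return BASE_MW * GROWTH_PER_YEAR ** (year - BASE_YEAR)

for year in range(2025, 2031):
    print(f"{year}: {projected_power_mw(year) / 1000:.1f} GW")
```

Under these assumptions the 2030 figure comes out around 5 GW, consistent with the 4-16 GW range the white paper forecasts once uncertainty in growth rates and baselines is included.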

However, translating this compute growth trend into power demand requires dividing the compute growth rate by growth rates in two m

... (truncated, 14 KB total)
Resource ID: 95b25b23b19320df | Stable ID: ZWEzZjhiYm