Longterm Wiki

OpenAI's compute costs

web

A trade-press article useful for tracking industry perspectives on compute scaling trends; relevant to capability forecasting and resource investment discussions in AI safety contexts.

Metadata

Importance: 38/100 | news article | news

Summary

This article argues that AI model scaling has not hit a wall but is evolving beyond simple parameter and data scaling into new paradigms such as inference-time compute, synthetic data, and multimodal training. It discusses OpenAI's substantial compute costs as evidence of continued scaling investment. The piece frames scaling as entering a qualitatively different phase rather than plateauing.

Key Points

  • Traditional pre-training scaling (more parameters + more data) is facing diminishing returns, prompting a shift to new scaling approaches.
  • Inference-time compute scaling (e.g., chain-of-thought, test-time search) is emerging as a major new axis for improving model performance.
  • OpenAI's compute costs remain massive, signaling industry confidence that scaling investments still yield capability gains.
  • Synthetic data generation is becoming a key tool to overcome data scarcity limitations in continued scaling.
  • The narrative of 'scaling is dead' is challenged; instead, the field is diversifying how and where compute is applied.
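The inference-time compute point above can be illustrated with a minimal self-consistency sketch: draw several stochastic answers and take the majority vote, so that more samples (more compute at inference) buys more reliability. The stub `sample_answer` below is a hypothetical stand-in for a model call, not any method described in the article.

```python
import random
from collections import Counter

def sample_answer(question: str, rng: random.Random) -> str:
    """Stand-in for one stochastic model call (hypothetical stub).
    A real system would sample a chain-of-thought completion here."""
    # Simulate a noisy solver that returns the right answer ~70% of the time.
    return "42" if rng.random() < 0.7 else str(rng.randint(0, 9))

def self_consistency(question: str, n_samples: int, seed: int = 0) -> str:
    """Spend more inference-time compute by drawing n_samples answers
    and returning the majority-vote answer."""
    rng = random.Random(seed)
    votes = Counter(sample_answer(question, rng) for _ in range(n_samples))
    return votes.most_common(1)[0][0]

# A single sample inherits the solver's error rate; majority voting over
# many samples is far more reliable.
print(self_consistency("What is 6 x 7?", n_samples=1))
print(self_consistency("What is 6 x 7?", n_samples=25))
```

This is one simple form of test-time scaling; production systems use richer variants such as tree search or learned verifiers over candidate answers.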

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| The Case For AI Existential Risk | Argument | 66.0 |

Cached Content Preview

HTTP 200 | Fetched Mar 20, 2026 | 31 KB
- [Language models](https://aibusiness.com/nlp/language-models)
- [Generative AI](https://aibusiness.com/generative-ai)
- [ML](https://aibusiness.com/ml)
- [Commentary](https://aibusiness.com/latest-commentary)

# AI Model Scaling Isn’t Over: It’s Entering a New Era

Throwing more resources at scaling delivers diminishing returns. Continued progress requires smarter techniques beyond traditional scaling.


[Akash Sharma](https://aibusiness.com/author/akash-sharma), CEO, Vellum

January 21, 2025

11 Min Read



Over the past two years, model intelligence has advanced rapidly, but heading into 2025 we're seeing signs of diminishing returns.

The question on everyone’s mind: Is scaling model intelligence reaching its practical limits?

In a narrow sense—if you’re just looking at increasing model size, compute, or dataset size—it might seem like progress has stalled.

But new methods are opening up exciting possibilities.

It seems the road to artificial general intelligence (AGI) hasn’t hit a wall—it’s just under construction.

Or as machine learning expert Ilya Sutskever said in a recent interview: “The 2010s were the age of scaling, now we're back in the age of wonder and discovery once again. Everyone is looking for the next thing. Scaling the right thing matters more now than ever.”

Today, everyone is discussing neural scaling laws, whether models are reaching their limits, and what new methods might unlock fresh horizons. To tackle these questions, this article breaks down neural scaling laws, their limits, and the next frontier.


... (truncated, 31 KB total)
Resource ID: 9ce35082bc3ab2d4 | Stable ID: OGRkYzRhYj