Daniel Kokotajlo reveals ~50% AGI safety staff departed
Credibility Rating
Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: Fortune
Part of a broader pattern of safety-focused departures from OpenAI in 2024, relevant to debates about whether leading AI labs can maintain genuine safety cultures under competitive and commercial pressures.
Metadata
Summary
Fortune reports on Daniel Kokotajlo's revelations that approximately 50% of OpenAI's AGI safety-focused researchers have departed the organization. The exodus raises serious concerns about OpenAI's commitment to safety as it accelerates toward AGI development, with departing researchers citing misalignment between stated safety priorities and actual organizational behavior.
Key Points
- Approximately 50% of staff working on AGI safety at OpenAI have departed, according to Daniel Kokotajlo's account
- Kokotajlo himself left OpenAI earlier in 2024, forfeiting equity, citing concerns about the company's safety culture
- The departures suggest a systemic tension between OpenAI's public safety commitments and internal resource allocation decisions
- The brain drain raises questions about whether OpenAI retains sufficient safety expertise to responsibly develop frontier AI systems
- This follows high-profile exits including Ilya Sutskever and other members of the superalignment team
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Corporate Influence on AI Policy | Crux | 66.0 |
Cached Content Preview
# Exodus at OpenAI: Nearly half of AGI safety staffers have left, says former researcher

By [Sharon Goldman](https://fortune.com/author/sharon-goldman/), AI Reporter
August 26, 2024, 4:00 PM ET
OpenAI CEO Sam Altman. Justin Sullivan—Getty Images
Nearly half the OpenAI staff that once focused on the long-term risks of superpowerful AI have left the company in the past several months, according to Daniel Kokotajlo, a former OpenAI governance researcher.
OpenAI, the maker of AI assistant ChatGPT, is widely regarded as one of a handful of companies in the vanguard of AI development. Its mission, according to the company’s founding charter, is to develop a technology known as artificial general intelligence, or AGI, in a way that “benefits all of humanity.” OpenAI defines AGI as highly autonomous systems that outperform humans at most economically valuable work.
Because such systems might pose significant risks including, according to some AI researchers, the possibility that they would escape human control and pose an existential threat to all of humanity, OpenAI has employed since its founding a large number of researchers focused on what is known as “AGI safety”—techniques for ensuring that a future AGI system does not pose catastrophic or even existential danger.
It’s this group of researchers whose ranks Kokotajlo says have been decimated by recent resignations. The departures include Jan Hendrik Kirchner, Collin Burns, Jeffrey Wu, Jonathan Uesato, Steven Bills, Yuri Burda, Todor Markov, and cofounder John Schulman. Their exits followed the high-profile resignations in May of chief scientist Ilya Sutskever and Jan Leike, another researcher, who together co-headed what the company called its “superalignment” team. In announcing his resignation on the social media platform [X]
... (truncated, 15 KB total)