Longterm Wiki

80,000 Hours: Toby Ord on The Precipice

Resource type: web

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: 80,000 Hours

A long-running podcast from the 80,000 Hours career advice organization; widely listened to in the EA and AI safety communities as a source of accessible, substantive conversations with key researchers and thinkers.

Metadata

Importance: 62/100 · podcast episode · educational

Summary

The 80,000 Hours Podcast features in-depth interviews with leading researchers and thinkers on AI safety, existential risk, effective altruism, and related high-impact topics. Episodes span technical AI safety and alignment, governance, superintelligence, AI deception, and emerging risks such as the intersection of AI and nuclear weapons. It serves as an accessible entry point and ongoing reference for the AI safety and EA communities.

Key Points

  • Features long-form interviews with prominent AI safety researchers including Ajeya Cotra, Toby Ord, and MIRI researchers on topics like deceptive alignment and corrigibility.
  • Covers a broad range of topics, from technical alignment (superintelligence, gradual disempowerment) to governance and policy (AI and nuclear deterrence, AI timelines).
  • Includes episodes on adjacent topics like AI welfare research, consciousness, and effective altruism strategy to provide broader context for safety work.
  • Freely available across major podcast platforms and YouTube, making cutting-edge safety discussions widely accessible.
  • One of the most prominent public-facing media channels for communicating AI safety ideas beyond academic circles.

Cited by 3 pages

| Page | Type | Quality |
| --- | --- | --- |
| 80,000 Hours | Organization | 45.0 |
| Bioweapons Risk | Risk | 91.0 |
| Multipolar Trap (AI Development) | Risk | 91.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 12 KB
# The most important conversations about artificial intelligence you won’t hear anywhere else.

Subscribe on:

- [Apple Podcasts](https://podcasts.apple.com/us/podcast/80-000-hours-podcast/id1245002988)
- [YouTube](https://www.youtube.com/playlist?list=PL-BRtcBm4Yj4aKn72p4PjyqHh0ZQFdI1A)
- [Spotify](https://open.spotify.com/show/2WzJwXWBDnn4iZ7odKwDib?si=T3Bboj1YQWGc383Tns0FaA)
- [RSS feed](https://feeds.transistor.fm/80000-hours-podcast)

## Selected highlights

- [Ajeya Cotra: accidentally teaching AI to deceive us](https://80000hours.org/podcast/episodes/ajeya-cotra-accidentally-teaching-ai-to-deceive-us/)
- [Toby Ord: inference scaling and AI governance](https://80000hours.org/podcast/episodes/toby-ord-inference-scaling-ai-governance/)
- [Nate Silver: effective altruism, SBF, and the art of risk](https://80000hours.org/podcast/episodes/nate-silver-effective-altruism-sbf-art-of-risk/)
- [David Chalmers: the nature and ethics of consciousness](https://80000hours.org/podcast/episodes/david-chalmers-nature-ethics-consciousness/)
- [Rachel Glennerster: market shaping and incentives](https://80000hours.org/podcast/episodes/rachel-glennerster-market-shaping-incentives/)
- [Christopher Brown: slavery and abolition](https://80000hours.org/podcast/episodes/christopher-brown-slavery-abolition/)
- [Vitalik Buterin: techno-optimism](https://80000hours.org/podcast/episodes/vitalik-buterin-techno-optimism/)
- [Randy Nesse: evolutionary medicine and psychiatry](https://80000hours.org/podcast/episodes/randy-nesse-evolutionary-medicine-psychiatry/)
- [Sharon Hewitt Rawlette: hedonistic utilitarianism](https://80000hours.org/podcast/episodes/sharon-hewitt-rawlette-hedonistic-utilitarianism/)

[See all episodes](https://80000hours.org/podcast/episodes/)

## Latest episodes

- [Sam Winter-Levy and Nikita Lalwani: AI and nuclear deterrence](https://80000hours.org/podcast/episodes/sam-winter-levy-nikita-lalwani-ai-nuclear-deterrence/)

... (truncated, 12 KB total)
Resource ID: 2656524aca2f08c0 | Stable ID: MTUwNjM5ND