Longterm Wiki

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: 80,000 Hours

This interview accompanies Toby Ord's book 'The Precipice' (2020) and serves as an accessible entry point to existential risk thinking, particularly relevant for understanding how AI risk fits within the broader landscape of civilizational-scale risks.

Metadata

Importance: 72/100 · podcast episode · educational

Summary

A comprehensive podcast interview with philosopher Toby Ord discussing his book 'The Precipice', covering quantitative estimates of existential risks from natural and anthropogenic sources including AI, bioweapons, nuclear war, and climate change. Ord argues humanity is at a uniquely dangerous 'hinge of history' and outlines both the moral case for prioritizing existential risk reduction and practical policy recommendations.

Key Points

  • Ord provides specific probability estimates for various existential risks this century, with engineered pandemics and unaligned AI ranked as the highest-probability anthropogenic threats.
  • Natural risks (asteroids, supervolcanoes, stellar threats) are estimated to be much lower than anthropogenic risks, suggesting human-caused dangers dominate the overall risk landscape.
  • The 'hinge of history' concept argues that decisions made now about technology and governance may be uniquely consequential for humanity's long-term trajectory.
  • Climate change is discussed as a risk factor that could amplify other existential risks, rather than as a direct existential risk in most scenarios.
  • The interview covers career and policy recommendations for those wanting to reduce existential risk, connecting abstract philosophy to actionable priorities.
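The relative sizes of these estimates can be made concrete with a short sketch. The specific figures below are not from this interview summary; they are Ord's commonly cited century-scale estimates from The Precipice (Table 6.1), and the combination rule naively assumes the risks are independent, which Ord's own aggregate estimate does not:

```python
# Illustrative per-risk existential probability estimates for this century,
# as commonly cited from Toby Ord's The Precipice (Table 6.1).
# Treat these as rough orders of magnitude, not precise figures.

natural = {
    "asteroid/comet impact": 1 / 1_000_000,
    "supervolcanic eruption": 1 / 10_000,
    "stellar explosion": 1 / 1_000_000_000,
}

anthropogenic = {
    "unaligned AI": 1 / 10,
    "engineered pandemics": 1 / 30,
    "nuclear war": 1 / 1_000,
    "climate change": 1 / 1_000,
}

def combined(probs):
    """Naive combination assuming independent risks:
    P(at least one) = 1 - prod(1 - p_i)."""
    survive_all = 1.0
    for p in probs.values():
        survive_all *= 1.0 - p
    return 1.0 - survive_all

print(f"natural:       {combined(natural):.6f}")
print(f"anthropogenic: {combined(anthropogenic):.6f}")
```

Even under this crude independence assumption, the anthropogenic total (roughly 13%) exceeds the natural total (roughly 0.01%) by three orders of magnitude, which is the point of Ord's natural-versus-anthropogenic comparison in the interview.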

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| Toby Ord | Person | 41.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 98 KB
## On this page:

- [Introduction](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#top)
- [1 Highlights](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#highlights)
- [2 Articles, books, and other media discussed in the show](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#articles-books-and-other-media-discussed-in-the-show)
- [3 Transcript](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#transcript)
  - [3.1 Rob's intro \[00:00:00\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#robs-intro-000000)
  - [3.2 The interview begins \[00:02:15\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#the-interview-begins-000215)
  - [3.3 What Toby learned while writing the book \[00:05:04\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#what-toby-learned-while-writing-the-book-000504)
  - [3.4 Estimates for specific x-risks \[00:08:10\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#estimates-for-specific-x-risks-000810)
  - [3.5 Asteroids and comets \[00:16:52\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#asteroids-and-comets-001652)
  - [3.6 Supervolcanoes \[00:24:27\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#supervolcanoes-002427)
  - [3.7 Threats from space \[00:33:06\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#threats-from-space-003306)
  - [3.8 Estimating total natural risk \[00:36:34\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#estimating-total-natural-risk-003634)
  - [3.9 Distinction between natural and anthropogenic risks \[00:45:42\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#distinction-between-natural-and-anthropogenic-risks-004542)
  - [3.10 Climate change \[00:51:08\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#climate-change-005108)
  - [3.11 Risk factors \[01:10:53\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#risk-factors-011053)
  - [3.12 Biological threats \[01:26:59\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#biological-threats-012659)
  - [3.13 Nuclear war \[01:36:34\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-future-humanity/#nuclear-war-013634)
  - [3.14 Artificial intelligence \[01:48:55\]](https://80000hours.org/podcast/episodes/toby-ord-the-precipice-existential-risk-fu

... (truncated, 98 KB total)
Resource ID: 35cc64aad5b46421 | Stable ID: NTI2MTU2Nj