Longterm Wiki

The Precipice: Existential Risk and the Future of Humanity (Ord, 2020)

web
precipice.com

A landmark book by Oxford philosopher Toby Ord (founder of Giving What We Can), widely read in EA and AI safety communities as a foundational text on existential risk prioritization and the moral case for longtermism.

Metadata

Importance: 88/100 · book · primary source

Summary

Toby Ord's *The Precipice* argues that humanity stands at a critical juncture where existential risks, particularly from emerging technologies like AI, could permanently curtail our long-term potential. The book estimates the probabilities of various catastrophic risks, makes the case for prioritizing existential risk reduction, and outlines a research and policy agenda for safeguarding humanity's future.

Key Points

  • Estimates a roughly 1-in-6 chance of existential catastrophe this century, with unaligned AI identified as the greatest single risk factor.
  • Introduces a moral framework emphasizing the astronomical value of humanity's long-term future as justification for prioritizing existential risk reduction.
  • Distinguishes existential risks from other catastrophes by their permanence—the irreversible destruction of humanity's long-term potential—rather than the scale of immediate harm alone.
  • Surveys risks from nuclear war, pandemics, climate change, and emerging technologies, arguing AI poses the most severe near-term threat.
  • Calls for increased investment in existential risk research, international coordination, and governance frameworks to navigate this critical period.

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| AI Safety Research Value Model | Analysis | 60.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 0 KB
[The domain precipice.com is for sale! Click here to learn more.](https://www.precipice.com/_c)

Resource ID: c59350538c51c58e | Stable ID: YTY1OWZiNj