Longterm Wiki

The Precipice: Existential Risk and the Future of Humanity – Toby Ord


Credibility Rating

High (4/5)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: Future of Humanity Institute

Toby Ord's 'The Precipice' is a foundational book in the existential risk field, widely read by AI safety researchers and policymakers; the FHI page serves as the official book homepage with links to resources and endorsements.

Metadata

Importance: 78/100 | Type: book, primary source

Summary

This page presents Toby Ord's book 'The Precipice,' which argues humanity currently faces unprecedented existential risks, including from advanced AI, and makes a moral case for prioritizing their reduction. Ord provides probability estimates for various catastrophic and existential risks and argues this century is uniquely critical for humanity's long-term future.

Key Points

  • Estimates roughly 1-in-6 overall probability of existential catastrophe this century, with unaligned AI among the highest-risk individual factors (~10%).
  • Introduces the concept of 'existential risk' broadly, encompassing both human extinction and permanent, drastic curtailment of humanity's potential.
  • Makes a philosophical and ethical case that safeguarding humanity's long-run future is among the most important moral priorities of our time.
  • Surveys natural and anthropogenic risks, situating AI alongside bioweapons and nuclear war as key threats deserving serious attention.
  • Published by Oxford philosopher Toby Ord and affiliated with FHI; widely considered a foundational text in the existential risk research community.

Cited by 1 page

Page | Type | Quality
AI Governance and Policy | Crux | 66.0
Resource ID: 781f94a18d149640 | Stable ID: NDk2YjYyMz