Longterm Wiki

Nick Bostrom has argued


A November 2023 UnHerd interview with Nick Bostrom, one of the foundational thinkers on existential risk; an accessible overview of his views on AI-enabled tyranny and extinction risk for general audiences.

Metadata

Importance: 42/100 · opinion piece · commentary

Summary

An interview with Oxford philosopher Nick Bostrom discussing existential risk, AI-enabled surveillance dystopias, and the possibility of human extinction. Bostrom explains how advanced AI could enable permanent global totalitarianism or civilizational collapse, and reflects on how his long-standing concerns about AI have moved from fringe speculation to mainstream debate.

Key Points

  • Existential risk includes not just extinction but permanent lock-in to a radically suboptimal state, such as a global totalitarian surveillance dystopia.
  • Bostrom distinguishes collapse scenarios (potentially recoverable) from true existential catastrophes (indefinitely bad and irreversible).
  • AI's rapid progression from science fiction to near-reality has validated decades of warnings from thinkers like Bostrom.
  • Governments exploiting AI for surveillance is highlighted as a concrete near-term pathway to catastrophic outcomes.
  • The interview situates AI risk within broader concerns about institutional erosion and civilizational instability.

Cited by 1 page

| Page | Type | Quality |
|---|---|---|
| AI Value Lock-in | Risk | 64.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 47 KB


# Nick Bostrom: Will AI lead to tyranny? We are entering an age of existential risk

![Nick Bostrom: Will AI lead to tyranny?](https://unherd.com/wp-content/uploads/2023/11/GettyImages-502680318-1-e1699639463835.jpg?w=640)

How worried should we be? (Tom Pilston for The Washington Post via Getty Images)


* * *

[ai](https://unherd.com/tag/ai/?edition=us) [Artificial intelligence](https://unherd.com/tag/artificial-intelligence/?edition=us) [Science](https://unherd.com/tag/science/?edition=us) [Tech & Data Sector](https://unherd.com/tag/tech-data-sector/?edition=us)

* * *

[![Flo Read](https://unherd.com/wp-content/uploads/2023/01/Flo-Website.jpg?w=96)](https://unherd.com/author/florence-read/?edition=us)

##### [Flo Read](https://unherd.com/author/florence-read/?edition=us "Posts by Flo Read")

###### Nov 12, 2023 · 9 mins

* * *

In the last year, artificial intelligence has progressed from a science-fiction fantasy to an impending reality. We can see its power in everything from online gadgets to whispers of a new, “post-singularity” tech frontier — as well as in renewed fears of an AI takeover.

One intellectual who anticipated these developments decades ago is Nick Bostrom, a Swedish philosopher at Oxford University and director of its Future of Humanity Institute. He joined _UnHerd_’s Florence Read to discuss the AI era, how governments might exploit its power for surveillance, and the possibility of human extinction.

**Florence Read: You’re particularly well-known for your work on “existential risk” — what do you mean by that?**

**Nick Bostrom:** The concept of existential risk refers to ways that the human story could end prematurely. That might mean literal extinction. But it could also mean getting ourselves per

... (truncated, 47 KB total)
Resource ID: 713ad72e6bc4d52a | Stable ID: YmY3OGE2OT