Longterm Wiki

AI Takeover

scenarios-ai-takeover-overview (E670)
Path: /ai-transition-model/scenarios-ai-takeover-overview/
Page Metadata
{
  "id": "scenarios-ai-takeover-overview",
  "numericId": "E670",
  "path": "/ai-transition-model/scenarios-ai-takeover-overview/",
  "filePath": "ai-transition-model/scenarios-ai-takeover-overview.mdx",
  "title": "AI Takeover",
  "quality": null,
  "importance": null,
  "contentFormat": "article",
  "tractability": null,
  "neglectedness": null,
  "uncertainty": null,
  "causalLevel": null,
  "lastUpdated": "2026-01-03",
  "llmSummary": null,
  "structuredSummary": null,
  "description": "Scenarios where AI gains decisive control over human affairs - either rapidly or gradually.",
  "ratings": null,
  "category": "ai-transition-model",
  "subcategory": "scenarios-ai-takeover",
  "clusters": [
    "ai-safety"
  ],
  "metrics": {
    "wordCount": 68,
    "tableCount": 0,
    "diagramCount": 0,
    "internalLinks": 2,
    "externalLinks": 0,
    "footnoteCount": 0,
    "bulletRatio": 0.17,
    "sectionCount": 4,
    "hasOverview": false,
    "structuralScore": 3
  },
  "suggestedQuality": 20,
  "updateFrequency": null,
  "evergreen": true,
  "wordCount": 68,
  "unconvertedLinks": [],
  "unconvertedLinkCount": 0,
  "convertedLinkCount": 0,
  "backlinkCount": 0,
  "redundancy": {
    "maxSimilarity": 19,
    "similarPages": [
      {
        "id": "scenarios-human-catastrophe-overview",
        "title": "Human-Caused Catastrophe",
        "path": "/ai-transition-model/scenarios-human-catastrophe-overview/",
        "similarity": 19
      },
      {
        "id": "scenarios-long-term-lockin-overview",
        "title": "Long-term Lock-in",
        "path": "/ai-transition-model/scenarios-long-term-lockin-overview/",
        "similarity": 14
      },
      {
        "id": "factors-ai-uses-overview",
        "title": "AI Uses",
        "path": "/ai-transition-model/factors-ai-uses-overview/",
        "similarity": 12
      },
      {
        "id": "factors-overview",
        "title": "Root Factors",
        "path": "/ai-transition-model/factors-overview/",
        "similarity": 12
      },
      {
        "id": "factors-civilizational-competence-overview",
        "title": "Civilizational Competence",
        "path": "/ai-transition-model/factors-civilizational-competence-overview/",
        "similarity": 11
      }
    ]
  }
}
Entity Data

No entity found for "scenarios-ai-takeover-overview"

Canonical Facts (0)

No facts for this entity

External Links

No external links

Backlinks (0)

No backlinks

Frontmatter
{
  "numericId": "E670",
  "title": "AI Takeover",
  "description": "Scenarios where AI gains decisive control over human affairs - either rapidly or gradually.",
  "sidebar": {
    "label": "Overview",
    "order": 0
  },
  "lastEdited": "2026-01-03",
  "subcategory": "scenarios-ai-takeover"
}
Raw MDX Source
---
numericId: E670
title: AI Takeover
description: Scenarios where AI gains decisive control over human affairs - either rapidly or gradually.
sidebar:
  label: Overview
  order: 0
lastEdited: "2026-01-03"
subcategory: scenarios-ai-takeover
---
import {DataInfoBox, FactorSubItemsList, FactorRelationshipDiagram, ImpactList, EntityLink} from '@components/wiki';

<DataInfoBox entityId="E15" />

AI Takeover refers to scenarios where AI systems gain decisive control over human affairs, either displacing human decision-making or actively working against human interests. This is one of the primary pathways to <EntityLink id="E130">existential catastrophe</EntityLink>.

## Variants

<FactorSubItemsList factorId="ai-takeover" />

## Key Root Factors

<FactorRelationshipDiagram nodeId="ai-takeover" direction="incoming" />

### Factor Impact Scores

<ImpactList nodeId="ai-takeover" direction="to" />

## Outcomes

AI Takeover scenarios lead to:
- **Existential Catastrophe**: If AI values are misaligned with humanity's interests
- **<EntityLink id="E194">Long-term Trajectory</EntityLink>**: Shapes the character of the post-transition world