Transition Turbulence
Transition Turbulence measures how much disruption occurs as society navigates from current AI to whatever future emerges. Unlike Ultimate Scenarios (the intermediate pathways connecting root factors to ultimate outcomes: AI Takeover, Human-Caused Catastrophe, and Long-term Lock-in), which describe what happens, Transition Turbulence describes how rough the journey is. That roughness affects both whether we survive (Existential Catastrophe) and what world we end up in (Long-term Trajectory).
Even if we ultimately reach a good destination, a turbulent transition causes real suffering along the way. Economic displacement, political instability, and social fragmentation during the transition matter independently of the final outcome.
Why a Root Factor?
Transition Turbulence is a background condition that affects Ultimate Scenarios and Ultimate Outcomes, not an endpoint in itself. High turbulence can trigger acute catastrophes (political collapse → loss of control) and constrain long-term trajectory (path dependence, destroyed institutions).
Polarity
Inherently negative. High turbulence is always worse than low turbulence, all else equal. There's no "good" version of extreme disruption—the question is how much turbulence we experience, not whether turbulence is desirable.
| Level | Description |
|---|---|
| Low turbulence | Smooth adaptation, minimal disruption, institutions keep pace |
| Moderate turbulence | Significant disruption but recoverable, adaptation strained |
| High turbulence | Severe instability, cascading failures, widespread suffering |
| Catastrophic turbulence | System breakdown triggers existential catastrophes or permanent damage |
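The four levels above can be read as bands on a single disruption scale. A minimal sketch, assuming a hypothetical 0-1 turbulence score with illustrative cut points (neither the score nor the thresholds is defined by the model itself):

```python
def turbulence_level(score: float) -> str:
    """Map a hypothetical 0-1 disruption score to the four bands above.

    The thresholds are illustrative assumptions, not part of the model.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be in [0, 1]")
    if score < 0.25:
        return "low"           # smooth adaptation, institutions keep pace
    if score < 0.5:
        return "moderate"      # significant but recoverable disruption
    if score < 0.75:
        return "high"          # severe instability, cascading failures
    return "catastrophic"      # system breakdown, permanent damage
```

Any monotone mapping would do; the point is only that the levels are ordered bands, not distinct kinds of event.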
How This Happens
Turbulence Drivers
1. Rapid Capability Growth: AI capabilities advance faster than institutions, labor markets, and social norms can adapt. The faster the change, the more turbulence.
2. Economic Displacement: AI automation displaces workers faster than new roles emerge. Mass unemployment creates economic and political instability.
3. Racing Dynamics: Competition between labs and nations creates pressure to deploy before adequate safety testing, increasing both capability speed and coordination failures.
4. Coordination Failures: Governments, labs, and international bodies fail to coordinate on standards, safety requirements, and transition support.
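One way to read the four drivers together: turbulence scales with the gap between capability growth and institutional adaptation, amplified by racing pressure and dampened by coordination. A toy sketch, where the variables, coefficients, and functional form are all illustrative assumptions rather than anything the model specifies:

```python
def turbulence_score(capability_growth: float,
                     adaptation_rate: float,
                     racing_pressure: float,
                     coordination: float) -> float:
    """Toy driver model; all inputs in [0, 1], output clipped to [0, 1].

    - gap:       drivers 1 and 2 (change outpacing adaptation)
    - amplifier: driver 3 (racing dynamics speed things up)
    - damper:    driver 4 inverted (coordination absorbs shocks)
    """
    gap = max(0.0, capability_growth - adaptation_rate)
    amplifier = 1.0 + racing_pressure
    damper = 1.0 - 0.5 * coordination
    return min(1.0, gap * amplifier * damper)
```

Even this crude form captures the section's core claim: when capabilities outrun adaptation and racing is high, the score saturates; when adaptation keeps pace, turbulence stays near zero regardless of absolute capability level.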
Turbulence Manifestations
| Domain | Low Turbulence | High Turbulence |
|---|---|---|
| Economic | Gradual workforce transition, safety nets absorb displacement | Mass unemployment, inequality spikes, market instability |
| Political | Democracies adapt, regulation keeps pace | Authoritarian backlash, polarization, institutional collapse |
| Social | Trust maintained, communities adapt | Fragmentation, loss of shared reality, civil unrest |
| Institutional | Regulators understand AI, governance effective | Governance captured or overwhelmed, rule of law erodes |
Which Ultimate Outcomes It Affects
Existential Catastrophe (Primary)
High turbulence can trigger acute catastrophes:
- Political collapse → Loss of control over AI development
- Racing acceleration → Deployment before adequate safety
- Institutional breakdown → No capacity to respond to emerging threats
- Social unrest → Desperate measures, authoritarian responses
A rough enough transition can cause the catastrophe, even if AI itself isn't misaligned.
Long-term Trajectory (Primary)
Turbulence shapes what futures are reachable through path dependence:
- Destroyed institutions are hard to rebuild
- Lost trust takes generations to restore
- Authoritarian responses to chaos tend to entrench
- Economic disruption locks in inequality
- Options foreclosed during crisis rarely reopen
Even if acute catastrophe is avoided, high turbulence constrains the achievable long-term trajectory.
Relationship to Ultimate Scenarios
| Ultimate Scenario | Relationship |
|---|---|
| AI Takeover (AI gains decisive control over human affairs, rapidly or gradually) | Turbulence increases risk of loss of control |
| Human-Caused Catastrophe (humans use AI to cause mass harm, via state or rogue actors) | Turbulence can trigger state failures and desperate actions |
| Long-term Lock-in (permanent entrenchment of values, power structures, or epistemic conditions) | Turbulent periods often lock in emergency measures |
Warning Signs
Indicators of increasing turbulence:
- Labor market stress: AI-related unemployment rising faster than retraining
- Political polarization: AI becoming partisan issue, backlash movements
- Regulatory lag: Governance clearly behind capability development
- International tension: AI competition framed as zero-sum
- Trust decline: Public trust in institutions/tech companies falling
- Social instability: Protests, strikes, civil unrest related to AI
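These indicators could feed a simple composite dashboard. A hedged sketch in which the indicator names, 0-1 scoring, and equal weighting are all assumptions made for illustration:

```python
# Hypothetical warning-sign dashboard: each indicator is scored from
# 0 (absent) to 1 (severe); the composite is a simple unweighted mean.
INDICATORS = (
    "labor_market_stress",
    "political_polarization",
    "regulatory_lag",
    "international_tension",
    "trust_decline",
    "social_instability",
)

def warning_index(scores: dict[str, float]) -> float:
    """Average the six indicator scores; missing indicators count as 0."""
    return sum(scores.get(name, 0.0) for name in INDICATORS) / len(INDICATORS)
```

In practice the indicators are correlated (regulatory lag tends to travel with trust decline), so a real index would need to account for that rather than averaging them independently.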
Interventions That Reduce Turbulence
Economic:
- Universal basic income or robust safety nets
- Retraining and education programs
- Gradual deployment policies
- Worker transition support
Political:
- Democratic deliberation processes for AI policy
- International coordination mechanisms
- Regulatory capacity building
- Preventing authoritarian capture
Social:
- Maintaining epistemic commons (shared facts)
- Community resilience programs
- Trust-building between tech and public
- Preserving human-human social fabric
Technical:
- Paced deployment (slowing capability rollout)
- Interoperability requirements
- Human-in-the-loop requirements
- Transition period safety measures
Probability Estimates
| Turbulence Level | Assessment |
|---|---|
| Some turbulence | Almost certain—significant disruption is baseline |
| High turbulence | Likely without deliberate intervention |
| Catastrophic turbulence | Possible; depends on speed of capability growth and coordination |
| Low turbulence | Requires active coordination and paced deployment |
Key uncertainty: How fast will transformative capabilities arrive? Faster arrival = more turbulence.