Longterm Wiki

Technological Singularity — Wikipedia

reference

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Wikipedia

A useful introductory reference for understanding the historical and conceptual roots of the technological singularity, which motivates much of the urgency behind AI safety research and concerns about superintelligent systems.

Metadata

Importance: 55/100 · wiki page · reference

Summary

Wikipedia's comprehensive overview of the technological singularity concept, describing the hypothetical future point at which technological growth becomes uncontrollable and irreversible, potentially resulting in unforeseeable changes to human civilization. It covers the history of the concept, key thinkers like Vernor Vinge and Ray Kurzweil, and debates around intelligence explosion and superintelligence.

Key Points

  • The singularity refers to a hypothetical moment when AI or other technology surpasses human intelligence, leading to rapid, unpredictable civilizational change.
  • Key theorists include Vernor Vinge, who popularized the term, and Ray Kurzweil, who predicted the singularity by 2045.
  • The intelligence explosion concept, originally from I.J. Good, posits that a sufficiently advanced AI could recursively self-improve at accelerating rates.
  • Critics argue the singularity is speculative and overly optimistic, while proponents see it as an existential inflection point requiring serious study.
  • The concept is foundational to many AI safety concerns, as a rapid intelligence explosion could outpace human ability to maintain oversight or alignment.

Cited by 1 page

| Page | Type | Quality |
|---|---|---|
| Early Warnings Era | Historical | 31.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 98 KB
# Technological singularity


"The Singularity" redirects here. For other uses, see [Singularity](https://en.wikipedia.org/wiki/Singularity "Singularity").

The **technological singularity**, often simply called **the singularity**,[\[1\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-1) is a [hypothetical](https://en.wikipedia.org/wiki/Hypothetical "Hypothetical") event in which technological growth accelerates beyond human control, producing unpredictable changes in [human civilization](https://en.wikipedia.org/wiki/Human_civilization "Human civilization").[\[2\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-2)[\[3\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-Singularity_hypotheses-3) According to the most popular version of the singularity hypothesis, [I. J. Good](https://en.wikipedia.org/wiki/I._J._Good "I. J. Good")'s [intelligence explosion](https://en.wikipedia.org/wiki/Technological_singularity#Intelligence_explosion) model of 1965, an upgradable [intelligent agent](https://en.wikipedia.org/wiki/Intelligent_agent "Intelligent agent") could eventually enter a [positive feedback loop](https://en.wikipedia.org/wiki/Positive_feedback_loop "Positive feedback loop") of [successive self-improvement](https://en.wikipedia.org/wiki/Recursive_self-improvement "Recursive self-improvement") cycles; more intelligent generations would appear more and more rapidly, causing an explosive increase in intelligence that culminates in a powerful [superintelligence](https://en.wikipedia.org/wiki/Superintelligence "Superintelligence"), far surpassing [human intelligence](https://en.wikipedia.org/wiki/Human_intelligence "Human intelligence").[\[4\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-vinge1993-4)

Some scientists, including [Stephen Hawking](https://en.wikipedia.org/wiki/Stephen_Hawking "Stephen Hawking"), have expressed concern that [artificial superintelligence](https://en.wikipedia.org/wiki/Superintelligence "Superintelligence") could result in [human extinction](https://en.wikipedia.org/wiki/Human_extinction "Human extinction").[\[5\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-5)[\[6\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-6) The consequences of a technological singularity and its potential benefit or harm to the human species have been intensely debated.

Prominent technologists and academics dispute the plausibility of a technological singularity and associated artificial intelligence "explosion", including [Paul Allen](https://en.wikipedia.org/wiki/Paul_Allen "Paul Allen"),[\[7\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-Allen2011-7) [Jeff Hawkins](https://en.wikipedia.org/wiki/Jeff_Hawkins "Jeff Hawkins"),[\[8\]](https://en.wikipedia.org/wiki/Technological_singularity#cite_note-ieee-lumi-8) [John Holland](https://en.wikipedia.org/wiki/John

... (truncated, 98 KB total)
Resource ID: 8e616da1e9e8e30b | Stable ID: NjFiODdiNW