Longterm Wiki

Superintelligence - Wikipedia

reference

Credibility Rating

3/5
Good (3)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Wikipedia

A useful introductory reference for those new to the concept of superintelligence; synthesizes mainstream definitions and debates but should be supplemented with primary sources like Bostrom's book for deeper analysis.

Metadata

Importance: 52/100 · wiki page · reference

Summary

A comprehensive Wikipedia overview of the concept of superintelligence, covering definitions, proposed forms (speed, collective, quality), potential risks, and the broader debate around artificial general intelligence surpassing human cognitive abilities. It synthesizes perspectives from key thinkers including Bostrom, Good, and others on the implications and timelines of superintelligent AI.

Key Points

  • Defines superintelligence as an intellect that greatly exceeds human cognitive performance in virtually all domains of interest.
  • Distinguishes multiple forms: speed superintelligence, collective superintelligence, and quality superintelligence.
  • Discusses the 'intelligence explosion' concept (I.J. Good) and recursive self-improvement as a path to superintelligence.
  • Covers major existential risk concerns including control problems, misaligned goals, and potential dangers from uncontrolled superintelligent systems.
  • Summarizes debates around timelines, feasibility, and the range of expert opinions on when or whether superintelligence may emerge.

Cited by 1 page

| Page | Type | Quality |
|---|---|---|
| Superintelligence | Concept | 92.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 57 KB
# Superintelligence


Hypothetical agent surpassing human intelligence

For the book by Nick Bostrom, see [Superintelligence: Paths, Dangers, Strategies](https://en.wikipedia.org/wiki/Superintelligence:_Paths,_Dangers,_Strategies "Superintelligence: Paths, Dangers, Strategies"). For the 2020 film, see [Superintelligence (film)](https://en.wikipedia.org/wiki/Superintelligence_(film) "Superintelligence (film)").

A **superintelligence** is a hypothetical [agent](https://en.wikipedia.org/wiki/Intelligent_agent "Intelligent agent") that possesses [intelligence](https://en.wikipedia.org/wiki/Intelligence "Intelligence") surpassing that of the most [gifted](https://en.wikipedia.org/wiki/Intellectual_giftedness "Intellectual giftedness") [human](https://en.wikipedia.org/wiki/Human "Human") minds.[\[1\]](https://en.wikipedia.org/wiki/Superintelligence#cite_note-1) Philosopher [Nick Bostrom](https://en.wikipedia.org/wiki/Nick_Bostrom "Nick Bostrom") defines _superintelligence_ as "any [intellect](https://en.wikipedia.org/wiki/Intellect "Intellect") that greatly exceeds the cognitive performance of humans in virtually all domains of interest".[\[2\]](https://en.wikipedia.org/wiki/Superintelligence#cite_note-FOOTNOTEBostrom2014Chapter_2-2)

Technological researchers disagree about how likely present-day [human intelligence](https://en.wikipedia.org/wiki/Human_intelligence "Human intelligence") is to be surpassed. Some argue that advances in [artificial intelligence](https://en.wikipedia.org/wiki/Artificial_intelligence "Artificial intelligence") (AI) will probably result in general reasoning systems that lack human cognitive limitations. Others believe that humans will evolve or directly modify their biology to achieve radically greater intelligence.[\[3\]](https://en.wikipedia.org/wiki/Superintelligence#cite_note-3)[\[4\]](https://en.wikipedia.org/wiki/Superintelligence#cite_note-4) Several [futures studies](https://en.wikipedia.org/wiki/Futures_studies "Futures studies") scenarios combine elements from both of these possibilities, suggesting that humans are likely to [interface with computers](https://en.wikipedia.org/wiki/Brain%E2%80%93computer_interface "Brain–computer interface"), or [upload their minds to computers](https://en.wikipedia.org/wiki/Mind_uploading "Mind uploading"), in a way that enables substantial [intelligence amplification](https://en.wikipedia.org/wiki/Intelligence_amplification "Intelligence amplification"). The hypothetical creation of the first superintelligence may or may not result from an [intelligence explosion](https://en.wikipedia.org/wiki/Technological_singularity#Intelligence_explosion "Technological singularity") or a [technological singularity](https://en.wikipedia.org/wiki/Technological_singularity "Technological singularity").

Some researchers believe that superintelligence will likely follow shortly after the development of [artificial general intelligence](https://en.wikipedia.org/wiki/Artificial_gene

... (truncated, 57 KB total)
Resource ID: 15d16c5cf0f769e6 | Stable ID: NjdjMTYzOD