Longtermism - Wikipedia
Credibility Rating
3/5
Good (3): Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: Wikipedia
A useful introductory reference for understanding the philosophical framework that motivates much of the AI safety and existential risk community's priorities; not a technical source but provides accessible conceptual grounding.
Metadata
Importance: 45/100 · wiki page · reference
Summary
Wikipedia's overview of longtermism, the ethical view that positively influencing the long-term future is a top moral priority. It covers the philosophical foundations, key proponents, criticism, and relationship to existential risk reduction and effective altruism.
Key Points
- Longtermism holds that future people matter morally and that shaping the long-term trajectory of civilization is among the most important tasks.
- Closely associated with philosophers Nick Bostrom and William MacAskill, with organizations such as the Future of Humanity Institute, and with the effective altruism movement.
- Existential risk reduction—including risks from advanced AI—is a central practical focus, since preventing catastrophe preserves humanity's long-run potential.
- Critics argue that longtermism can be speculative, may neglect present-day harms, and carries risks of misuse to justify radical or harmful actions.
- The article situates longtermism within broader debates in population ethics, moral philosophy, and global priorities research.
Cited by 3 pages
| Page | Type | Quality |
|---|---|---|
| EA and Longtermist Wins and Losses | -- | 53.0 |
| Longtermism's Philosophical Credibility After FTX | -- | 50.0 |
| Nick Beckstead | Person | 60.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 90 KB
# Longtermism
Philosophical view which prioritises the long-term future
[Figure: Infographic comparing the number of humans in the past (red), present (green), and next 800,000 years (yellow), in a scenario where humanity's population stabilizes at 11 billion with a life expectancy of 88 years[\[1\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-1)]
**Longtermism** is the [ethical view](https://en.wikipedia.org/wiki/Ethics "Ethics") that positively influencing the long-term [future](https://en.wikipedia.org/wiki/Future "Future") is a key moral priority. It is an important concept in [effective altruism](https://en.wikipedia.org/wiki/Effective_altruism "Effective altruism") and a primary motivation for efforts that aim to reduce [existential risks](https://en.wikipedia.org/wiki/Global_catastrophic_risk "Global catastrophic risk") to humanity.[\[2\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:10-2)[\[3\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:9-3)
The key argument for longtermism has been summarized as follows: "[future people](https://en.wikipedia.org/wiki/Future_generations "Future generations") matter morally just as much as people alive today; ... there may well be more people alive in the future than there are in the present or have been in the past; and ... we can positively affect future people's lives."[\[4\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-4)[\[5\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:11-5) These three ideas taken together suggest, to those advocating longtermism, that it is the responsibility of those living now to ensure that future generations get to survive and flourish.[\[5\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:11-5)
## Definition
Philosopher [William MacAskill](https://en.wikipedia.org/wiki/William_MacAskill "William MacAskill") defines _longtermism_ as "the view that positively influencing the longterm future is a key moral priority of our time".[\[2\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:10-2)[\[6\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:04-6):4 He distinguishes it from _strong longtermism_, "the view that positively influencing the longterm future is _the_ key moral priority of our time".[\[7\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:1-7)[\[3\]](https://en.wikipedia.org/wiki/Longtermism#cite_note-:9-3)
In his book _[The Precipice: Existential Risk and the Future of Humanity](https://en.wikipedia.org/wiki/The_Precipice:_Existential_Risk_and_the_Future_of_Humanity "The Precipice: Existential Risk and the Future of Humanity")_, philosop
... (truncated, 90 KB total)
Resource ID: ce051025fd4b6a82 | Stable ID: NWEyY2ZiYT