Longterm Wiki

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: 80,000 Hours

An accessible introduction to longtermism from 80,000 Hours, useful for understanding the philosophical motivation behind the prioritization of AI safety and existential risk work in the effective altruism community.

Metadata

Importance: 62/100 · Tags: blog post, educational

Summary

This 80,000 Hours article introduces and defends longtermism — the view that positively influencing the long-term future is among the most important moral priorities. It explains why the vast number of potential future people gives strong ethical weight to existential risk reduction and civilizational flourishing, and how this framing shapes career and cause prioritization.

Key Points

  • Longtermism holds that future generations matter morally and that their sheer number makes long-run outcomes extraordinarily important.
  • Reducing existential and catastrophic risks is prioritized because these could permanently foreclose positive futures for countless potential people.
  • The article connects longtermism to practical career and cause selection, suggesting work on AI safety, biosecurity, and governance as high-impact paths.
  • It addresses common objections to longtermism, including uncertainty about the future and concerns about neglecting present-day suffering.
  • Longtermism underpins much of 80,000 Hours' prioritization framework and explains why AI safety is considered a top cause area.

Cited by 1 page

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 96 KB
## On this page:

- [Introduction](https://80000hours.org/articles/future-generations/#top)
- [1 The case for longtermism](https://80000hours.org/articles/future-generations/#the-case-for-longtermism)
  - [1.1 1\. We should care about how the lives of future individuals go](https://80000hours.org/articles/future-generations/#1-we-should-care-about-how-the-lives-of-future-individuals-go)
  - [1.2 2\. The number of future individuals whose lives matter could be vast.](https://80000hours.org/articles/future-generations/#2-the-number-of-future-individuals-whose-lives-matter-could-be-vast)
  - [1.3 3\. We have an opportunity to affect how the long-run future goes](https://80000hours.org/articles/future-generations/#opportunity)
  - [1.4 Summing up the arguments](https://80000hours.org/articles/future-generations/#summing-up-the-arguments)
- [2 Objections to longtermism](https://80000hours.org/articles/future-generations/#objections)
- [3 If I don’t agree with 80,000 Hours about longtermism, can I still benefit from your advice?](https://80000hours.org/articles/future-generations/#if-i-dont-agree-with-80000-hours-about-longtermism-can-i-still-benefit-from-your-advice)
- [4 What are the best ways to help future generations right now?](https://80000hours.org/articles/future-generations/#what-are-the-best-ways-to-help-future-generations-right-now)
- [5 Learn more](https://80000hours.org/articles/future-generations/#learn-more)
- [6 Read next](https://80000hours.org/articles/future-generations/#read-next)
  - [6.1 Plus, join our newsletter and we’ll mail you a free book](https://80000hours.org/articles/future-generations/#plus-join-our-newsletter-and-well-mail-you-a-free-book)

![](https://80000hours.org/wp-content/uploads/2023/03/Joshua_Tree_Milky_Way-1800x1030.jpg)[Benjamin Inouye](https://commons.wikimedia.org/wiki/File:Joshua_Tree_Milky_Way.jpg), [CC BY 4.0](https://creativecommons.org/licenses/by/4.0), via Wikimedia Commons

Audio version: "Longtermism: a call to protect future generations" (01:02:33)

Chapter 1: The case for longtermism

- 02:38 The case for longtermism
- 05:08 1. We should care about how the lives of future individuals go
- 07:35 2. The number of future individuals whose lives matter could be vast
- 12:41 3. We have an opportunity to affect how the long-run future goes
- 14:18 Reducing extinction risk
- 16:59 Positive trajectory changes
- 19:06 Longtermist research
- 19:51 Capacity building
- 20:49 Summing up the arguments
- 21:54 Objections to longtermism
- 22:30 Does longtermism mean we should focus on helping future people rather than people who need help today?
- 25:04 Should we systematically discount future value?
- 28:26 How does uncertainty about the future factor into longtermism?
- 31:27 Aren't we just totally clueless about our effects on the future?
- 34:01 What if my actions change the identities of individuals who are born in the future? (The non-identity problem)
- 37:38 But should I care that future generations com

... (truncated, 96 KB total)
Resource ID: bfa3d31fa2cb4f74 | Stable ID: YWZmOTQ0OW