The Toxic Ideology of Longtermism — Alice Crary, Radical Philosophy (2023)
radicalphilosophy.com/commentary/the-toxic-ideology-of-lo...
A critical philosophy essay from Radical Philosophy offering a left-academic challenge to longtermism; useful for understanding critiques of EA-aligned AI safety framings, though it does not engage deeply with technical safety literature.
Metadata
Importance: 42/100 · opinion piece · commentary
Summary
Alice Crary critiques longtermism as a morally and politically dangerous ideology, arguing it prioritizes speculative future beings over present harms, reinforces techno-utopian power structures, and provides ideological cover for inaction on urgent injustices. The piece engages with Effective Altruism and figures like Nick Bostrom and William MacAskill, contending that longtermism's ethical framework is fundamentally flawed and serves elite interests.
Key Points
- Longtermism's focus on astronomical future populations can rationalize ignoring or deprioritizing present-day suffering and systemic injustice.
- Crary argues longtermism is not a neutral philosophical position but an ideology aligned with Silicon Valley techno-utopianism and concentrated wealth.
- The framework's reliance on speculative modeling and expected-value reasoning is criticized as epistemically overconfident and ethically distorting.
- Longtermism is presented as displacing democratic accountability by empowering a small techno-elite to make civilization-scale decisions.
- The article challenges AI safety discourse that is shaped by longtermist assumptions, urging attention to near-term harms from AI deployment.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Longtermism's Philosophical Credibility After FTX | -- | 50.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 48 KB
The intellectual movement that calls itself longtermism is an outgrowth of Effective Altruism (EA), a utilitarianism-inspired philanthropic programme founded just over a decade ago by Oxford philosophers Toby Ord and William MacAskill. EA, which claims to guide charitable giving to do the ‘most good’ per expenditure of time or money, originally focused on mitigating the effects of poverty in the global South and the treatment of animals in factory farms. [1](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn1) This initially modestly-funded, Oxford-based enterprise soon had satellites in the UK, US and elsewhere in the world, several of which became multi-million-dollar organisations, while the amount of money directed by EA-affiliated groups swelled to over four hundred million dollars annually, with pledges in the tens of billions. [2](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn2) During this period, Ord and MacAskill started using the term ‘longtermism’ to mark a view championed by members of a conspicuous subset of effective altruists, many affiliated with Oxford University’s Future of Humanity Institute. The view is that humanity is at a crossroads at which we may either self-destruct or realise a glorious future, and that we should prioritise responding to threats to the continued existence of human civilisation. The ‘existential risks’ – to use the term introduced by Oxford philosopher and Future of Humanity Institute founder Nick Bostrom [3](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn3) – that longtermists rank as most probable are AI unaligned with liberal values and deadly engineered pathogens. They urge us to combat these risks to make it likelier that humans (or our digitally intelligent descendants) will live on for millions, billions or even trillions of years, surviving until long after the sun has vaporised the earth, by colonising exoplanets.
Ord published a monograph defending a longtermist stance in early 2020, and MacAskill followed suit in the summer of 2022. [4](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn4) Ord’s book received plaudits in high-profile venues, [5](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn5) and MacAskill’s was a best-seller that came with a blitz of largely positive media attention, including a _New Yorker_ profile, a review featured on the cover of _Time_, and an appearance on _The Daily Show_, as well as an endorsement from Elon Musk. [6](https://www.radicalphilosophy.com/commentary/the-toxic-ideology-of-longtermism#fn6) This was the coming out party for a tradition t
... (truncated, 48 KB total)
Resource ID: c6e878699b4d0828 | Stable ID: NDY3NGE3ZT