Long-Term Future Fund
Credibility Rating
Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: Centre for Effective Altruism
The Long-Term Future Fund is a key funding source for early-career AI safety researchers and independent projects; reviewing its grant history offers insight into what the EA community considers high-priority work in existential risk reduction.
Summary
The Long-Term Future Fund is an Effective Altruism-affiliated grantmaking fund focused on improving humanity's prospects over the long run, particularly by supporting work on reducing existential and catastrophic risks. It funds research, advocacy, and capacity-building projects related to AI safety, biosecurity, and other global priorities. The fund is managed by a committee of EA community members and operates on a rolling grants basis.
Key Points
- Funds projects aimed at reducing existential and global catastrophic risks, with a strong emphasis on AI safety and alignment research.
- Operates as part of the EA Funds platform, allowing individual donors to pool resources toward high-impact long-term causes.
- Supports a broad range of activities including technical research, policy work, field-building, and individual researcher grants.
- Grant decisions are made by a rotating committee of experienced EA and AI safety community members.
- Publishes grant reports detailing funding decisions and rationale, providing transparency into priorities and reasoning.
Cited by 4 pages
| Page | Type | Quality |
|---|---|---|
| AI Safety Research Value Model | Analysis | 60.0 |
| Coefficient Giving | Organization | 55.0 |
| Long-Term Future Fund (LTFF) | Organization | 56.0 |
| Survival and Flourishing Fund | Organization | 59.0 |
Cached Content Preview
[Funds](https://funds.effectivealtruism.org/funds) Long-Term Future Fund
## Long-Term Future Fund
We make grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. We also seek to promote longtermist ideas and to increase the likelihood that future generations will flourish.
- [Donate on every.org](https://www.every.org/ea-long-term-future-fund?donateTo=ea-long-term-future-fund#/donate) (recommended for donors **outside** the UK or Netherlands)
- [Donate on Giving What We Can](https://www.givingwhatwecan.org/charities/long-term-future-fund?utm_source=eafunds) (recommended for donors in the UK or Netherlands)
[Apply for funding](https://av20jp3z.paperform.co/?fund=Long-Term%20Future%20Fund)
## Impact
The Long-Term Future Fund has recommended several million dollars' worth of grants to a range of organizations, including:
#### Created an instruction-generalization benchmark for LLMs
[Read more](https://funds.effectivealtruism.org/funds/far-future#)
#### Built and maintained digital infrastructure for the AI safety ecosystem
[Read more](https://funds.effectivealtruism.org/funds/far-future#)
#### Conducted public and expert surveys on AI governance and forecasting
[Read more](https://funds.effectivealtruism.org/funds/far-future#)
#### Ran a biorisk summit
[Read more](https://funds.effectivealtruism.org/funds/far-future#)
#### Ran an AI safety independent research program
[Read more](https://funds.effectivealtruism.org/funds/far-future#)
## About the fund
The Fund has historically supported researchers in areas such as cause prioritization, existential risk identification and mitigation, and technical research on the development of safe and secure artificial intelligence—where it was among the first funders. Most of our fund managers have built their careers working full time in areas directly relevant to the Fund’s mission.
**The Fund managers can be contacted at longtermfuture\[at\]effectivealtruismfunds.org.**
## Focus areas
The Fund has a broad remit to make grants that promote, implement and advocate for longtermist ideas. Many of our grants aim to address potential risks from advanced artificial intelligence and to build infrastructure and advocate for longtermist projects. However, we welcome applications related to long-term institutional reform or other global catastrophic risks (e.g., pandemics or nuclear conflict). We intend to support:
- Projects that directly contribute to reducing [existential risks](https://www.nickbostrom.com/existential/risks.html) through technical research, policy an
... (truncated, 13 KB total)