An Overview of the AI Safety Funding Situation
Author
Stephen McAleese
Credibility Rating
3/5
Good (3): Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: EA Forum
Useful reference for understanding the financial infrastructure of AI safety research as of mid-2023, particularly relevant given the post-FTX collapse reshaping of the funding landscape.
Forum Post Details
Karma
142
Comments
15
Forum
EA Forum
Forum Tags
AI safety · Building effective altruism · Forecasting · Effective altruism funding · Building the field of AI safety
Metadata
Importance: 55/100 · blog post · analysis
Summary
A comprehensive survey of the AI safety funding landscape as of mid-2023, cataloging major philanthropic sources including Open Philanthropy, the FTX Future Fund, and the Long-Term Future Fund. The post maps the distribution of financial resources across AI safety research mechanisms and identifies key institutional players shaping the field's financial ecosystem.
Key Points
- Open Philanthropy remains the dominant funder in AI safety, with the FTX Future Fund's collapse creating a significant funding gap in the ecosystem.
- Emerging funders, including AI companies and academic institutions, are beginning to fill some gaps left by philanthropic contraction.
- The analysis reveals concentration risk in AI safety funding, with a small number of funders controlling a large share of resources.
- Different funding mechanisms (grants, fellowships, research programs) are mapped across the major institutions supporting AI safety work.
- The post identifies underserved areas and potential misalignments between available funding and research priorities in the field.
Cited by 2 pages
| Page | Type | Quality |
|---|---|---|
| Mainstream Era | Historical | 42.0 |
| AI Safety Field Building and Community | Crux | 0.0 |
1 FactBase fact citing this source
| Entity | Property | Value | As Of |
|---|---|---|---|
| Coefficient Giving | Total Funding Raised | 63,600,000 | 2024 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 56 KB
# [An Overview of the AI Safety Funding Situation](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation)
by [Stephen McAleese](https://forum.effectivealtruism.org/users/stephen-mcaleese-1?from=post_header)
Jul 12, 2023 · 18 min read · 15 comments · 142 karma
[AI safety](https://forum.effectivealtruism.org/topics/ai-safety)[Building effective altruism](https://forum.effectivealtruism.org/topics/building-effective-altruism)[Forecasting](https://forum.effectivealtruism.org/topics/forecasting)[Effective altruism funding](https://forum.effectivealtruism.org/topics/effective-altruism-funding)[Building the field of AI safety](https://forum.effectivealtruism.org/topics/building-the-field-of-ai-safety) [Frontpage](https://forum.effectivealtruism.org/about#Finding_content)
[An Overview of the AI Safety Funding Situation](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#)
[Introduction](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Introduction)
[Past work](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Past_work)
[Overview of global AI safety funding](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Overview_of_global_AI_safety_funding)
[Descriptions of major AI safety funds](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Descriptions_of_major_AI_safety_funds)
[Open Philanthropy (Open Phil)](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Open_Philanthropy__Open_Phil_)
[Survival and Flourishing Fund (SFF)](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Survival_and_Flourishing_Fund__SFF_)
[FTX Future Fund](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#FTX_Future_Fund)
[Long-Term Future Fund (LTFF)](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Long_Term_Future_Fund__LTFF_)
[Other sources of funding](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Other_sources_of_funding)
[Foundation Model Taskforce](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Foundation_Model_Taskforce)
[Frontier Model Forum AI safety fund](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Frontier_Model_Forum_AI_safety_fund)
[Superalignment Fast Grants](https://forum.effectivealtruism.org/posts/XdhwXppfqrpPL2YDX/an-overview-of-the-ai-safety-funding-situation#Superalignment_Fast_Grants)
[Vitalik B
... (truncated, 56 KB total)

Resource ID: 80125fcaf04609b8 | Stable ID: MDk0MWZkMT