Longterm Wiki

Donations List Website

active

Comprehensive documentation of an open-source database tracking $72.8B in philanthropic donations (1969-2023) across 75+ donors, with particular coverage of EA/AI safety funding. The page thoroughly describes the tool's features, data coverage, and limitations, but is purely descriptive reference material about a data tool rather than analysis of AI safety funding patterns.

Organizations

5
Coefficient Giving: Coefficient Giving (formerly Open Philanthropy) has directed $4B+ in grants since 2014, including $336M to AI safety (~60% of external funding). The organization spent ~$50M on AI safety in 2024, with 68% going to evaluations/benchmarking, and launched a $40M Technical AI Safety RFP in 2025 covering 8 research areas.
Machine Intelligence Research Institute (MIRI): One of the oldest organizations focused on AI existential risk, founded in 2000 as the Singularity Institute for Artificial Intelligence (SIAI).
LessWrong: LessWrong is a rationality-focused community blog founded in 2009 that has influenced AI safety discourse, receiving $5M+ in funding and serving as the origin point for ~31% of EA survey respondents in 2014. Survey participation peaked at 3,000+ in 2016, declining to 558 by 2023, with the community increasingly focused on AI alignment discussions.
Rethink Priorities: Rethink Priorities is a research organization founded in 2018 that grew from 2 to ~130 people by 2022, conducting evidence-based analysis across animal welfare, global health, and AI governance. The organization reported influencing >$10M in grants by 2023 but acknowledges significant failures in impact measurement and shifting organizational priorities.
Survival and Flourishing Fund (SFF): SFF has distributed $141M since 2019 (primarily from Jaan Tallinn's ~$900M fortune), with the 2025 round totaling $34.33M (86% to AI safety). It uses the unique S-process mechanism, in which 6-12 recommenders express utility functions and an algorithm allocates grants, favoring projects with enthusiastic champions; the mechanism can produce volatile grant distributions year-to-year.
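To make the "enthusiastic champions" property concrete, here is a minimal illustrative sketch, not SFF's actual S-process implementation: a greedy allocator where each recommender supplies a marginal-utility function per project, and each dollar goes to the project whose *most* enthusiastic recommender values it highest. Scoring by the max across recommenders (rather than the mean) is what lets a single strong champion outweigh broad lukewarm support. All names and numbers here are hypothetical.

```python
# Illustrative sketch only (NOT SFF's actual algorithm): greedy grant
# allocation in the spirit of the S-process. Each recommender provides
# a marginal-utility function per project; projects are scored by the
# MAX across recommenders, so one enthusiastic champion can dominate.

def allocate(budget, marginal_utils, step=1.0):
    """budget: total dollars to distribute.
    marginal_utils: {project: [fn, ...]} where each fn maps dollars
    already granted to that recommender's marginal utility of the next
    dollar. Returns {project: amount granted}."""
    grants = {p: 0.0 for p in marginal_utils}
    remaining = budget
    while remaining >= step:
        # Score each project by its most enthusiastic recommender.
        score = lambda p: max(f(grants[p]) for f in marginal_utils[p])
        best = max(grants, key=score)
        if score(best) <= 0:
            break  # no recommender sees positive marginal value left
        grants[best] += step
        remaining -= step
    return grants

# Toy example: project A has one strong champion (and one indifferent
# recommender); project B has two mildly positive recommenders.
utils = {
    "A": [lambda x: 10 - x, lambda x: 0],
    "B": [lambda x: 4 - x, lambda x: 4 - x],
}
print(allocate(8.0, utils))  # → {'A': 7.0, 'B': 1.0}
```

Note how A, with a single champion, captures most of the budget even though B has more recommenders in favor; averaging across recommenders instead would split the money more evenly.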

People

3
Vipul Naik: Vipul Naik is a mathematician and EA community member who has funded ~$255K in contract research (primarily to Sebastian Sanchez and Issa Rice) and created the Donations List Website tracking $72.8B in philanthropic donations. His main contribution is transparency infrastructure for EA funding patterns and donation tracking.
Issa Rice: Issa Rice is an independent researcher who has created knowledge infrastructure tools such as Timelines Wiki and AI Watch for the EA and AI safety communities. His contributions are primarily data aggregation and reference material rather than original analytical work on AI safety.
Vitalik Buterin: Co-founder of Ethereum and a major philanthropic donor. He has given over $1.5B to charity, including MIRI, SENS Research Foundation, GiveDirectly, and various EA-aligned organizations.

Related Projects

3
Wikipedia Views: This article provides a comprehensive overview of Wikipedia pageview analytics tools and their declining traffic due to AI summaries reducing direct visits. While well-documented, it is primarily about web analytics infrastructure rather than core AI safety concerns.
AI Watch: AI Watch is a tracking database by Issa Rice that monitors AI safety organizations, people, funding, and publications as part of his broader knowledge infrastructure ecosystem. The article provides useful context about Rice's systematic approach to documentation but lacks concrete details about AI Watch's actual scope, methodology, or current operational status.
Timelines Wiki: Timelines Wiki is a specialized MediaWiki project documenting chronological histories of AI safety and EA organizations, created by Issa Rice with funding from Vipul Naik in 2017. While useful as a historical reference source, it primarily serves as documentation infrastructure rather than providing original analytical insight.

Related Wiki Pages

Top Related Pages

Organizations

LessWrong, Rethink Priorities, Survival and Flourishing Fund

Analysis

Wikipedia Views, AI Watch, Timelines Wiki

Other

Vitalik Buterin

Concepts

EA Longtermist Wins and Losses

Clusters

ai-safety
