Oxford-based organization that coordinates the effective altruism movement, running EA Global conferences, supporting local groups, and maintaining the EA Forum.
Analysis

- Anthropic (Funder): Comprehensive model of EA-aligned philanthropic capital at Anthropic. At $380B valuation (Series G, Feb 2026, $30B raised): $27-76B risk-adjusted EA capital expected. Total funding raised exceeds $... (Quality: 65/100)
- Planning for Frontier Lab Scaling: Strategic framework analyzing how non-lab actors could respond to frontier AI labs deploying $100-300B+ pre-TAI. For philanthropies: analysis of potential shifts from matching spend to maximizing l... (Quality: 55/100)
Other
- Nick Beckstead (Person): A philosopher and EA/longtermism figure whose 2013 dissertation formalized longtermist ethics; this article covers his career arc from academic philosopher to Coefficient Giving (... (Quality: 60/100)
- Will MacAskill (Person): Comprehensive biographical reference on Will MacAskill covering his founding of EA organizations, academic work on moral uncertainty and longtermism, AGI preparedness advocacy, and controversies in... (Quality: 60/100)
- Toby Ord (Person): Comprehensive biographical profile of Toby Ord documenting his 10% AI extinction estimate and role founding effective altruism, with detailed tables on risk assessments, academic background, and in... (Quality: 41/100)
- Stuart Russell (Person): Stuart Russell (born 1962) is a British computer scientist and UC Berkeley professor who co-authored the dominant AI textbook 'Artificial Intelligence: A Modern Approach' (used in over 1,500 univer... (Quality: 30/100)
Organizations
- 80,000 Hours: The largest EA career organization, reaching 10M+ readers and reporting 3,000+ significant career plan changes, with 80% of $10M+ funding from Coefficient Giving. Since 2016 they've... (Quality: 45/100)
- EA Global: A series of selective conferences organized by the Centre for Effective Altruism that connects committed EA practitioners to collaborate on global challenges, with AI safety becoming i... (Quality: 38/100)
- LessWrong: A rationality-focused community blog founded in 2009 that has influenced AI safety discourse, receiving $5M+ in funding and serving as the origin point for ~31% of EA survey respondent... (Quality: 44/100)
- MacArthur Foundation: Comprehensive profile of the $9 billion MacArthur Foundation documenting its evolution from 1978 to present, with $8.27 billion in total grants across climate, criminal justice, nuclear threats, an... (Quality: 65/100)
- Long-Term Future Fund (LTFF): A regranting program that has distributed $20M since 2017 (approximately $10M to AI safety) with median grants of $25K, filling a critical niche between personal savings and institutional f... (Quality: 56/100)
- Coefficient Giving: Coefficient Giving (formerly Open Philanthropy) has directed $4B+ in grants since 2014, including $336M to AI safety (~60% of external funding). The organization spent ~$50M on AI safety in 2024, w... (Quality: 55/100)
Concepts
- Existential Risk from AI: Hypotheses concerning risks from advanced AI systems that some researchers believe could result in human extinction or permanent global catastrophe, including institutional frameworks developed by ... (Quality: 92/100)
- EA Epistemic Failures in the FTX Era: This page synthesizes post-FTX critiques of EA's epistemic and governance failures, identifying interlocking problems including donor hero-worship, funding concentration in volatile crypto assets, ... (Quality: 84/100)
- FTX Collapse: Lessons for EA Funding Resilience: The November 2022 collapse of FTX resulted in approximately $160M in committed EA grants that were not disbursed, organizational restructuring across the ecosystem, and revealed structural vulnerab... (Quality: 78/100)
- EA Institutions' Response to the FTX Collapse: EA institutions responded to the FTX collapse through public condemnation, funding pauses, and community surveys, but were damaged by revelations that warnings about SBF's conduct had been downplay... (Quality: 53/100)
- Earning to Give: An EA career strategy emphasizing high-income jobs to fund effective charities, which peaked around 2012–2015 before being de-emphasized by major EA organizations; the FTX collap... (Quality: 63/100)
- FTX Red Flags: Pre-Collapse Warning Signs That Were Overlooked: A well-structured retrospective on the FTX collapse identifying six major pre-collapse warning signs (FTT overreliance, fund commingling, governance failures, regulatory vacuum, opaque revenue mode... (Quality: 53/100)
Historical
- The MIRI Era: Comprehensive chronological account of AI safety's institutional emergence (2000-2015), from MIRI's founding through Bostrom's Superintelligence to mainstream recognition. Covers key organizations,... (Quality: 31/100)