Longterm Wiki

Longtermist Funders

funders-overview (E641)
Path: /knowledge-base/organizations/funders-overview/
Page Metadata
{
  "id": "funders-overview",
  "numericId": "E641",
  "path": "/knowledge-base/organizations/funders-overview/",
  "filePath": "knowledge-base/organizations/funders-overview.mdx",
  "title": "Longtermist Funders",
  "quality": 3,
  "importance": 75,
  "contentFormat": "article",
  "tractability": null,
  "neglectedness": null,
  "uncertainty": null,
  "causalLevel": null,
  "lastUpdated": "2026-02-03",
  "llmSummary": null,
  "structuredSummary": null,
  "description": "Overview of major funders supporting AI safety, existential risk reduction, and longtermist causes. These organizations and individuals collectively provide hundreds of millions of dollars annually to research, policy, and field-building efforts aimed at ensuring beneficial AI development.",
  "ratings": null,
  "category": "organizations",
  "subcategory": "funders",
  "clusters": [
    "ai-safety"
  ],
  "metrics": {
    "wordCount": 1188,
    "tableCount": 8,
    "diagramCount": 2,
    "internalLinks": 39,
    "externalLinks": 0,
    "footnoteCount": 0,
    "bulletRatio": 0.06,
    "sectionCount": 15,
    "hasOverview": true,
    "structuralScore": 12
  },
  "suggestedQuality": 80,
  "updateFrequency": 45,
  "evergreen": true,
  "wordCount": 1188,
  "unconvertedLinks": [],
  "unconvertedLinkCount": 0,
  "convertedLinkCount": 0,
  "backlinkCount": 0,
  "redundancy": {
    "maxSimilarity": 13,
    "similarPages": [
      {
        "id": "elon-musk-philanthropy",
        "title": "Elon Musk (Funder)",
        "path": "/knowledge-base/organizations/elon-musk-philanthropy/",
        "similarity": 13
      },
      {
        "id": "vitalik-buterin-philanthropy",
        "title": "Vitalik Buterin (Funder)",
        "path": "/knowledge-base/organizations/vitalik-buterin-philanthropy/",
        "similarity": 12
      },
      {
        "id": "ai-risk-portfolio-analysis",
        "title": "AI Risk Portfolio Analysis",
        "path": "/knowledge-base/models/ai-risk-portfolio-analysis/",
        "similarity": 11
      },
      {
        "id": "coefficient-giving",
        "title": "Coefficient Giving",
        "path": "/knowledge-base/organizations/coefficient-giving/",
        "similarity": 11
      },
      {
        "id": "ltff",
        "title": "Long-Term Future Fund (LTFF)",
        "path": "/knowledge-base/organizations/ltff/",
        "similarity": 11
      }
    ]
  }
}
Entity Data
{
  "id": "funders-overview",
  "type": "organization",
  "title": "Longtermist Funders",
  "tags": [],
  "relatedEntries": [],
  "sources": [],
  "customFields": []
}
Canonical Facts (0)

No facts for this entity

External Links

No external links

Backlinks (0)

No backlinks

Frontmatter
{
  "numericId": "E641",
  "title": "Longtermist Funders",
  "description": "Overview of major funders supporting AI safety, existential risk reduction, and longtermist causes. These organizations and individuals collectively provide hundreds of millions of dollars annually to research, policy, and field-building efforts aimed at ensuring beneficial AI development.",
  "sidebar": {
    "label": "Overview",
    "order": 0
  },
  "quality": 3,
  "lastEdited": "2026-02-03",
  "importance": 75,
  "update_frequency": 45,
  "subcategory": "funders",
  "entityType": "organization"
}
Raw MDX Source
---
numericId: E641
title: Longtermist Funders
description: Overview of major funders supporting AI safety, existential risk reduction, and longtermist causes. These organizations and individuals collectively provide hundreds of millions of dollars annually to research, policy, and field-building efforts aimed at ensuring beneficial AI development.
sidebar:
  label: Overview
  order: 0
quality: 3
lastEdited: "2026-02-03"
importance: 75
update_frequency: 45
subcategory: funders
entityType: organization
---
import {DataInfoBox, Mermaid, EntityLink} from '@components/wiki';

## Overview

Longtermist funders provide critical financial support for organizations working on AI safety, existential risk reduction, and related cause areas. The funding landscape is characterized by a relatively small number of major philanthropists and foundations that provide the majority of resources, with additional support from regranting programs and smaller donors.

The field has seen significant funding growth over the past decade, though it remains small relative to overall AI development spending. The largest recent shift came in 2022-2023, when the FTX collapse eliminated a significant planned funding source; other funders have since partially filled the gap.

## Comprehensive Funder Comparison

### By Annual Giving and Focus Area

| Funder | Annual Giving | AI Safety | Global Health | Science | Education | Other |
|--------|---------------|-----------|---------------|---------|-----------|-------|
| **Gates Foundation** | ≈\$7B | Minimal | \$4B | \$1B | \$500M | \$1B |
| **Wellcome Trust** | ≈\$1.5B | Minimal | \$500M | \$800M | — | \$200M |
| **<EntityLink id="E519">Chan Zuckerberg Initiative</EntityLink>** | ≈\$1B | \$0 | \$200M | \$800M | \$30M | — |
| **Howard Hughes Medical Institute** | ≈\$1B | \$0 | Minimal | \$1B | — | — |
| **<EntityLink id="E521">Coefficient Giving</EntityLink>** | ≈\$700M | **\$65M** | \$300M | \$50M | — | \$285M |
| **<EntityLink id="E535">Hewlett Foundation</EntityLink>** | ≈\$473M | \$8M | — | — | \$100M | \$365M |
| **<EntityLink id="E544">MacArthur Foundation</EntityLink>** | ≈\$260M | Minimal | — | \$50M | — | \$200M |
| **<EntityLink id="E561">Schmidt Futures</EntityLink>** | ≈\$200M | \$5M | — | \$100M | \$50M | \$45M |
| **<EntityLink id="E567">Survival and Flourishing Fund</EntityLink>** | ≈\$35M | **\$30M** | — | — | — | \$5M |
| **<EntityLink id="E543">Long-Term Future Fund</EntityLink>** | ≈\$5-10M | **\$5-10M** | — | — | — | — |
| **<EntityLink id="E547">Manifund</EntityLink>** | ≈\$2-5M | **\$1-3M** | — | — | — | \$1-2M |

### Key Individual Philanthropists

| Person | Net Worth | Annual Giving | AI Safety | Lifetime Total | Primary Vehicle |
|--------|-----------|---------------|-----------|----------------|-----------------|
| **Bill Gates** | ≈\$130B | ≈\$5B | Minimal | \$50B+ | Gates Foundation |
| **<EntityLink id="E410">Elon Musk (Funder)</EntityLink>** | ≈\$400B | ≈\$250M | Minimal | ≈\$8B | Musk Foundation |
| **Mark Zuckerberg** | ≈\$200B | ≈\$1B | \$0 | ≈\$8B | <EntityLink id="E519">CZI</EntityLink> |
| **<EntityLink id="E436">Dustin Moskovitz</EntityLink>** | ≈\$17B | ≈\$700M | **\$65M** | \$4B+ | <EntityLink id="E521">Coefficient Giving</EntityLink> |
| **MacKenzie Scott** | ≈\$35B | ≈\$3-4B | Unknown | \$17B+ | Direct giving |
| **<EntityLink id="E577">Jaan Tallinn</EntityLink>** | ≈\$500M | ≈\$50M | **\$40M+** | \$100M+ | <EntityLink id="E567">SFF</EntityLink>, direct |
| **<EntityLink id="E573">Vitalik Buterin (Funder)</EntityLink>** | ≈\$500M | ≈\$50M | **\$15M+** | **\$800M+** | <EntityLink id="E528">FLI</EntityLink> (\$665M), MIRI, Balvi |
| **Eric Schmidt** | ≈\$25B | ≈\$200M | \$5M | \$1B+ | Schmidt Futures |

### AI Safety Funding Concentration

The AI safety funding landscape is highly concentrated among a few donors:

| Funder | AI Safety (Annual) | % of Total AI Safety Funding |
|--------|-------------------|------------------------------|
| <EntityLink id="E521">Coefficient Giving</EntityLink> | \$65M | ≈55% |
| <EntityLink id="E567">Survival and Flourishing Fund</EntityLink> | \$30M | ≈25% |
| <EntityLink id="E577">Jaan Tallinn</EntityLink> (direct) | \$10M | ≈8% |
| <EntityLink id="E573">Vitalik Buterin</EntityLink> | \$5-15M | ≈5-10% |
| <EntityLink id="E543">Long-Term Future Fund</EntityLink> | \$5-10M | ≈5% |
| Other sources | \$5-10M | ≈5% |
| **Total estimated** | **≈\$120-150M/year** | **100%** |

### Untapped Philanthropic Potential

Several major philanthropists have significant resources but minimal AI safety engagement:

| Person | Net Worth | Current AI Safety | Potential (1% of net worth) |
|--------|-----------|-------------------|----------------------------|
| <EntityLink id="E410">Elon Musk</EntityLink> | \$400B | ≈\$0 | \$4B/year |
| Mark Zuckerberg | \$200B | \$0 | \$2B/year |
| Bill Gates | \$130B | Minimal | \$1.3B/year |
| Larry Ellison | \$230B | \$0 | \$2.3B/year |
| Jeff Bezos | \$200B | \$0 | \$2B/year |

If these five individuals allocated just 1% of their net worth annually to AI safety, it would represent **\$11.6B/year** — roughly **80x current total funding**.
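A quick sanity check of this arithmetic, using the net-worth estimates from the table above (all figures are the table's approximations, not precise data):

```python
# Back-of-the-envelope check of the 1%-of-net-worth scenario above.
# Net worths are the table's rough estimates, in $B.
net_worth_b = {
    "Elon Musk": 400,
    "Mark Zuckerberg": 200,
    "Bill Gates": 130,
    "Larry Ellison": 230,
    "Jeff Bezos": 200,
}

combined = sum(net_worth_b.values())   # 1,160 ($B)
hypothetical = combined * 0.01         # 1% per year -> 11.6 ($B)
current = 0.145                        # midpoint of the ~$120-150M estimate, in $B

print(f"Combined net worth: ${combined}B")
print(f"1% per year: ${hypothetical:.1f}B")
print(f"Multiple of current AI safety funding: ~{hypothetical / current:.0f}x")  # ~80x
```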

## AI Safety Funders (Detailed)

| Organization | Type | Annual Giving (Est.) | Primary Focus | Key Grantees |
|--------------|------|---------------------|---------------|--------------|
| <EntityLink id="E521">Coefficient Giving</EntityLink> | Foundation | \$65M AI safety | Technical alignment, governance, evals | <EntityLink id="E202">MIRI</EntityLink>, <EntityLink id="E557">Redwood</EntityLink>, <EntityLink id="E201">METR</EntityLink>, GovAI |
| <EntityLink id="E567">Survival and Flourishing Fund</EntityLink> | Donor Lottery | \$30M | AI safety, x-risk | MIRI, ARC Evals, SERI, <EntityLink id="E47">CAIS</EntityLink> |
| <EntityLink id="E543">Long-Term Future Fund</EntityLink> | Regranting | \$5-10M | AI safety, x-risk research | Individual researchers, small orgs |
| <EntityLink id="E547">Manifund</EntityLink> | Regranting Platform | \$2-5M | EA causes broadly | Community projects |

## Non-AI-Safety Major Funders

| Organization | Type | Annual Giving | Focus Areas | AI Safety |
|--------------|------|---------------|-------------|-----------|
| **Gates Foundation** | Foundation | \$7B | Global health, poverty, education | Minimal |
| **Wellcome Trust** | Foundation | \$1.5B | Health research, science | Minimal |
| <EntityLink id="E519">Chan Zuckerberg Initiative</EntityLink> | LLC | \$1B | AI-biology, disease cures | \$0 |
| <EntityLink id="E535">Hewlett Foundation</EntityLink> | Foundation | \$473M | Environment, democracy, education | \$8M (cybersecurity) |
| <EntityLink id="E544">MacArthur Foundation</EntityLink> | Foundation | \$260M | Climate, justice, nuclear risk | Minimal |
| <EntityLink id="E561">Schmidt Futures</EntityLink> | LLC | \$200M | Science, AI applications, talent | \$5M |

## AI Safety Funding Landscape

<Mermaid chart={`
flowchart TD
    subgraph AISafetyDonors["AI Safety Donors (≈120M/year)"]
        DM[Dustin Moskovitz<br/>65M/year]
        JT[Jaan Tallinn<br/>50M/year]
        VB[Vitalik Buterin<br/>15M/year]
    end

    subgraph AISafetyVehicles["AI Safety Vehicles"]
        CG[Coefficient Giving<br/>65M AI safety]
        SFF[SFF<br/>30M]
        LTFF[LTFF<br/>5-10M]
        MF[Manifund<br/>2-5M]
    end

    subgraph Recipients["AI Safety Recipients"]
        RESEARCH[Research Orgs<br/>MIRI, Redwood, METR]
        POLICY[Policy & Governance<br/>GovAI, CAIS, RAND]
        FIELD[Field Building<br/>80K, Atlas, SERI]
        EVALS[Evaluations<br/>METR, Epoch]
    end

    DM --> CG
    JT --> SFF
    JT -->|Direct| RESEARCH
    VB -->|Direct| RESEARCH
    CG --> RESEARCH
    CG --> POLICY
    CG --> EVALS
    SFF --> RESEARCH
    SFF --> POLICY
    LTFF --> RESEARCH
    LTFF --> FIELD
    MF --> FIELD

    style AISafetyDonors fill:#e6f3ff
    style AISafetyVehicles fill:#ccffcc
    style Recipients fill:#ffffcc
`} />

## Broader Philanthropy Landscape (For Context)

<Mermaid chart={`
flowchart TD
    subgraph MegaDonors["Mega-Donors (Minimal AI Safety)"]
        GATES[Bill Gates<br/>130B NW, 5B/year]
        MUSK[Elon Musk<br/>400B NW, 250M/year]
        ZUCK[Mark Zuckerberg<br/>200B NW, 1B/year]
        SCOTT[MacKenzie Scott<br/>35B NW, 4B/year]
    end

    subgraph MegaFoundations["Major Foundations"]
        GATESF[Gates Foundation<br/>7B/year]
        CZI[CZI<br/>1B/year]
        WELLCOME[Wellcome Trust<br/>1.5B/year]
        HEWLETT[Hewlett<br/>473M/year]
    end

    subgraph NonAIFocus["Primary Focus Areas"]
        HEALTH[Global Health<br/>5B+/year]
        SCIENCE[Science<br/>3B+/year]
        CLIMATE[Climate<br/>500M+/year]
        EDU[Education<br/>1B+/year]
    end

    GATES --> GATESF
    ZUCK --> CZI
    GATESF --> HEALTH
    GATESF --> SCIENCE
    CZI --> SCIENCE
    WELLCOME --> HEALTH
    WELLCOME --> SCIENCE
    HEWLETT --> CLIMATE
    HEWLETT --> EDU

    style MegaDonors fill:#ffe6e6
    style MegaFoundations fill:#fff0cc
    style NonAIFocus fill:#e6ffe6
`} />

### The Scale Gap

| Category | Annual Funding | Notes |
|----------|----------------|-------|
| **AI Safety (total)** | ≈\$120-150M | Highly concentrated |
| **Gates Foundation alone** | ≈\$7,000M | 50x AI safety total |
| **AI capabilities (industry)** | ≈\$50,000M+ | 400x AI safety total |
| **Global philanthropy** | ≈\$500,000M | 4,000x AI safety total |
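
The round multiples in this table follow from its own figures. A minimal sketch, using the full ≈\$120-150M range as the denominator so each multiple comes out as a range containing the rounded value above:

```python
# Reproduce the "Scale Gap" multiples from the table's own estimates.
# All figures in $M/year; multiples depend on where in the ~$120-150M
# range total AI safety funding actually falls.
comparisons = {
    "Gates Foundation": 7_000,
    "AI capabilities (industry)": 50_000,
    "Global philanthropy": 500_000,
}

for name, annual in comparisons.items():
    low, high = annual / 150, annual / 120
    print(f"{name}: ~{low:,.0f}-{high:,.0f}x AI safety funding")
```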

## Pending Major Funding Sources

### Anthropic-Derived Capital

<EntityLink id="E406">Anthropic (Funder)</EntityLink> represents potentially the largest future source of longtermist philanthropic capital. At Anthropic's current \$350B valuation:

| Source | Estimated Value | Likelihood of EA Allocation | Notes |
|--------|-----------------|-----------------------------|-------|
| Founder pledges (7 founders, 80% of equity) | \$39-59B | 2/7 strongly EA-aligned | Only Dario & Daniela have documented EA connections |
| <EntityLink id="E577">Jaan Tallinn</EntityLink> stake | \$2-6B (conservative) | Very high | Series A lead investor |
| <EntityLink id="E436">Dustin Moskovitz</EntityLink> stake | \$3-9B | Certain | \$500M+ already in nonprofit |
| Employee pledges + matching | \$20-40B | High (in DAFs) | Historical 3:1 matching reduced to 1:1 for new hires |
| **Total risk-adjusted** | **\$25-70B** | — | Wide range reflects cause allocation uncertainty |

**Key uncertainties:**
- Only 2 of 7 founders have documented strong EA connections; assuming roughly equal stakes, ~71% of founder equity may go to non-EA causes
- Donation matching was reduced from 3:1 (on up to 50% of equity) to 1:1 (on up to 25%) for new employees
- IPO timeline: 2026-2027 expected; capital deployment likely 2027-2035

For comparison, this \$25-70B range represents **170-470x current annual AI safety funding** of ≈\$150M. Even if only 10% ultimately reaches EA causes, it would still be transformative.
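
A minimal sketch of how these figures fit together, combining the component ranges from the table above; note the risk adjustment itself is a judgment call about cause-allocation uncertainty, not a computed quantity:

```python
# Combine the component ranges from the table above (all values in $B).
components = {
    "Founder pledges": (39, 59),
    "Jaan Tallinn stake": (2, 6),
    "Dustin Moskovitz stake": (3, 9),
    "Employee pledges + matching": (20, 40),
}
raw_low = sum(lo for lo, _ in components.values())    # 64
raw_high = sum(hi for _, hi in components.values())   # 114

# The page's risk-adjusted total discounts the raw sum for cause-allocation
# uncertainty; the $25-70B figure is a judgment, not a formula.
adj_low, adj_high = 25, 70
current_annual = 0.150  # ~$150M/year current AI safety funding, in $B

print(f"Raw sum of components: ${raw_low}-{raw_high}B")
print(f"Risk-adjusted range: ${adj_low}-{adj_high}B")
print(f"Multiple of annual funding: ~{adj_low / current_annual:.0f}-"
      f"{adj_high / current_annual:.0f}x")  # ~167-467x (text rounds to 170-470x)
```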

See <EntityLink id="E406">Anthropic (Funder)</EntityLink> for comprehensive analysis.

### OpenAI Foundation

The <EntityLink id="E421">OpenAI Foundation</EntityLink> holds 26% of OpenAI, worth approximately \$130B at current valuations. Unlike Anthropic's pledge-based model, the Foundation has direct legal control over these assets. Cause allocation is uncertain—the Foundation's stated mission focuses on "safe AGI" but specific philanthropic priorities are undisclosed.

## Recent Trends

**2024-2025 Developments:**
- Coefficient Giving launched \$40M AI Safety Request for Proposals (January 2025)
- SFF allocated \$34.33M, with 86% going to AI-related projects
- Coefficient Giving (formerly Open Philanthropy) rebranded in November 2025
- LTFF continued steady grantmaking at ≈\$5M annually
- Anthropic founders announced 80% donation pledges (January 2026)

**Post-FTX Landscape:**
- Future Fund's collapse eliminated ≈\$160M in committed grants
- Some organizations faced funding crises; others found alternative support
- Field-wide diversification of funding sources