Longterm Wiki

Longview Philanthropy

longview-philanthropy (E542)
Path: /knowledge-base/organizations/longview-philanthropy/
Page Metadata
{
  "id": "longview-philanthropy",
  "numericId": null,
  "path": "/knowledge-base/organizations/longview-philanthropy/",
  "filePath": "knowledge-base/organizations/longview-philanthropy.mdx",
  "title": "Longview Philanthropy",
  "quality": 45,
  "importance": 37,
  "contentFormat": "article",
  "tractability": null,
  "neglectedness": null,
  "uncertainty": null,
  "causalLevel": null,
  "lastUpdated": "2026-01-29",
  "llmSummary": "Longview Philanthropy is a philanthropic advisory organization founded in 2018 that has directed $140M+ to longtermist causes ($89M+ to AI risk), primarily through UHNW donor advising and managed funds (Frontier AI Fund: $13M raised, $11.1M disbursed to 18 orgs). Funded primarily by Coefficient Giving ($21M+ in grants), it operates advisory services for $1M+/year donors and public funds (ECF, NWPF) with 15-20 staff.",
  "structuredSummary": null,
  "description": "Longview Philanthropy is a philanthropic advisory and grantmaking organization founded in 2018 by Natalie Cargill that has directed over $140 million to longtermist causes. As of late 2025, they have moved $89M+ specifically toward AI risk reduction, $50M+ in 2025 alone, and launched the Frontier AI Fund (raising $13M, disbursing $11.1M to 18 organizations in its first 9 months). Led by CEO Simran Dhaliwal and President Natalie Cargill, Longview operates two legal entities (UK and US) and manages public funds (Emerging Challenges Fund, Nuclear Weapons Policy Fund) alongside bespoke UHNW donor advisory services.",
  "ratings": {
    "novelty": 2.5,
    "rigor": 4.5,
    "actionability": 3,
    "completeness": 6
  },
  "category": "organizations",
  "subcategory": "funders",
  "clusters": [
    "community",
    "ai-safety",
    "governance",
    "biorisks"
  ],
  "metrics": {
    "wordCount": 3480,
    "tableCount": 26,
    "diagramCount": 2,
    "internalLinks": 6,
    "externalLinks": 114,
    "footnoteCount": 0,
    "bulletRatio": 0.12,
    "sectionCount": 48,
    "hasOverview": true,
    "structuralScore": 15
  },
  "suggestedQuality": 100,
  "updateFrequency": 45,
  "evergreen": true,
  "wordCount": 3480,
  "unconvertedLinks": [],
  "unconvertedLinkCount": 0,
  "convertedLinkCount": 0,
  "backlinkCount": 0,
  "redundancy": {
    "maxSimilarity": 16,
    "similarPages": [
      {
        "id": "coefficient-giving",
        "title": "Coefficient Giving",
        "path": "/knowledge-base/organizations/coefficient-giving/",
        "similarity": 16
      },
      {
        "id": "80000-hours",
        "title": "80,000 Hours",
        "path": "/knowledge-base/organizations/80000-hours/",
        "similarity": 15
      },
      {
        "id": "dustin-moskovitz",
        "title": "Dustin Moskovitz (AI Safety Funder)",
        "path": "/knowledge-base/people/dustin-moskovitz/",
        "similarity": 15
      },
      {
        "id": "fli",
        "title": "Future of Life Institute (FLI)",
        "path": "/knowledge-base/organizations/fli/",
        "similarity": 14
      },
      {
        "id": "ltff",
        "title": "Long-Term Future Fund (LTFF)",
        "path": "/knowledge-base/organizations/ltff/",
        "similarity": 14
      }
    ]
  }
}
Entity Data
{
  "id": "longview-philanthropy",
  "type": "organization",
  "title": "Longview Philanthropy",
  "description": "Longview Philanthropy is a philanthropic advisory organization founded in 2018 that has directed $140M+ to longtermist causes ($89M+ to AI risk), primarily through UHNW donor advising and managed funds (Frontier AI Fund: $13M raised, $11.1M disbursed to 18 orgs). Funded primarily by Coefficient Givi",
  "tags": [],
  "relatedEntries": [],
  "sources": [],
  "lastUpdated": "2026-02",
  "customFields": []
}
Canonical Facts (0)

No facts for this entity

External Links

No external links

Backlinks (0)

No backlinks

Frontmatter
{
  "title": "Longview Philanthropy",
  "description": "Longview Philanthropy is a philanthropic advisory and grantmaking organization founded in 2018 by Natalie Cargill that has directed over $140 million to longtermist causes. As of late 2025, they have moved $89M+ specifically toward AI risk reduction, $50M+ in 2025 alone, and launched the Frontier AI Fund (raising $13M, disbursing $11.1M to 18 organizations in its first 9 months). Led by CEO Simran Dhaliwal and President Natalie Cargill, Longview operates two legal entities (UK and US) and manages public funds (Emerging Challenges Fund, Nuclear Weapons Policy Fund) alongside bespoke UHNW donor advisory services.",
  "sidebar": {
    "order": 6
  },
  "quality": 45,
  "llmSummary": "Longview Philanthropy is a philanthropic advisory organization founded in 2018 that has directed $140M+ to longtermist causes ($89M+ to AI risk), primarily through UHNW donor advising and managed funds (Frontier AI Fund: $13M raised, $11.1M disbursed to 18 orgs). Funded primarily by Coefficient Giving ($21M+ in grants), it operates advisory services for $1M+/year donors and public funds (ECF, NWPF) with 15-20 staff.",
  "lastEdited": "2026-01-29",
  "importance": 37.5,
  "update_frequency": 45,
  "ratings": {
    "novelty": 2.5,
    "rigor": 4.5,
    "actionability": 3,
    "completeness": 6
  },
  "clusters": [
    "community",
    "ai-safety",
    "governance",
    "biorisks"
  ],
  "subcategory": "funders",
  "entityType": "organization"
}
Raw MDX Source
---
title: Longview Philanthropy
description: Longview Philanthropy is a philanthropic advisory and grantmaking organization founded in 2018 by Natalie Cargill that has directed over $140 million to longtermist causes. As of late 2025, they have moved $89M+ specifically toward AI risk reduction, $50M+ in 2025 alone, and launched the Frontier AI Fund (raising $13M, disbursing $11.1M to 18 organizations in its first 9 months). Led by CEO Simran Dhaliwal and President Natalie Cargill, Longview operates two legal entities (UK and US) and manages public funds (Emerging Challenges Fund, Nuclear Weapons Policy Fund) alongside bespoke UHNW donor advisory services.
sidebar:
  order: 6
quality: 45
llmSummary: "Longview Philanthropy is a philanthropic advisory organization founded in 2018 that has directed $140M+ to longtermist causes ($89M+ to AI risk), primarily through UHNW donor advising and managed funds (Frontier AI Fund: $13M raised, $11.1M disbursed to 18 orgs). Funded primarily by Coefficient Giving ($21M+ in grants), it operates advisory services for $1M+/year donors and public funds (ECF, NWPF) with 15-20 staff."
lastEdited: "2026-01-29"
importance: 37.5
update_frequency: 45
ratings:
  novelty: 2.5
  rigor: 4.5
  actionability: 3
  completeness: 6
clusters:
  - community
  - ai-safety
  - governance
  - biorisks
subcategory: funders
entityType: organization
---
import {DataInfoBox, Mermaid, EntityLink} from '@components/wiki';

## Quick Assessment

| Dimension | Assessment | Evidence |
|-----------|------------|----------|
| **Scale** | Major | [\$140M+ moved since 2018](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1); \$89M+ to AI risk reduction; [\$50M+ in 2025 alone](https://www.longview.org/grantmaking/) |
| **Role** | Advisory + Grantmaking | UHNW donor advising (\$1M+/year clients), public funds (ECF, NWPF), private funds (Frontier AI Fund) |
| **Focus** | Longtermist x-risk | AI safety (primary), biosecurity, nuclear weapons policy |
| **Team** | ≈15-20 staff | London HQ + US remote; led by Simran Dhaliwal (CEO) and Natalie Cargill (President) |
| **Key Funders** | <EntityLink id="E521">Coefficient Giving</EntityLink> (\$21M+ in grants), Justin Rockefeller, Crowley family | Major grants in 2023 (\$4M) and 2024 (\$16M) |
| **Independence** | [Separate legal entities](https://www.longview.org/organisational-structure/) | Longview Inc. Ltd (UK) and Longview Philanthropy USA Inc. (US); formerly EV project |

## Organization Details

| Attribute | Details |
|-----------|---------|
| **Full Name** | Longview Philanthropy |
| **Type** | Philanthropic advisory and grantmaking organization |
| **Founded** | 2018 by [Natalie Cargill](https://www.longview.org/about/natalie-cargill/) |
| **Leadership** | Simran Dhaliwal (CEO), Natalie Cargill (President & Founder) |
| **Total Directed** | \$140M+ since founding; \$89M+ to AI safety; [\$50M+ in 2025](https://www.longview.org/grantmaking/) |
| **Legal Structure** | [Longview Inc. Ltd](https://www.longview.org/organisational-structure/) (UK, company 14444004) + Longview Philanthropy USA Inc. (US, EIN 93-2664730) |
| **Location** | London HQ (UK team) + distributed US team |
| **Key Funders** | <EntityLink id="E521">Coefficient Giving</EntityLink>, Justin Rockefeller, Martin & Tom Crowley, Likith Govindaiah, Rafael Albert, Ben Delo |
| **Website** | [longview.org](https://www.longview.org/) |
| **Status** | Independent nonprofit (formerly Effective Ventures project) |

## Overview

[Longview Philanthropy](https://www.longview.org/) is a philanthropic advisory and grantmaking organization founded in 2018 by Natalie Cargill that has become one of the most significant funders and donor advisors in the longtermist ecosystem. As of late 2025, the organization has [directly influenced or moved over \$140 million](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) toward reducing existential risk, with [\$89 million specifically directed to AI risk reduction](https://www.longview.org/artificial-intelligence/) and \$50 million moved in 2025 alone supporting more than 50 projects.

The organization operates across three interconnected modes:

1. **UHNW Donor Advisory**: Bespoke end-to-end services for donors giving over \$1 million annually, including education, expert introductions, grant recommendations, due diligence, and impact assessment
2. **Public Fund Management**: Operating the [Emerging Challenges Fund](https://www.longview.org/fund/emerging-challenges-fund/) (open to all donors) and the [Nuclear Weapons Policy Fund](https://www.longview.org/fund/nuclear-weapons-policy-fund/)
3. **Private Grantmaking**: Managing the [Frontier AI Fund](https://www.longview.org/fund/frontier-ai-fund/) (raised \$13M, disbursed \$11.1M in first 9 months) and bespoke donor-advised grants

Longview fills a critical niche in the longtermist funding landscape by serving donors who want more personalized guidance than pooled funds provide but lack the capacity for independent evaluation. Unlike many EA organizations, Longview has its [operational costs fully funded by a group of philanthropists](https://www.longview.org/about/) who have no influence over grant recommendations, preserving its independence. The organization transitioned from being a project within Effective Ventures Foundation to [operating as independent legal entities](https://www.longview.org/organisational-structure/) in both the UK (Longview Inc. Ltd) and US (Longview Philanthropy USA Inc.).

Longview's core focus areas are AI safety and governance (primary), biosecurity (strengthening defense-focused biotechnologies), and nuclear weapons policy (opposing destabilizing systems and arms races). Their work aims to reduce the risk of global catastrophe from emerging technologies and ensure future generations inherit a safe world.

## Founding and Leadership

### Natalie Cargill (Founder & President)

[Natalie Cargill](https://www.longview.org/about/natalie-cargill/) founded Longview Philanthropy in 2018 after leaving a career in human rights law. She holds a double first-class degree in English Language and Literature from the University of Oxford, where she was awarded the highest score across all humanities subjects at Lincoln College.

| Period | Role | Focus |
|--------|------|-------|
| Pre-2018 | Barrister, [Serjeants' Inn Chambers](https://www.ted.com/speakers/natalie_cargill) | Human rights law |
| Earlier | UN Project Officer, Legal Resources Centre | International human rights |
| 2018-present | Founder & President, Longview | Major donor philanthropy |

Cargill has argued that the top one percent of society should donate 10% of their wealth to address existential challenges. She delivered a [TED talk on effective philanthropy](https://www.ted.com/speakers/natalie_cargill) in April 2023 and has presented at the University of Cambridge, King's College London, and Web Summit. She co-edited [The Long View: Essays on Policy, Philanthropy, and the Long-term Future](https://philarchive.org/rec/CARTLV-2) with Tyler M. John.

### Simran Dhaliwal (CEO)

[Simran Dhaliwal](https://www.longview.org/about/simran-dhaliwal/) serves as CEO, coordinating Longview's research, grantmaking, and advising work. Before joining Longview, she was a research analyst at Goldman Sachs, working on a team recognized as the [best sell-side stockpickers in London in 2018](https://www.givingwhatwecan.org/blog/podcast-simran-dhaliwal). She studied at the University of Oxford and previously worked as a mathematics teacher through Teach First.

### Key Staff

| Name | Role | Background |
|------|------|------------|
| **Gavin** | Senior Leadership | Organizational strategy, AI grantmaking oversight |
| **Carl Robichaud** | Nuclear Weapons Policy Lead | Former Carnegie Corporation (\$30M+ annual nuclear security grantmaking), The Century Foundation |
| **Aidan O'Gara** | AI Grantmaking | Former <EntityLink id="E153">GovAI</EntityLink>, Epoch, <EntityLink id="E47">CAIS</EntityLink> |
| **Zach Freitas-Groff** | AI Grantmaking | PhD Economics, Stanford |
| **Page** | Programme Director | Works with CEO on strategic priorities |
| **Katie** | Operations & Events | Systems and events management |
| **Alysha** | Advisory & Content | Philanthropist relations, events |
| **Matthew** | Nuclear Policy Research | Grant investigations |

The organization maintains teams in London (UK headquarters) and remotely across the US. Notably, [many senior staff have signed the Giving What We Can Pledge](https://www.longview.org/about/), donating at least 10% of their income to the kinds of projects they recommend.

## Services

<Mermaid chart={`
flowchart TD
    DONORS[Major Donors] --> LONG[Longview Philanthropy]

    LONG --> ADVISE[Donor Advisory<br/>Personalized guidance]
    LONG --> GRANTS[Grantmaking<br/>Direct funding]
    LONG --> RESEARCH[Research<br/>Landscape analysis]

    ADVISE --> REC[Recommendations<br/>Tailored opportunities]
    GRANTS --> ORGS[Recipient Organizations]
    RESEARCH --> INSIGHTS[Funding Insights]

    style DONORS fill:#e6f3ff
    style LONG fill:#ccffcc
    style ADVISE fill:#ffffcc
    style GRANTS fill:#ffffcc
    style RESEARCH fill:#ffffcc
`} />

### UHNW Donor Advisory

[Longview's primary service](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) is advising ultra-high-net-worth donors (giving more than \$1 million annually) who want to maximize impact on existential risk reduction:

| Service | Description | Typical Output |
|---------|-------------|----------------|
| **Bespoke Education** | Tailored briefings on AI, biosecurity, nuclear risk | Multi-session learning programs |
| **Expert Introductions** | Connections to researchers, policymakers, peer philanthropists | Curated meetings and dinners |
| **Grant Recommendations** | Researched, prioritized giving opportunities | Ranked list with rationale |
| **Due Diligence** | Deep investigation of organizations | Detailed assessment reports |
| **Grant Logistics** | Transfer execution and tax optimization | Seamless donation processing |
| **Impact Assessment** | Ongoing monitoring and outcome reporting | Bi-annual updates for major donors |

Everything is [provided free of charge](https://www.longview.org/about/) with no commission or fees, as operational costs are covered by a separate group of funders who have no influence over recommendations.

### HNW Donor Services (\$100K+)

In 2025, Longview expanded to offer services for high-net-worth donors giving [\$100,000 or more per year to AI safety](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1):

| Offering | Description |
|----------|-------------|
| **Top AI Grant Recommendations** | Access to the AI grantmaking team's prioritized list |
| **Frontier AI Fund Access** | Participation in Longview's private AI fund |
| **Group Sessions** | Educational dinners with AI presentations and Q&A |
| **Expert Briefings** | Direct access to AI researchers and policy experts |

Longview hosted [group sessions for finance professionals (December 2024)](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) and employees of a tech company (February 2025) to introduce potential donors to AI safety philanthropy.

### Grantmaking Programs

Longview operates several funds with distinct focus areas:

| Fund | Focus | Status | Key Metrics |
|------|-------|--------|-------------|
| **[Frontier AI Fund](https://www.longview.org/fund/frontier-ai-fund/)** | AI safety research, policy, field-building | Private (\$100K+ donors) | [\$13M raised, \$11.1M disbursed](https://www.longview.org/fund/frontier-ai-fund/) to 18 orgs (Dec 2024-Sep 2025) |
| **[Emerging Challenges Fund](https://www.longview.org/fund/emerging-challenges-fund/)** | AI, biosecurity, nuclear (GCR broadly) | Public (open to all) | [2,000+ donors](https://forum.effectivealtruism.org/posts/ZuRmENHzmivjxNafG/longview-s-emerging-challenges-fund-can-effectively-absorb); 2024: EU AI Act Code of Practice orgs |
| **[Nuclear Weapons Policy Fund](https://www.longview.org/fund/nuclear-weapons-policy-fund/)** | Nuclear risk reduction | Public | Led by Carl Robichaud (former Carnegie Corporation) |
| **[Digital Sentience Fund](https://www.longview.org/fund/digital-sentience-fund/)** | AI consciousness research | Public | Career transition fellowships available |

Contributors to the Frontier AI Fund and Nuclear Weapons Policy Fund receive reports every six months detailing grants made, reasoning, and program updates.

### Research and Analysis

| Output | Description | Examples |
|--------|-------------|----------|
| **Annual Reports** | Comprehensive grantmaking summaries | [2025 ECF Annual Report](https://www.longview.org/fund/emerging-challenges-fund/), 2024 Report |
| **Landscape Mapping** | Identifying funding gaps and opportunities | AI governance funding needs |
| **Grant Reports** | Detailed reasoning for specific grants | Published on fund pages |
| **Donor Intelligence** | Understanding philanthropic flows in x-risk | Internal analysis shared with donors |

### Community Building and Retreats

Longview runs [donor community programs](https://www.longview.org/community/) bringing together philanthropists, researchers, and policymakers:

| Event Type | Description | Recent Examples |
|------------|-------------|-----------------|
| **Annual Retreat** | Multi-day gathering of philanthropists and experts | Talks from Coefficient Giving, FHI, DeepMind |
| **Nordic AI Retreat** | Regional focused event co-hosted with Astralis Foundation | [Stockholm 2025](https://www.longview.org/community/): 25 Nordic philanthropists with frontier AI lab leaders |
| **Donor Dinners** | Small group sessions with expert presentations | Finance professionals (Dec 2024), tech company (Feb 2025) |
| **Expert Briefings** | Targeted educational sessions | AI safety crash courses for new donors |

One attendee noted: "The conversations and contacts from this workshop significantly accelerated my understanding of the issues, and helped to advance my organisation's efforts to address global existential risks."

## Notable Grantees and Funding Areas

### AI Safety Grantmaking (2023-2025)

[Longview's AI program](https://www.longview.org/artificial-intelligence/) funds "interventions most likely to shape the trajectory of advanced AI for the better," including technical research, policy development, and field-building:

| Grantee | Focus | Grant Details |
|---------|-------|---------------|
| **[METR](https://www.longview.org/artificial-intelligence/)** | AI capability evaluation | Evaluated GPT-5, Claude 4 before release; tests dangerous capabilities |
| **[Center for Human-Compatible AI (CHAI)](https://www.longview.org/artificial-intelligence/)** | Technical AI safety | UC Berkeley; trains AI safety PhDs |
| **[FAR AI](https://www.longview.org/fund/emerging-challenges-fund-december-2023-grants-report/)** | Robustness, value alignment | December 2023 ECF grant |
| **[Panoplia Laboratories](https://www.longview.org/fund/emerging-challenges-fund/)** | AI biosecurity research | Assessing AI misuse potential |
| **[Harvard AI Interpretability](https://www.givingwhatwecan.org/charities/emerging-challenges-fund/longtermism-fund-august-2023-grants-report)** | Interpretability research (Wattenberg & Viégas) | \$110,000 (August 2023) |
| **[Alignment Research Center](https://www.givingwhatwecan.org/charities/emerging-challenges-fund/longtermism-fund-august-2023-grants-report)** | Evaluations project | \$220,000 (August 2023) |
| **EU AI Act Organizations** | Code of Practice | [Over 50% of 2024 ECF allocation](https://forum.effectivealtruism.org/posts/PBonnCzpMjij4X3Mr/announcing-the-emerging-challenges-fund-s-2024-report) |

### Biosecurity Grantmaking

| Grantee | Focus | Grant Details |
|---------|-------|---------------|
| **[NTI Biosecurity](https://www.givingwhatwecan.org/charities/emerging-challenges-fund/longtermism-fund-august-2023-grants-report)** | Disincentivizing state bio-weapons | \$100,000 |
| **[Blueprint Biosecurity](https://www.longview.org/fund/emerging-challenges-fund-december-2023-grants-report/)** | Far-UVC safety research | \$50,000 (December 2023) |
| **[CCDD (Harvard)](https://www.givingwhatwecan.org/charities/emerging-challenges-fund/longtermism-fund-august-2023-grants-report)** | Communicable disease dynamics | \$80,000 |

### Nuclear Weapons Policy Grantmaking

Carl Robichaud leads Longview's nuclear program after more than a decade at Carnegie Corporation, where he worked on \$30M+ in annual nuclear security grantmaking:

| Grantee | Focus | Grant Details |
|---------|-------|---------------|
| **[Carnegie Endowment (CEIP)](https://www.givingwhatwecan.org/charities/emerging-challenges-fund/longtermism-fund-august-2023-grants-report)** | Nuclear policy research | \$52,000 |
| **US-China Dialogue Projects** | AI and arms control | 2025 ECF priority |
| **Government Talent Pipelines** | Nuclear security capacity | 2025 ECF priority |

## Funding Sources and Financial Scale

### Coefficient Giving Relationship

<EntityLink id="E521">Coefficient Giving</EntityLink> is Longview's largest institutional funder:

| Grant | Amount | Period | Purpose |
|-------|--------|------|---------|
| **[General Support 2024](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support-october-2024/)** | \$15,961,273 | October 2024 | Operational costs |
| **[General Support 2023](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support/)** | ≈\$4,020,258 | 2023 | Operational costs |
| **[Nuclear Security](https://www.openphilanthropy.org/grants/longview-philanthropy-nuclear-security-grantmaking/)** | \$500,000 | 2 years | Carl Robichaud's program |
| **[OECD AI Policy](https://www.openphilanthropy.org/grants/longview-philanthropy-ai-policy-development-at-the-oecd/)** | ≈\$770,076 | N/A | AI policy development |
| **Total Coefficient Giving Funding** | \$21M+ | 2023-2024 | |

### Other Key Funders

[Current funders](https://www.longview.org/about/) include:

| Funder | Notes |
|--------|-------|
| **Justin Rockefeller** | Great-great grandson of John D. Rockefeller |
| **Martin Crowley** | Private philanthropist |
| **Tom Crowley** | Private philanthropist |
| **Likith Govindaiah** | Private philanthropist |
| **Rafael Albert** | Private philanthropist |
| **Ben Delo** | BitMEX co-founder |

### Grantmaking Scale (2018-2025)

| Metric | Value | Source |
|--------|-------|--------|
| **Total Directed** | \$140M+ | [EA Forum](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) |
| **AI Risk Reduction** | \$89M+ | [Longview AI](https://www.longview.org/artificial-intelligence/) |
| **2025 Grantmaking** | \$50M+ | [Grantmaking page](https://www.longview.org/grantmaking/) |
| **Projects (2025)** | 50+ | Longview reports |
| **Frontier AI Fund (FAIF), first 9 months** | \$13M raised, \$11.1M disbursed | [FAIF page](https://www.longview.org/fund/frontier-ai-fund/) |
| **FAIF Organizations** | 18 | FAIF report |

## Target Donors

Longview works primarily with:

| Donor Type | Description | Services Offered |
|------------|-------------|------------------|
| **Ultra-High-Net-Worth** | \$1M+/year giving capacity | Full bespoke advisory |
| **High-Net-Worth (AI)** | \$100K+/year to AI safety | FAIF access, grant recs |
| **Tech Founders** | Post-liquidity entrepreneurs | Education + recommendations |
| **Institutional Donors** | Foundations seeking x-risk focus | Strategic consulting |
| **General Public** | Any amount | ECF, NWPF donations |

### Why Donors Work with Longview

| Reason | Description |
|--------|-------------|
| **Expertise** | Deep cause area knowledge from dedicated research team |
| **Personalization** | Tailored recommendations based on donor values and capacity |
| **Time Savings** | Professional due diligence eliminates donor research burden |
| **Independence** | [No commission or fees](https://www.longview.org/about/); operational costs separately funded |
| **Access** | Connections to top researchers, labs, and policymakers |
| **Trust** | [GWWC evaluation](https://forum.effectivealtruism.org/posts/PTHskHoNpcRDZtJoh/gwwc-s-evaluations-of-evaluators): "solid grantmaking processes" |

## Relationship to Other Funders

### Position in the Longtermist Funding Ecosystem

| Funder | Relationship with Longview |
|--------|---------------------------|
| **<EntityLink id="E521">Coefficient Giving</EntityLink>** | Primary operational funder (\$21M+); complementary grantmaking serving different donor types |
| **SFF (Survival and Flourishing Fund)** | Parallel funder; Longview targets gaps SFF doesn't fill |
| **LTFF (Long-Term Future Fund)** | May recommend to clients; serves smaller donors |
| **Founders Pledge** | Similar model but different donor base; some staff crossover (Christian Ruhl) |
| **Manifund** | Regranting platform; complementary mechanism |

### How Longview Complements Other Funders

Longview explicitly [targets grants that major donors like Coefficient Giving are unwilling or unable to make](https://forum.effectivealtruism.org/posts/ZuRmENHzmivjxNafG/longview-s-emerging-challenges-fund-can-effectively-absorb), minimizing displacement effects:

| Niche | Longview's Role |
|-------|-----------------|
| **Speed** | Can move faster than large foundations on time-sensitive opportunities |
| **Political Funding** | Greater flexibility for advocacy and political work |
| **Small Grants** | Makes grants too small to clear Coefficient Giving's cost-effectiveness threshold |
| **Donor Activation** | Brings new capital into longtermism that wouldn't otherwise flow |
| **International** | US-China dialogue, Nordic philanthropy mobilization |

### Ecosystem Diagram

<Mermaid chart={`
flowchart LR
    subgraph Large["Major Foundations"]
        OP[Coefficient Giving<br/>Moskovitz/Tuna]
        SFF[SFF<br/>Tallinn]
    end

    subgraph Advisory["Advisory Organizations"]
        LONG[Longview<br/>UHNW advisory]
        FP[Founders Pledge<br/>Founder advisory]
    end

    subgraph Pools["Pooled Funds"]
        LTFF[LTFF]
        MF[Manifund]
        ECF[ECF<br/>Longview-managed]
    end

    subgraph New["New Donor Capital"]
        TECH[Tech Founders]
        TRAD[Traditional Philanthropists]
    end

    OP -->|funds operations| LONG
    LONG -->|advises| TECH
    LONG -->|advises| TRAD
    TECH -->|gives through| ECF
    TECH -->|gives to| DIRECT[Direct Grantees]
    LONG -->|may recommend| LTFF

    style LONG fill:#ccffcc
    style ECF fill:#ccffcc
`} />

## Methodology

### Grant Evaluation Framework

Longview applies an [ITN (Importance, Tractability, Neglectedness) framework](https://www.longview.org/grantmaking/) adapted for existential risk:

| Criterion | Description | Key Questions |
|-----------|-------------|---------------|
| **Impact Potential** | Expected value of success weighted by probability | What's the upside if this works? How likely is success? |
| **Neglectedness** | Funding gap relative to optimal allocation | Would this get funded anyway? By whom? |
| **Tractability** | Whether additional funding helps | Can money solve this? What's the bottleneck? |
| **Team Quality** | Organizational capacity and track record | Have they delivered before? Do they have the right skills? |
| **Counterfactual** | What happens without this grant | What's the marginal value of Longview's funding? |
| **Speed Sensitivity** | Time-criticality of the opportunity | Does waiting cost impact? |

### Due Diligence Process

| Stage | Activities | Timeline |
|-------|------------|----------|
| **Screening** | Initial opportunity review, quick assessment | Days |
| **Investigation** | Deep organizational analysis, financials review | 1-4 weeks |
| **Expert Consultation** | Domain expert calls, reference checks | 1-2 weeks |
| **Site Visits** | In-person meetings when warranted | As needed |
| **Internal Review** | Team discussion, challenge process | 1 week |
| **Recommendation** | Final analysis and ranking for donor | Ongoing |
| **Post-Grant Monitoring** | Progress reports, outcome assessment | 6-month cycles |

### Grantmaking Philosophy

[According to GWWC's evaluation](https://forum.effectivealtruism.org/posts/PTHskHoNpcRDZtJoh/gwwc-s-evaluations-of-evaluators): "Longview has solid grantmaking processes in place to find highly cost-effective funding opportunities. In the grants we evaluated, we generally saw these processes working as intended."

Longview emphasizes:
- **Flexibility**: Can fund advocacy, political work, and controversial areas major foundations avoid
- **Speed**: Faster decision-making than large institutional funders
- **Gap-filling**: Explicitly targets opportunities Coefficient Giving and others can't or won't fund
- **Donor alignment**: Recommendations tailored to individual donor values and risk tolerance

## Critical Assessment

### Strengths

| Strength | Evidence |
|----------|----------|
| **Deep Expertise** | Dedicated teams for AI, nuclear, biosecurity; staff from Carnegie, GovAI, Epoch |
| **Personalization** | Bespoke advisory for UHNW donors vs. one-size-fits-all recommendations |
| **Independence** | [Operational costs separately funded](https://www.longview.org/about/); no commission model |
| **Network Access** | Retreats with Coefficient Giving, FHI, DeepMind; Nordic philanthropist convening |
| **Research Quality** | [GWWC: "solid grantmaking processes"](https://forum.effectivealtruism.org/posts/PTHskHoNpcRDZtJoh/gwwc-s-evaluations-of-evaluators) |
| **Gap-Filling** | Targets opportunities too small or too controversial for Coefficient Giving |
| **Speed** | Faster than institutional foundations on time-sensitive opportunities |

### Limitations

| Limitation | Context |
|------------|---------|
| **Scale** | \$50M/year vs. Coefficient Giving's \$500M+; still building capacity |
| **Donor Concentration** | Dependent on small number of UHNW clients |
| **Coefficient Dependency** | \$21M+ in operational support from Coefficient Giving |
| **Visibility** | Less public profile than Coefficient Giving, SFF, or LTFF |
| **Staff Size** | ≈15-20 staff limits simultaneous investigations |

### Criticisms and Controversies

**Governance History**: Longview was [previously a project within Effective Ventures Foundation](https://forum.effectivealtruism.org/posts/o6LNeNoHBA7Bv9kGE/bad-omens-in-current-ea-governance), the legal entity housing CEA, 80,000 Hours, and EA Funds. A December 2022 post raised concerns about EVF's governance structure. Longview has since [transitioned to independent legal entities](https://www.longview.org/organisational-structure/).

**Political Expertise Concerns**: A [June 2025 EA Forum post](https://forum.effectivealtruism.org/posts/fveRS7iTK83KKW3my/political-funding-expertise-post-6-of-7-on-ai-governance) raised concerns about grantmaker expertise in political advocacy:

> "In June 2025, Longview Philanthropy advertised an opening for an AI policy expert. However, these efforts are too modest and too recent to fully address the problem. It takes more than one or two experts to adequately evaluate an entire field's worth of advocacy proposals."

The post recommended that Longview ensure new positions are filled by people with advocacy backgrounds, not just AI governance research experience.

**Digital Sentience Fund Skepticism**: One [EA Forum commenter noted](https://forum.effectivealtruism.org/posts/s8gS2Lh39usPJgJeL/should-you-donate-through-funds): "I do not endorse Longview's Digital Sentience Fund... I expect it'll fund misc empirical and philosophical 'digital sentience' work plus unfocused field-building."

### The Donor Advisory Model: Benefits and Risks

| Dimension | Benefits | Risks |
|-----------|----------|-------|
| **Donor Activation** | Brings new capital into longtermism | May create dependency on advisor recommendations |
| **Personalization** | Aligns grants with donor values | Could fragment funding toward idiosyncratic preferences |
| **Independence** | No commission means no incentive to maximize AUM | Heavy reliance on Coefficient Giving funding creates potential conflicts |
| **Speed** | Faster than institutional foundations | Less time for thorough due diligence |
| **Flexibility** | Can fund political/controversial work | Less public accountability than traditional foundations |

## How to Engage with Longview

### For Donors

| Giving Level | How to Engage | Contact |
|--------------|---------------|---------|
| **\$1M+/year** | Full bespoke advisory services | [Contact page](https://www.longview.org/contact/) |
| **\$100K+/year (AI)** | AI grant recommendations, FAIF access | [AI advisory signup](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) |
| **Any amount** | Donate to [ECF](https://www.longview.org/fund/emerging-challenges-fund/) or [NWPF](https://www.longview.org/fund/nuclear-weapons-policy-fund/) | Fund pages |
| **Group interest** | Host a donor dinner with AI presentation | Contact for group sessions |

### For Organizations Seeking Funding

Longview does not accept unsolicited funding requests. Organizations are identified through:
- Proactive research by grantmaking teams
- Expert referrals and network recommendations
- Donor-initiated due diligence requests

### For Job Seekers

Longview regularly hires for positions including:
- [AI Grantmaker (Generalist or US AI Policy)](https://www.longview.org/careers/ai-grantmaker-generalist-or-us-ai-policy/)
- [AI Programme Director](https://www.longview.org/careers/ai-programme-director/)
- [Chief Partnerships Officer](https://www.longview.org/careers/chief-partnerships-officer/)
- Research analysts, operations, events

See [80,000 Hours job board](https://jobs.80000hours.org/organisations/longview-philanthropy) for current openings.

## Organizational Timeline

| Year | Milestone |
|------|-----------|
| **2018** | [Natalie Cargill founds Longview](https://www.longview.org/about/natalie-cargill/) after leaving her legal career |
| **2018-2022** | Operates as a project within the Effective Ventures Foundation |
| **2023** | [Transitions to independent legal entities](https://www.longview.org/organisational-structure/) (UK and US) |
| **2023** | [\$4M+ Coefficient Giving general support grant](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support/) |
| **2024** | [ECF exceeds 2,000 donors](https://forum.effectivealtruism.org/posts/ZuRmENHzmivjxNafG/longview-s-emerging-challenges-fund-can-effectively-absorb) |
| **2024** | [Over 50% of ECF allocation to EU AI Act work](https://forum.effectivealtruism.org/posts/PBonnCzpMjij4X3Mr/announcing-the-emerging-challenges-fund-s-2024-report) |
| **Oct 2024** | [\$16M Coefficient Giving general support grant](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support-october-2024/) |
| **Dec 2024** | [Frontier AI Fund launches](https://www.longview.org/fund/frontier-ai-fund/) |
| **2025** | [\$100K+ AI advisory service launches](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1) |
| **2025** | [Nordic AI Retreat with Astralis Foundation](https://www.longview.org/community/) (Stockholm, 25 philanthropists) |
| **Sep 2025** | FAIF reaches \$13M raised, \$11.1M disbursed to 18 organizations |
| **2025** | \$50M+ directed across all programs; \$89M+ cumulative AI funding |

## Sources and External Links

### Official Sources
- [Longview Philanthropy Website](https://www.longview.org/)
- [About Page](https://www.longview.org/about/)
- [Grantmaking](https://www.longview.org/grantmaking/)
- [Organisational Structure](https://www.longview.org/organisational-structure/)
- [AI Program](https://www.longview.org/artificial-intelligence/)
- [Nuclear Weapons Policy](https://www.longview.org/nuclear/)
- [Community](https://www.longview.org/community/)

### Funds
- [Emerging Challenges Fund](https://www.longview.org/fund/emerging-challenges-fund/)
- [Frontier AI Fund](https://www.longview.org/fund/frontier-ai-fund/)
- [Nuclear Weapons Policy Fund](https://www.longview.org/fund/nuclear-weapons-policy-fund/)
- [Digital Sentience Fund](https://www.longview.org/fund/digital-sentience-fund/)

### EA Forum
- [EA Forum Longview Tag](https://forum.effectivealtruism.org/tag/longview-philanthropy)
- [AI Grant Recommendations Announcement](https://forum.effectivealtruism.org/posts/WRGBW8DMNhRGWA9xB/longview-is-now-offering-ai-grant-recommendations-to-donors-1)
- [ECF 2024 Report Announcement](https://forum.effectivealtruism.org/posts/PBonnCzpMjij4X3Mr/announcing-the-emerging-challenges-fund-s-2024-report)
- [ECF Marginal Funding Case](https://forum.effectivealtruism.org/posts/ZuRmENHzmivjxNafG/longview-s-emerging-challenges-fund-can-effectively-absorb)
- [GWWC Evaluations of Evaluators](https://forum.effectivealtruism.org/posts/PTHskHoNpcRDZtJoh/gwwc-s-evaluations-of-evaluators)

### Coefficient Giving Grants
- [General Support 2024 (\$16M)](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support-october-2024/)
- [General Support 2023 (\$4M)](https://www.openphilanthropy.org/grants/longview-philanthropy-general-support/)
- [Nuclear Security Grantmaking](https://www.openphilanthropy.org/grants/longview-philanthropy-nuclear-security-grantmaking/)
- [OECD AI Policy Development](https://www.openphilanthropy.org/grants/longview-philanthropy-ai-policy-development-at-the-oecd/)

### Third-Party Sources
- [Giving What We Can: Longview](https://www.givingwhatwecan.org/charities/longview-philanthropy)
- [Giving What We Can: ECF](https://www.givingwhatwecan.org/charities/emerging-challenges-fund)
- [InfluenceWatch Profile](https://www.influencewatch.org/non-profit/longview-philanthropy/)
- [ProPublica Nonprofit Explorer](https://projects.propublica.org/nonprofits/organizations/932664730)
- [80,000 Hours Job Board](https://jobs.80000hours.org/organisations/longview-philanthropy)

### Leadership
- [Natalie Cargill Bio](https://www.longview.org/about/natalie-cargill/)
- [Natalie Cargill TED](https://www.ted.com/speakers/natalie_cargill)
- [Simran Dhaliwal Bio](https://www.longview.org/about/simran-dhaliwal/)
- [Simran Dhaliwal GWWC Podcast](https://www.givingwhatwecan.org/blog/podcast-simran-dhaliwal)
- [The Long View (book)](https://philarchive.org/rec/CARTLV-2)