Longterm Wiki

Open Philanthropy: Our Progress in 2023 and Plans for 2024

web

Author

Alexander_Berger

Credibility Rating

3/5
Good(3)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: EA Forum

This annual report from Open Philanthropy, one of the largest funders of AI safety research, provides insight into institutional priorities, funding strategy, and resource allocation across AI safety and other cause areas as of 2023–2024.

Forum Post Details

Karma
139
Comments
24
Forum
eaforum
Forum Tags
Building effective altruism · Coefficient Giving · Organization updates · Good things & impact stories · Postmortems & retrospectives · Announcements and updates

Metadata

Importance: 52/100 · organizational report · primary source

Summary

Open Philanthropy's 2023 annual report details major organizational scaling—doubling staff to ~110 and giving over $750M—while describing impact across global health, AI safety, animal welfare, and policy domains. The report candidly addresses significant challenges including a 50% asset decline, FTX Future Fund collapse fallout, and leadership transition. It provides insight into how one of EA's largest funders is prioritizing and resourcing AI safety work relative to other cause areas.

Key Points

  • Open Philanthropy nearly doubled annual giving to over $750M in 2023, expanding to five new program areas while doubling staff to ~110.
  • AI safety contributions included funding the Center for AI Safety's extinction risk statement and supporting policy awareness efforts.
  • The organization navigated a 50% asset decline and recovery, plus funding gaps left by the FTX Future Fund collapse.
  • Co-founder Holden Karnofsky departed day-to-day operations to focus personally on AI safety, signaling the cause area's growing priority.
  • Report reflects Open Philanthropy's strategic balance between near-term causes (global health, animal welfare) and long-term risks (AI safety, biosecurity).

Cited by 1 page

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 28 KB
Open Philanthropy: Our Progress in 2023 and Plans for 2024 — EA Forum 
 

 by Alexander_Berger · Mar 27, 2024 · 3 min read · 24 comments · 139 karma

 This is a linkpost for https://www.openphilanthropy.org/research/our-progress-in-2023-and-plans-for-2024/

 Like many organizations, Open Philanthropy has had multiple founding moments. Depending on how you count, we will be either seven, ten, or thirteen years old this year. Regardless of when you start the clock, it's possible that we've changed more in the last two years than over our full prior history. We've more than doubled the size of our team (to ~110), nearly doubled our annual giving (to >$750M), and added five new program areas.

 As our track record and volume of giving have grown, we are seeing more of our impact in the world. Across our focus areas, our funding played a (sometimes modest) role in some of 2023’s most important developments:

 We were among the supporters of the clinical trials that led to the World Health Organization (WHO) officially recommending the R21 malaria vaccine. This is the second malaria vaccine recommended by WHO, which expects it to enable “sufficient vaccine supply to benefit all children living in areas where malaria is a public health risk.” Although the late-stage clinical trial funding was Open Philanthropy’s first involvement with R21 research, that isn’t the case for our new global health R&D program officer, Katharine Collins, who invented R21 as a grad student.
 Our early commitment to AI safety has contributed to increased awareness of the associated risks and to early steps to reduce them. The Center for AI Safety, one of our AI grantees, made headlines across the globe with its statement calling for AI extinction risk to be a “global priority alongside other societal-scale risks,” signed by many of the world’s leading AI researchers and experts. Other grantees contributed to many of the year’s other big AI policy events, including the UK’s AI Safety Summit, the US executive order on AI, and the first International Dialogue on AI Safety, which brought together scientists from the US and China to lay the foundations for future cooperation on AI risk (à la the Pugwash Conferences in support of nuclear disarmament).
 The US Supreme Court upheld California’s Proposition 12, the nation’s strongest farm animal welfare law. We were major supporters of the original initiative and helped fund its successful legal defense.
 Our grantees in the YIMBY (“yes in my backyard”) movement — which works to increase the supply of housing in order to lower prices and rents — helped drive major middle housing reforms in Washington state and

... (truncated, 28 KB total)
Resource ID: ed738bca353bdb0c | Stable ID: OTlmZjFhMT