Support for Career Transition Plans — Open Philanthropy
Credibility Rating
4/5
High (4): High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: Coefficient Giving
This grant page is relevant to those studying how philanthropy supports AI safety field-building and talent development, particularly via career transitions from adjacent disciplines.
Metadata
Importance: 30/100 · organizational report · reference
Summary
This Open Philanthropy grant page documents funding provided to support individuals transitioning into AI safety or biosecurity careers. It reflects Open Philanthropy's strategy of building talent pipelines in high-priority cause areas by helping researchers and professionals shift focus toward existential risk reduction.
Key Points
- Open Philanthropy funds career transition plans to grow talent in AI safety and related fields.
- Reflects a talent-building strategy aimed at addressing bottlenecks in high-priority cause areas.
- Supports individuals moving from adjacent fields into direct work on existential risks.
- Part of a broader philanthropic approach to field-building and human capital development in AI safety.
- Demonstrates philanthropic investment in ecosystem development beyond direct technical research.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| EA Institutions' Response to the FTX Collapse | -- | 53.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 15, 2026 · 7 KB
Global Catastrophic Risks Opportunities | Coefficient Giving
Global Catastrophic Risks Opportunities
We support work on reducing global catastrophic risks that falls outside our dedicated AI and biosecurity funds.
250+ grants made
$400+ million given
Contents
About the Fund
Funding Opportunities
Featured Grants
About the Fund
Team
Eli Rose
Program Director, Global Catastrophic Risks Capacity Building
Partners
Good Ventures
Interested in providing funding within this space? Reach out to partnerwithus@coefficientgiving.org.
A global catastrophic risk (GCR) is one that could cause severe or even irreversible harm to humans on a global scale, like a pandemic that collapses the world economy. And as technology advances, new GCRs could emerge from fields like advanced AI or bioengineering; we primarily focus on these two areas.
Despite the stakes, relatively few people research GCRs or work full-time to prevent them, and the work receives relatively little philanthropic and institutional funding.
Coefficient’s work on global catastrophic risks largely falls under two other funds: Navigating Transformative AI and Biosecurity & Pandemic Preparedness. This fund complements those efforts by supporting cross-cutting and foundational work, including projects that strengthen the broader ecosystem of people and organizations addressing global catastrophic risks.
Our grantmaking is focused on:
Academic research and research fellowships to deepen understanding of these issues and of potential interventions.
Field-building, coordination, and community support, including projects that strengthen the effective altruism movement and other networks directing talent and resources toward mitigating GCRs.
Career development and transition support for students and professionals seeking to work on reducing GCRs.
Funding Opportunities
Request for Proposals
Funding for Programs and Events
This funding supports programs and events in a variety of areas related to global catastrophic risks, including scholarship or fellowship programs, internships, residencies, visitor programs, courses, seminars, conferences, workshops, and retreats.
Learn more and apply
Request for Proposals
Career Development and Transition Funding
This program provides support — in the form of funding for graduate study, unpaid internships, independent study, career transition and exploration periods, and other activities — for individuals at any career stage who want to pursue careers t
... (truncated, 7 KB total)
Resource ID: 11ad80854a242b73 | Stable ID: YzI0ODY5Nj