The Alignment Project
Credibility Rating: Government
High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: UK Government
This is the official homepage for a significant government-backed alignment funding program; useful for researchers seeking grants and for tracking the scale and priorities of state-level AI safety investment as of 2025.
Metadata
Summary
The Alignment Project is a major international funding initiative launched in 2025 by the UK AI Security Institute and a broad coalition of government, industry, and philanthropic partners to accelerate AI alignment research. In its first round, it awarded over £27 million to 60+ projects, with grants up to £1 million supplemented by compute access, expert support, and potential venture capital. The initiative aims to ensure advanced AI systems remain safe, reliable, and under human control.
Key Points
- Launched in 2025 by AISI with partners including OpenAI, Anthropic, Microsoft, AWS, CIFAR, and multiple national AI safety institutes across the UK, Canada, and Australia.
- First round awarded £27M+ to 60+ projects; grants range from £50,000 to £1,000,000, with potential for higher-value projects in future rounds.
- Recipients receive not just funding but compute credits, venture capital introductions, and dedicated support from AISI's alignment research team.
- Applications are currently closed and expected to reopen in summer 2026, signaling an ongoing multi-round commitment to alignment funding.
- Covers interdisciplinary research spanning computer science, cognitive science, and other fields relevant to making AI systems behave as intended.
Cited by 2 pages
| Page | Type | Quality |
|---|---|---|
| UK AI Safety Institute | Organization | 52.0 |
| AI Control | Research Area | 75.0 |
Cached Content Preview
The Alignment Project by AISI — The AI Security Institute
Funding groundbreaking AI alignment research to keep advanced AI systems safe, secure and beneficial to society.
Note: Applications for the Alignment Project are currently closed. They are expected to reopen in summer 2026. Please check back or email aisialignmentproject@dsit.gov.uk to receive updates.
Overview
An international, cross-sector coalition offering funding of up to £1 million to advance the field of alignment.
Transformative AI has the potential to deliver unprecedented benefits to humanity, from medical breakthroughs and sustainable energy to solving the global housing crisis. But this future depends on ensuring powerful AI systems reliably act as we intend them to, without unintended or harmful behaviours. Without advances in alignment research, future systems risk operating in ways we cannot fully understand or control, with profound implications for global safety and security.
The Alignment Project is a global fund launched in 2025. In its first round, over £27 million was awarded to more than 60 projects dedicated to accelerating progress in AI alignment research. Our aim is to promote the development of advanced AI systems that are safe, reliable, and beneficial to society. We provide funding of up to £1 million to researchers across disciplines.
Who are we?
The Alignment Project is supported by an international coalition of government, industry, and philanthropic funders — including the UK AI Security Institute, Canadian AI Safety Institute (CAISI), Canadian Institute for Advanced Research (CIFAR), Australian Department of Industry, Science and Resources’ AI Safety Institute, OpenAI, Microsoft, Amazon Web Services (AWS), Schmidt Sciences, Anthropic, the AI Safety Tactical Opportunities Fund, Halcyon Futures, the Safe AI Fund, Sympatico Ventures, Renaissance Philanthropy, UK Research and Innovation, and the Advanced Research and Invention Agency (ARIA) — and a world-leading expert advisory board.
By fostering interdisciplinary collaboration, providing financial support and dedicated compute resources, we are tackling one of AI’s most urgent problems: developing AI systems that are beneficial, reliable and remain under human control at every step.
In 2025, the Alignment Project awarded grants ranging from £50,000 to £1,000,000.
The coalition of funders may consider higher-value projects in the future. Funding recipients receive:
Compute access:
The Alignment Project can offer dedicated cloud computing and API credits, enabling technical experiments beyond typical academic reach.
Venture capital:
Investment from private funders to accelerate commercial alignment solutions.
Support from leading experts:
AISI's Alignment team can provide dedic
... (truncated, 4 KB total)