Longterm Wiki

Centre for Long-Term Resilience - Official Website

web

Credibility Rating

3/5 — Good

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Long-Term Resilience

CLTR is a UK policy-focused think tank relevant to AI governance and existential risk; useful for understanding how safety concerns translate into government policy recommendations and institutional reform efforts.

Metadata

Importance: 52/100 · homepage

Summary

The Centre for Long-Term Resilience (CLTR) is a UK-based think tank focused on building resilience to extreme risks, including AI-related risks and biosecurity threats. It works to influence government policy and decision-making to better prepare societies for catastrophic and existential risks. CLTR engages with policymakers, publishes research, and advocates for institutional reforms to address long-term and emerging threats.

Key Points

  • UK-based policy think tank focused on extreme and catastrophic risk reduction, including AI safety and biosecurity
  • Works directly with governments and policymakers to embed long-term resilience into institutional decision-making
  • Publishes reports and policy recommendations on AI governance, pandemic preparedness, and systemic risk
  • Advocates for structural reforms to improve how governments identify and respond to emerging global risks
  • Part of the broader ecosystem of organizations working on existential risk reduction through policy channels

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| Centre for Long-Term Resilience | Organization | 63.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 5 KB
![Shape of a globe](https://www.longtermresilience.org/wp-content/themes/cltr/dist/images/hero-home.png)

# Striving for a safe and flourishing world

An independent think tank with a mission to transform global resilience to extreme risks

![](https://www.longtermresilience.org/wp-content/uploads/2024/08/CLTR-Icon-Updated.png)

## Our Vision

Our vision is a safe and flourishing world with high resilience to extreme risks, such as those from pandemics and emerging technologies.

## Our Mission

Our mission is to transform global resilience to extreme risks — both in the UK and internationally. Our core focus areas are AI risk, biological risk, and government risk management.

[Find Out More](https://www.longtermresilience.org/about/)

## We help governments and other institutions transform resilience to extreme risks by:

Helping decision-makers and the wider public to understand extreme risks.

Providing expert advice and red-teaming on policy decisions.

Convening cross-sector conversations and workshops related to extreme risks.

Developing and advocating for policy recommendations and effective risk management frameworks and systems.

Providing an exchange for specialist knowledge, including by facilitating expert placements into government.

## Our Latest Work

![](https://www.longtermresilience.org/wp-content/uploads/2024/07/our-latest-work-ai.jpg)

### Artificial Intelligence

Risks stemming from improper application, unintended behaviours of AI systems in critical domains, and the broader socioeconomic impacts of AI on both the economy and society.

[Explore AI](https://www.longtermresilience.org/ai/)

### How the UK Government can govern the risk of loss of control

* * *

Feb 3, 2026

[Find Out More](https://www.longtermresilience.org/reports/how-the-uk-government-can-govern-the-risk-of-loss-of-control/ "How the UK Government can govern the risk of loss of control")

### The Loss of Control Observatory: a prototype to detect real-world AI control incidents

* * *

CLTR is developing a new methodology to systematically detect and analyse concerning autonomous behaviours, as part of a broader programme of work on …

Feb 2, 2026

[Find Out More](https://www.longtermresilience.org/reports/the-loss-of-control-observatory-a-prototype-to-detect-real-world-ai-control-incidents/ "The Loss of Control Observatory: a prototype to detect real-world AI control incidents")

![](https://www.longtermresilience.org/wp-content/uploads/2024/07/our-latest-work-biosecurity.jpg)

### Biosecurity

Risks arising from natural pandemics, laboratory leaks, bioweapons, and ‘dual-use’ research — advancements with the potential for both beneficial and harmful applications.

[Explore Biosecurity](https://www.longtermresilience.org/biosecurity/)

### Informing the European Biotech Act and Inclusions of our Recommendations

* * *

In November 2025, the Biosecurity Policy Unit submitted written evidence to the European Union (EU)’s Public Consultation informing the European …

... (truncated, 5 KB total)
Resource ID: 9f66f85c28a27fc3 | Stable ID: YmM1ZmVkN2