Updated 2026-02-01

CSER (Centre for the Study of Existential Risk)


CSER is a Cambridge-based existential risk research centre founded in 2012, supported by a $1 million Future of Life Institute grant (2023) and other philanthropic sources. It produced 24 publications in 2022 across AI safety, biosecurity, climate catastrophe, and nuclear risk. The centre has advised the UN, WHO, and multiple governments on pandemic preparedness and AI governance, though measuring the actual risk reduction achieved by academic research remains difficult.

Type: Academic
Website: cser.ac.uk
Related people: Huw Price, Martin Rees, Jaan Tallinn, Seán Ó hÉigeartaigh

Quick Assessment

  • Founded: 2012
  • Location: University of Cambridge, UK
  • Key focus: Existential risks from AI, biotechnology, climate, and nuclear threats
  • Notable funding: $1 million from the Future of Life Institute (2023); over $200,000 from the Survival and Flourishing Fund
  • Key publications: "The Malicious Use of Artificial Intelligence" (2018); "Climate Endgame" in PNAS (2022)
  • Leadership: Professor Matthew Connelly (Director, from 2023); Dr. Seán Ó hÉigeartaigh (Executive Director)

Overview

The Centre for the Study of Existential Risk (CSER) is an interdisciplinary research centre within the University of Cambridge that studies threats capable of causing human extinction or civilizational collapse. Founded in 2012 by philosopher Huw Price, cosmologist Lord Martin Rees, and Skype co-founder Jaan Tallinn, CSER represents one of the first major academic institutions dedicated to existential risk research.[1][2]

CSER's research spans four primary domains: risks from artificial intelligence, extreme technological risks, global catastrophic biological risks, and extreme environmental risks including climate change.[3] The centre operates through a three-pillar strategy focused on advancing understanding of existential risks through rigorous research, developing collaborative mitigation strategies, and building a global field of researchers, technologists, and policymakers committed to addressing these challenges.[4]

Hosted within Cambridge's Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), CSER has produced influential research including publications in Nature and other top-tier journals, organized major conferences on catastrophic risk, and advised governments and international organizations including the UN, WHO, and OECD on pandemic preparedness, nuclear risks, and AI governance.[5][6]

History

Founding and Early Years (2012-2015)

CSER was established in 2012 through an unusual collaboration between a philosopher (Huw Price, Bertrand Russell Professor of Philosophy at Cambridge), a scientist (Lord Martin Rees, Astronomer Royal and former President of the Royal Society), and a software entrepreneur (Jaan Tallinn, co-founder of Skype and early investor in Anthropic).[7][8] The founders were motivated by concerns that advancing technologies—particularly artificial intelligence, biotechnology, nanotechnology, and anthropogenic climate change—posed extinction-level risks that were comparatively neglected in academia.[9]

Jaan Tallinn, who had begun engaging with the existential risk community in 2009, provided seed funding for the centre's establishment.[10] In its early years (circa 2013), CSER submitted ambitious grant applications including a proposal to the European Research Council for a "New Science of Existential Risk" five-year program, which was highly ranked but ultimately not funded.[11]

Expansion and Recognition (2015-2020)

By 2015, CSER had secured initial time-limited grants primarily focused on philosophy and social science research, funding operations through mid-2018.[12] The centre began building expertise to support future science, technology, and AI safety grant applications. During this period, CSER developed the TERRA bibliography tool for existential risk publications and began organizing academic conferences linking decision theory and AI safety starting in 2017.[13]

In 2018, CSER achieved significant recognition with two major publications: The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation (co-authored with tech companies and security think-tanks) and An AI Race for Strategic Advantage: Rhetoric and Risks, which won the Best Paper award at the AAAI/ACM AI Ethics and Society conference.[14] The centre also established the UK's first All-Party Parliamentary Group for Future Generations during this period.[15]

Recent Developments (2020-Present)

In April 2020, Dr. Catherine Rhodes took on the role of Executive Director, with Dr. Seán Ó hÉigeartaigh serving as Co-Director.[16] The centre's research output accelerated significantly: in 2022 alone, CSER produced 24 publications including papers in Nature, Nature Sustainability, and the Proceedings of the National Academy of Sciences. Notable works included "Climate Endgame: Exploring catastrophic climate change scenarios" and "Huge volcanic eruptions: time to prepare," both of which received extensive media coverage.[17]

In July 2023, the Future of Life Institute granted $1 million to the University of Cambridge specifically for CSER, enabling funding for a full five-year position for Professor Matthew Connelly as the centre's new Director.[18] This was complemented by a multi-million-dollar endowment from Carl Feinberg (via Cambridge in America) establishing the Rees Feinberg Professor of Global Risk, aimed at supporting CSER's long-term expansion and permanence.[19]

Recent activities include the Cambridge Conference on Catastrophic Risk (September 2024), which brought together researchers, diplomats, UN representatives, and government officials to discuss emerging risks including biological and technological threats, space warfare, and systemic resilience.[20]

Research Areas and Activities

Artificial Intelligence Safety

CSER has been actively engaged in AI safety research since its founding. The centre organized a series of academic conferences on decision theory and AI safety beginning in 2017, exploring the theoretical foundations necessary for developing safe artificial intelligence systems.[21]

The centre's 2018 report The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation, produced in collaboration with technology companies and security think-tanks, examined how AI could be weaponized for physical attacks, digital security threats, and political disruption.[22] This work helped establish frameworks for understanding dual-use AI risks that continue to inform policy discussions.

CSER researchers have advised multiple governments and international organizations on AI and AGI governance, including consultations with the UN Secretary-General, UK government, EU, and US agencies.[23] The centre's work emphasizes rigorous, multidisciplinary approaches to AI safety that bridge technical research, policy analysis, and ethical considerations.

Biological and Pandemic Risks

Managing extreme technological risks, particularly in biotechnology, represents a core research area for CSER. The centre has conducted horizon-scanning work for the Biological Weapons Convention (BWC), identifying emerging biotechnologies that could pose catastrophic risks.[24] CSER researchers emphasize that governance bodies like the BWC struggle to keep pace with rapid advances in synthetic biology and gene editing technologies.[25]

During and after the COVID-19 pandemic, CSER significantly expanded its work on pandemic preparedness and biological catastrophic risks. In 2022, the centre advised the World Health Organization (WHO), governments, and international bodies on building resilience against future pandemics.[26] This work addresses both naturally occurring pandemic threats and the potential for engineered biological weapons.

Climate and Environmental Risks

CSER's 2022 paper "Climate Endgame: Exploring catastrophic climate change scenarios," published in the Proceedings of the National Academy of Sciences, examines worst-case climate scenarios that could lead to civilizational collapse or human extinction.[27] This research challenged the field to take seriously the tail risks of climate change beyond conventional projections.

The centre also published research on huge volcanic eruptions and the need to prepare for low-probability, high-impact geological events.[28] CSER's environmental work is supported by funding from the Grantham Foundation for the Protection of the Environment.[29]

Martin Rees, CSER's co-founder, co-organized workshops with the Vatican that influenced the 2015 Papal Encyclical on Climate Change and contributed to momentum for the Paris Agreement.[30]

Nuclear Risks and Conflict

CSER researchers have provided analysis and policy advice on nuclear risks, particularly following Russia's invasion of Ukraine. The centre has advised governments on nuclear security, deterrence stability, and the intersection of emerging technologies with nuclear risks.[31]

Funding and Institutional Support

CSER has developed a diverse funding base combining philanthropic support, institutional grants, and university integration. Major funding sources include:[32][33]

  • Future of Life Institute: $1,000,000 (2023) supporting the Director position and operations
  • Survival and Flourishing Fund: Over $200,000 (as of June 2022)
  • Templeton World Charity Foundation: Major research projects including the Managing Extreme Technological Risks program
  • Grantham Foundation for the Protection of the Environment: Environmental risk research
  • Hauser-Raspe Foundation, Blavatnik Foundation (public lectures), Libra Foundation, Musk Foundation, Milner Foundation: Additional project support
  • Carl Feinberg (via Cambridge in America): Multi-million-dollar endowment for the Rees Feinberg Professor of Global Risk

Jaan Tallinn provided critical seed funding in 2012 that enabled CSER's establishment.[34]

Academic Partnerships and Collaborations

CSER has established multiple international academic partnerships to expand existential risk research capacity. In 2018, the centre signed a Memorandum of Understanding with the Graduate School of Advanced Integrated Studies in Human Survivability (GSAIS) at Kyoto University, extended in 2023. This partnership supports joint research funding applications, regular workshops, faculty and student exchanges, and collaborative publications on topics including cascading natural risks and space technology in risk mitigation.[35]

CSER has advised on the establishment of global risk research programs at Australian National University, University of California Los Angeles (UCLA), and the University of Warwick.[36] The centre has organized two international Cambridge Conferences on Catastrophic Risk and over 30 specialized workshops on topics including cybersecurity, nuclear security, climate change, and gene drives.[37]

Community Engagement and Public Impact

CSER maintains an active public engagement program including the CSER Public Lectures series (supported by the Blavatnik Foundation), which has been viewed over 500,000 times online.[38] The centre's media engagement has been extensive, with research findings regularly covered by major international outlets.

Within the effective altruism and existential risk research communities, CSER is viewed as a key academic institution lending rigor and legitimacy to the field. The centre has a dedicated topic page on the EA Forum, where community members discuss CSER's work and its alignment with effective altruism priorities.[39]

In March 2025, CSER hosted an "Exploring Careers in Existential Risk" event featuring speakers from 80,000 Hours and ERA Fellowship, connecting students and early-career researchers with opportunities in the field.[40] The centre has also developed self-guided educational trails and contributed researchers as Lead Authors to the IPCC's Seventh Assessment Report.[41]

Leadership and Key Personnel

Founders

  • Huw Price: Bertrand Russell Professor of Philosophy at Cambridge; Academic Director
  • Lord Martin Rees: Astronomer Royal, former President of the Royal Society, Emeritus Professor of Cosmology and Astrophysics
  • Jaan Tallinn: Co-founder of Skype, early investor in Anthropic; provided seed funding

Current Leadership

  • Professor Matthew Connelly: Director (from July 2023), supported by five-year position funded by Future of Life Institute
  • Dr. Seán Ó hÉigeartaigh: Executive Director/Co-Director, manages CSER within CRASSH
  • Dr. Catherine Rhodes: Former Executive Director (appointed April 2020); Academic Project Manager

Notable Researchers

  • SJ Beard: Senior Research Associate; works on ethics of extinction, extreme event methodologies, decision-maker constraints, and existential hope
  • Dr. Charlotte Hammer: Assistant Professor in Global Risk and Resilience
  • Dr. Luke Kemp: Research Associate; focuses on global catastrophic risks and civilizational collapse
  • Partha Dasgupta: Senior Advisor; co-organized Vatican workshops on climate and extinction

Criticisms and Concerns

Available sources contain no direct criticisms or controversies targeting CSER specifically. However, some broader concerns affecting existential risk research are relevant to understanding CSER's context:

Field Immaturity and Methodological Challenges

Existential risk studies remains a young, interdisciplinary subfield still developing consensus methodologies.[42] CSER researchers acknowledge that measuring extreme tail risks and validating models for unprecedented catastrophic events presents fundamental epistemic challenges. The centre's TERRA bibliography project revealed data limitations in tracking existential risk research productivity, including undercounting of publications and incomplete capture of non-academic work.[43]

Gender Representation

Research analyzing the broader existential risk field (including CSER-associated publications) has identified underrepresentation of women researchers, aligning with patterns observed in effective altruism-adjacent communities.[44] While not specific to CSER, this demographic pattern affects the diversity of perspectives in the field.

Governance and Urgency Gaps

CSER researchers themselves identify significant barriers to effective existential risk mitigation, including short-termism in government decision-making, institutional lag (exemplified by UN Security Council ineffectiveness), and lack of representation for future generations in policy processes.[45] These structural challenges constrain the real-world impact of academic research, regardless of its quality.

CSER emphasizes that managing extreme technological risks remains "urgent" but "comparatively neglected" in academia, with governance institutions struggling to keep pace with technological advances in areas like synthetic biology.[46]

Key Uncertainties

  1. Impact measurement: While CSER has produced numerous publications and policy engagements, measuring the actual reduction in existential risk resulting from this work remains extremely difficult. How effectively does academic research translate into policy changes that meaningfully reduce catastrophic risks?

  2. Research prioritization: With limited resources and multiple potential existential threats, how should CSER balance attention across AI risks, biological threats, climate scenarios, and other catastrophic risks? Are current priorities optimally calibrated to actual risk levels?

  3. Institutional growth trajectory: With new endowed positions and expanded funding, can CSER scale its research capacity while maintaining quality and interdisciplinary rigor? What is the optimal size for an existential risk research centre?

  4. Policy influence mechanisms: What are the most effective pathways for translating existential risk research into policy action? Should CSER prioritize direct government advising, public engagement, training future policymakers, or other approaches?

  5. Collaboration vs. competition: As the existential risk research field grows with multiple institutions now active, how should CSER balance collaborative field-building with maintaining its distinctive research identity and institutional competitiveness for funding?

Sources

Footnotes

  1. Citation rc-b85d

  2. Citation rc-5331

  3. Citation rc-c37b

  4. Citation rc-12f6

  5. Citation rc-af02

  6. Citation rc-5c26

  7. Citation rc-df42

  8. Citation rc-d1b7

  9. Citation rc-63c8

  10. Citation rc-fa6c

  11. Citation rc-ab69

  12. Citation rc-27cc

  13. Citation rc-d7e3

  14. Citation rc-62f6

  15. Citation rc-19bc

  16. Citation rc-89ae

  17. Citation rc-e8ea

  18. Citation rc-ada2

  19. Citation rc-dd50

  20. Citation rc-0c3f

  21. Citation rc-f7ec

  22. Citation rc-4755

  23. Citation rc-6f4d

  24. Citation rc-91a5

  25. Citation rc-e100

  26. Citation rc-e76b

  27. Citation rc-e4ce

  28. Citation rc-834a

  29. Citation rc-8bff

  30. Citation rc-cbde

  31. Citation rc-969d

  32. FLI Grant 2023

  33. CSER Major Supporters

  34. Citation rc-346f

  35. Citation rc-3a89

  36. Citation rc-7fd8

  37. Citation rc-86f9

  38. Citation rc-fc3a

  39. Citation rc-ed19

  40. Citation rc-e2cc

  41. Citation rc-42d2

  42. Citation rc-cf4b

  43. Citation rc-f241

  44. Citation rc-3cd8

  45. Citation rc-8b3d

  46. Citation rc-cb34

References

1. CSER Conferences and Workshops (Centre for the Study of Existential Risk)

This page lists public events, conferences, and workshops organized by the Centre for the Study of Existential Risk (CSER) at the University of Cambridge. CSER focuses on risks from advanced technologies including AI, biotechnology, and other global catastrophic risks. The events page serves as a hub for academic and policy discussions on existential and catastrophic risk reduction.

2. Long Problems Lecture 2025 (Centre for the Study of Existential Risk)

The Long Problems Lecture Series is an annual public lecture event organized by the Centre for the Study of Existential Risk (CSER) at the University of Cambridge, focusing on humanity's most pressing long-term challenges including existential and catastrophic risks. The series brings together researchers and thought leaders to discuss systemic risks that operate on multi-generational timescales. It serves as a public engagement and outreach forum for CSER's research agenda.

3. CSER Researchers Join IPCC (Centre for the Study of Existential Risk)

This news item from the Centre for the Study of Existential Risk (CSER) announces that CSER researchers have been appointed as contributors to the IPCC's Seventh Assessment Report (AR7). This represents CSER's involvement in shaping international climate science policy at the highest levels, potentially integrating existential and catastrophic risk perspectives into mainstream climate assessments.


4. FLI Grant 2023 (Future of Life Institute; link now returns 404)

This page returns a 404 error, indicating the grant announcement for the Centre for the Study of Existential Risk (CSER) from the Future of Life Institute is no longer available at this URL. The resource was intended to document FLI funding support for CSER, a Cambridge-based existential risk research organization.

5. CSER Pandemic Advisory 2022 (Centre for the Study of Existential Risk)

A policy advisory from the Centre for the Study of Existential Risk (CSER) addressing pandemic preparedness and biosecurity risks in 2022. The advisory likely draws on lessons from COVID-19 to recommend improvements in global pandemic governance and response infrastructure to mitigate future catastrophic biological risks.


6. Rees Feinberg Professorship Endowment (Cambridge in America)

This page announces or describes the Rees Feinberg Professorship endowment at the University of Cambridge, established through Cambridge in America fundraising. The professorship likely supports academic research in a specific field, potentially including AI safety or related disciplines, funded through philanthropic giving.

7. TERRA Database Limitations (Centre for the Study of Existential Risk)

This page from the Centre for the Study of Existential Risk (CSER) documents the known limitations and caveats of the TERRA (Tracking Existential Risk Research and Analysis) database, which catalogues existential and global catastrophic risk research. It provides transparency about gaps, biases, and methodological constraints users should be aware of when using the database.

8. Climate Endgame Paper (PNAS, peer-reviewed)

This PNAS paper examines severely underexplored catastrophic and existential risks from climate change, arguing that worst-case scenarios including societal collapse and human extinction deserve serious scientific attention. The authors call for a dedicated research agenda on 'Climate Endgame' scenarios involving 3°C or more of warming, cascading risks, and interactions with other global stressors. It parallels existential risk frameworks common in AI safety discourse.

9. CSER BWC Horizon Scanning (Centre for the Study of Existential Risk)

The Centre for the Study of Existential Risk (CSER) conducts horizon scanning research on biosecurity threats, particularly in the context of the Biological Weapons Convention (BWC). This work identifies emerging biological risks and informs international policy frameworks aimed at preventing catastrophic biological events. The research bridges technical threat assessment with governance recommendations for policymakers.

10. CSER-GSAIS Partnership MoU (Centre for the Study of Existential Risk)

Announcement of a formal partnership between the Centre for the Study of Existential Risk (CSER) at Cambridge and the Graduate School of Advanced Integrated Studies in Human Survivability (GSAIS) at Kyoto University, formalized through a Memorandum of Understanding. The partnership aims to advance existential risk research and education through institutional collaboration. It represents a step toward building international academic networks focused on existential risk and AI safety.

11. Volcanic Eruptions Paper (Nature, peer-reviewed)

The requested Nature article is no longer available, returning a 404 page not found error. The original content cannot be retrieved or assessed. No meaningful summary can be provided.

12. CSER Leadership Changes 2020 (Centre for the Study of Existential Risk)

An announcement from the Centre for the Study of Existential Risk (CSER) regarding leadership transitions in 2020. The page documents organizational changes in the leadership team of one of the prominent existential risk research centers based at the University of Cambridge.


13. Gender Representation in Existential Risk Research (link now returns 404)

This resource appears to be unavailable (404 error), so its content cannot be assessed. It likely discussed gender diversity and representation within existential risk research communities based on the title.

14. The Malicious Use of AI Report (maliciousaireport.com)

A report examining how AI technologies can be exploited by malicious actors across cybersecurity, physical security, and political domains. It analyzes near-term threats from AI misuse and offers recommendations for researchers, policymakers, and industry to mitigate these risks.

15. CSER 2022 Activities Report (Centre for the Study of Existential Risk)

The Centre for the Study of Existential Risk (CSER) at the University of Cambridge presents its 2022 activities report, summarizing research outputs, policy engagements, and organizational developments. The report highlights CSER's interdisciplinary work on existential and catastrophic risks including AI safety, biosecurity, and environmental risks. It provides an overview of publications, events, collaborations, and funding during the year.


16. 80,000 Hours Podcast Episode with Jaan Tallinn (link now returns 404)

This resource appears to be a dead link to an 80,000 Hours podcast episode featuring Jaan Tallinn discussing AI risks and existential risk. The page no longer exists at the given URL, so content cannot be verified or summarized.

17. Existential Risk Studies Field Review (Centre for the Study of Existential Risk)

A comprehensive review of the existential risk studies field produced by the Centre for the Study of Existential Risk (CSER) at Cambridge. It maps the landscape of research, identifies key themes, methodologies, and gaps in the field, and situates AI risk within the broader existential risk ecosystem.

18. Managing Extreme Technological Risks (Centre for the Study of Existential Risk)

This is the Centre for the Study of Existential Risk (CSER) research page focused on understanding and managing risks from emerging and potentially catastrophic technologies. CSER's work in this area examines how advanced technologies—including AI, biotechnology, and others—could pose existential or civilizational-scale threats, and what governance and policy frameworks might mitigate them.

★★★★☆
Claims (2)
The centre has conducted horizon-scanning work for the Biological Weapons Convention (BWC), identifying emerging biotechnologies that could pose catastrophic risks. CSER researchers emphasize that governance bodies like the BWC struggle to keep pace with rapid advances in synthetic biology and gene editing technologies.
CSER emphasizes that managing extreme technological risks remains "urgent" but "comparatively neglected" in academia, with governance institutions struggling to keep pace with technological advances in areas like synthetic biology.
19. CSER Funding History 2015–2018 · Centre for the Study of Existential Risk

This page documents the funding sources and financial history of CSER, a Cambridge-based research centre focused on existential and catastrophic risks from advanced technologies including AI. It provides transparency into the organization's financial backers and grant history during its early operational years.

★★★★☆
Claims (3)
By 2015, CSER had secured initial time-limited grants primarily focused on philosophy and social science research, funding operations through mid-2018. The centre began building expertise to support future science, technology, and AI safety grant applications.
The centre also published research on huge volcanic eruptions and the need to prepare for low-probability, high-impact geological events. CSER's environmental work is supported by funding from the Grantham Foundation for the Protection of the Environment.
Major funding sources include:
20. CSER Overview – Cambridge · Centre for the Study of Existential Risk

CSER is a multidisciplinary research centre at the University of Cambridge dedicated to studying and mitigating existential and global catastrophic risks, including those from advanced AI, biotechnology, and other emerging technologies. It brings together researchers from natural sciences, social sciences, and humanities to develop strategies for reducing civilisation-scale risks. CSER produces academic research, policy recommendations, and public engagement on long-term risks to humanity.

★★★★☆
Claims (1)
Founded in 2012 by philosopher Huw Price, cosmologist Lord Martin Rees, and Skype co-founder Jaan Tallinn, CSER represents one of the first major academic institutions dedicated to existential risk research.
21. UK All-Party Parliamentary Group for Future Generations · Centre for the Study of Existential Risk

This page from the Centre for the Study of Existential Risk (CSER) covers the UK's All-Party Parliamentary Group for Future Generations, a cross-party parliamentary initiative focused on long-term and intergenerational policymaking. The APPG aims to embed long-termist thinking into UK governance, which has relevance for existential risk reduction and AI safety policy. CSER's involvement reflects its mission to connect academic existential risk research with legislative and policy processes.

★★★★☆
Claims (1)
In 2018, CSER achieved significant recognition with two major publications: The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation (co-authored with tech companies and security think-tanks) and An AI Race for Strategic Advantage: Rhetoric and Risks, which won the Best Paper award at the AAAI/ACM Conference on AI, Ethics, and Society. The centre also established the UK's first All-Party Parliamentary Group for Future Generations during this period.
22. CSER Public Lectures · YouTube · Talk

The YouTube channel for the Centre for the Study of Existential Risk (CSER) at Cambridge University, hosting public lectures and talks on existential and global catastrophic risks including AI safety, biosecurity, nuclear risk, and environmental threats. The channel features leading researchers and thinkers discussing long-term risks to humanity and policy responses. It serves as a public outreach and education platform for one of the world's leading existential risk research institutions.

★★☆☆☆
Claims (1)
CSER maintains an active public engagement program including the CSER Public Lectures series (supported by the Blavatnik Foundation), which has been viewed over 500,000 times online. The centre's media engagement has been extensive, with research findings regularly covered by major international outlets.
23. Vatican Workshop on Climate · Centre for the Study of Existential Risk

This CSER news item covers a Vatican-hosted workshop bringing together scientists, theologians, and policymakers to discuss climate change as a global and existential risk. The event reflects growing interfaith and cross-sector engagement on long-term civilizational threats. It highlights collaboration between academic institutions and religious organizations on catastrophic risk governance.

★★★★☆
Claims (1)
Martin Rees, CSER's co-founder, co-organized workshops with the Vatican that influenced the 2015 Papal Encyclical on Climate Change and contributed to momentum for the Paris Agreement.
24. Jaan Tallinn Profile · Centre for Effective Altruism

This Effective Altruism profile covers Jaan Tallinn, co-founder of Skype and Kazaa, who became a major funder and advocate for AI safety research. It outlines his motivations for prioritizing existential risks from advanced AI and his philanthropic contributions to organizations like MIRI, CSER, and FHI.

★★★☆☆
Claims (1)
CSER was established in 2012 through an unusual collaboration between a philosopher (Huw Price, Bertrand Russell Professor of Philosophy at Cambridge), a scientist (Lord Martin Rees, Astronomer Royal and former President of the Royal Society), and a software entrepreneur (Jaan Tallinn, co-founder of Skype and early investor in Anthropic). The founders were motivated by concerns that advancing technologies—particularly artificial intelligence, biotechnology, nanotechnology, and anthropogenic climate change—posed extinction-level risks that were comparatively neglected in academia.
25. CSER Founding History · Centre for the Study of Existential Risk

This page describes the founding history of CSER, a University of Cambridge research centre established to study and mitigate existential risks from advanced technologies including AI. It documents the origins and institutional development of one of the pioneering academic organizations dedicated to existential risk research.

★★★★☆
Claims (3)
CSER was established in 2012 through an unusual collaboration between a philosopher (Huw Price, Bertrand Russell Professor of Philosophy at Cambridge), a scientist (Lord Martin Rees, Astronomer Royal and former President of the Royal Society), and a software entrepreneur (Jaan Tallinn, co-founder of Skype and early investor in Anthropic). The founders were motivated by concerns that advancing technologies—particularly artificial intelligence, biotechnology, nanotechnology, and anthropogenic climate change—posed extinction-level risks that were comparatively neglected in academia.
Jaan Tallinn, who had begun engaging with the existential risk community in 2009, provided seed funding for the centre's establishment. In its early years (circa 2013), CSER submitted ambitious grant applications including a proposal to the European Research Council for a "New Science of Existential Risk" five-year program, which was highly ranked but ultimately not funded.
Jaan Tallinn provided critical seed funding in 2012 that enabled CSER's establishment.
26. CSER AI Safety Conferences · Centre for the Study of Existential Risk

The Centre for the Study of Existential Risk (CSER) at Cambridge University conducts research on AI safety with a focus on existential and catastrophic risks from advanced AI systems. CSER brings together researchers from multiple disciplines to study and develop governance frameworks and technical safety approaches. Their work spans policy recommendations, risk analysis, and coordination efforts among researchers and institutions.

★★★★☆
Claims (1)
During this period, CSER developed the TERRA bibliography tool for existential risk publications and began organizing academic conferences linking decision theory and AI safety starting in 2017.
27. CSER Nuclear Risk Advisory · Centre for the Study of Existential Risk

The Centre for the Study of Existential Risk (CSER) at Cambridge University conducts research on nuclear risks as part of its broader existential risk mitigation mission. This page outlines CSER's advisory and research work examining how nuclear weapons, nuclear accidents, and related escalation dynamics pose catastrophic and existential threats to humanity.

★★★★☆
Claims (1)
The centre has advised governments on nuclear security, deterrence stability, and the intersection of emerging technologies with nuclear risks.
28. CSER Global Risk Project Advisory · Centre for the Study of Existential Risk

This page outlines the Centre for the Study of Existential Risk's (CSER) partnerships and collaborations with global institutions focused on researching and mitigating extreme risks to humanity. CSER works with academic, governmental, and civil society partners to address risks including AI, biotechnology, and environmental threats. These partnerships support interdisciplinary research and policy engagement on existential and global catastrophic risks.

★★★★☆
Claims (1)
CSER has advised on the establishment of global risk research programs at Australian National University, University of California Los Angeles (UCLA), and the University of Warwick. The centre has organized two international Cambridge Conferences on Catastrophic Risk and over 30 specialized workshops on topics including cybersecurity, nuclear security, climate change, and gene drives.
29. CSER Decision Theory Conferences · Centre for the Study of Existential Risk

This page describes the Centre for the Study of Existential Risk's (CSER) research program on decision theory as it relates to AI safety and existential risk. It covers conferences and workshops exploring how formal decision-theoretic frameworks can inform the development of safe and beneficial AI systems.

★★★★☆
Claims (1)
The centre organized a series of academic conferences on decision theory and AI safety beginning in 2017, exploring the theoretical foundations necessary for developing safe artificial intelligence systems.

30. Centre for the Study of Existential Risk · EA Forum

The Centre for the Study of Existential Risk (CSER) is a University of Cambridge research centre founded in 2012 by Huw Price, Jaan Tallinn, and Lord Martin Rees, dedicated to studying and mitigating existential and global catastrophic risks. This EA Forum topic page aggregates discussions, evaluations, and funding information related to CSER's work within the effective altruism community.

★★★☆☆
Claims (2)
Founded in 2012 by philosopher Huw Price, cosmologist Lord Martin Rees, and Skype co-founder Jaan Tallinn, CSER represents one of the first major academic institutions dedicated to existential risk research.
Verified accurate (100%) · Feb 22, 2026
CSER was founded in 2012 by Huw Price, Jaan Tallinn, and Lord Martin Rees.
The centre has a dedicated topic page on the EA Forum, where community members discuss CSER's work and its alignment with effective altruism priorities.
Verified accurate (100%) · Feb 22, 2026
31. Centre for the Study of Existential Risk (CSER) Research Areas · Centre for the Study of Existential Risk

CSER is a multidisciplinary research centre at the University of Cambridge focused on studying and mitigating risks that could lead to human extinction or civilizational collapse. Their research spans AI safety, biosecurity, extreme climate risks, and global governance of emerging technologies. The page provides an overview of their active research programs and focus areas.

★★★★☆
Claims (1)
CSER's research spans four primary domains: risks from artificial intelligence, extreme technological risks, global catastrophic biological risks, and extreme environmental risks including climate change. The centre operates through a three-pillar strategy focused on advancing understanding of existential risks through rigorous research, developing collaborative mitigation strategies, and building a global field of researchers, technologists, and policymakers committed to addressing these challenges.
32. Cambridge Conference on Catastrophic Risk 2024 · Centre for the Study of Existential Risk

This page covers the 2024 Cambridge Conference on Catastrophic Risk hosted by the Centre for the Study of Existential Risk (CSER), a gathering focused on understanding and mitigating global catastrophic and existential risks. The conference brings together researchers, policymakers, and practitioners to discuss risks including those from advanced AI, biosecurity, and other emerging threats. It serves as a key networking and knowledge-sharing event for the existential risk research community.

★★★★☆
Claims (1)
Recent activities include the Cambridge Conference on Catastrophic Risk (September 2024), which brought together researchers, diplomats, UN representatives, and government officials to discuss emerging risks including biological and technological threats, space warfare, and systemic resilience.
33. CSER Mission Statement · Centre for the Study of Existential Risk

CSER is an interdisciplinary research centre at the University of Cambridge dedicated to studying and mitigating risks that could lead to human extinction or civilizational collapse. The centre focuses on emerging technologies including AI, biotechnology, and environmental risks, combining academic research with policy engagement. Their mission emphasizes both understanding the nature of existential risks and developing actionable strategies to reduce them.

★★★★☆
Claims (1)
CSER was established in 2012 through an unusual collaboration between a philosopher (Huw Price, Bertrand Russell Professor of Philosophy at Cambridge), a scientist (Lord Martin Rees, Astronomer Royal and former President of the Royal Society), and a software entrepreneur (Jaan Tallinn, co-founder of Skype and early investor in Anthropic). The founders were motivated by concerns that advancing technologies—particularly artificial intelligence, biotechnology, nanotechnology, and anthropogenic climate change—posed extinction-level risks that were comparatively neglected in academia.
34. CSER Impact Strategy · Centre for the Study of Existential Risk

This page outlines the Centre for the Study of Existential Risk's (CSER) strategy for translating its research into real-world impact, focusing on how it engages with policymakers, industry, and civil society to reduce global catastrophic and existential risks. It describes CSER's theory of change and mechanisms for influencing decision-makers on issues including AI safety, biosecurity, and other extreme risks.

★★★★☆
Claims (1)
CSER's research spans four primary domains: risks from artificial intelligence, extreme technological risks, global catastrophic biological risks, and extreme environmental risks including climate change. The centre operates through a three-pillar strategy focused on advancing understanding of existential risks through rigorous research, developing collaborative mitigation strategies, and building a global field of researchers, technologists, and policymakers committed to addressing these challenges.
35. Exploring Careers in Existential Risk Event 2025 · Centre for the Study of Existential Risk

A career-focused event hosted by the Centre for the Study of Existential Risk (CSER) aimed at individuals interested in working on existential risk mitigation. The event likely provides guidance on career pathways, networking opportunities, and insight into research and policy roles within the existential risk field.

★★★★☆
Claims (1)
In March 2025, CSER hosted an "Exploring Careers in Existential Risk" event featuring speakers from 80,000 Hours and ERA Fellowship, connecting students and early-career researchers with opportunities in the field. The centre has also developed self-guided educational trails and contributed researchers as Lead Authors to the IPCC's Seventh Assessment Report.
36. CSER Policy Advisory Work · Centre for the Study of Existential Risk

This page describes the Centre for the Study of Existential Risk's (CSER) policy engagement activities, outlining how researchers advise governments, international organizations, and other institutions on managing extreme technological and existential risks. It highlights CSER's role as a bridge between academic research and real-world policy decisions on risks from AI, biotechnology, and other emerging technologies.

★★★★☆
Claims (1)
CSER researchers have advised multiple governments and international organizations on AI and AGI governance, including consultations with the UN Secretary-General, UK government, EU, and US agencies. The centre's work emphasizes rigorous, multidisciplinary approaches to AI safety that bridge technical research, policy analysis, and ethical considerations.
Citation verification: 2 verified, 44 unchecked of 46 total

Structured Data

2 records

Key People

Jaan Tallinn · Co-Founder · 2012–present
Martin Rees · Co-Founder · 2012–present

Related Wiki Pages

Top Related Pages

Analysis

Anthropic (Funder)

Organizations

Survival and Flourishing Fund · Berkeley Existential Risk Initiative · Future of Life Institute · 80,000 Hours · Future of Humanity Institute · Centre for Long-Term Resilience

Concepts

Safety Orgs Overview