Longterm Wiki

Credibility Rating

3/5
Good(3)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: EA Forum

ARC is a key alignment research org whose evals work has influenced major AI lab policies; this topic page is useful for tracking community discussion around their research and impact.

Metadata

Importance: 55/100 · wiki page · reference

Summary

The Alignment Research Center (ARC) is a non-profit AI alignment research organization founded in 2021 by Paul Christiano, focusing on alignment theory, AI safety evaluations, and responsible scaling policies. This EA Forum topic page aggregates community discussion and posts related to ARC's work and research directions.

Key Points

  • ARC was founded in 2021 by Paul Christiano, a prominent AI alignment researcher formerly at OpenAI.
  • The organization received over $260,000 in Open Philanthropy funding as of July 2022.
  • ARC's work spans alignment theory, AI safety evaluations, and responsible scaling policies.
  • ARC has been involved in third-party model evaluations for dangerous capabilities, influencing frontier AI deployment decisions.
  • This EA Forum topic page serves as a hub for community discussion of ARC's research outputs and organizational updates.

Cited by 1 page

Page | Type | Quality
Model Organisms of Misalignment | Analysis | 65.0

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 2 KB
Alignment Research Center - EA Forum 
 
Alignment Research Center

Contributors: Pablo, Leo

The Alignment Research Center (ARC) is a non-profit research organization focused on AI alignment. It was founded in 2021 by Paul Christiano. [1]

 Funding 

As of July 2022, ARC has received over $260,000 in funding from Open Philanthropy. [2]

 ... (Read more) 

 Posts tagged Alignment Research Center

Top Relevance

  • 176 · 2021 AI Alignment Literature Review and Charity Comparison · Larks · 4y ago · 87 m read
  • 130 · What I would do if I wasn’t at ARC Evals · Lawrence Chan · 3y ago · 16 m read
  • 89 · ARC is hiring alignment theory researchers · Paul_Christiano, Mark Xu · 4y ago · 2 m read
  • 78 · ARC is hiring theoretical researchers · Jacob_Hilton, Paul_Christiano, Mark Xu · 3y ago · 5 m read
  • 28 · Safety evaluations and standards for AI | Beth Barnes | EAG Bay Area 23 · Beth Barnes · 3y ago · 20 m read
  • 16 · Christiano (ARC) and GA (Conjecture) Discuss Alignment Cruxes · Andrea_Miotti · 3y ago
  • 16 · ARC Evals: Responsible Scaling Policies · Zach Stein-Perlman · 2y ago
  • 8 · A Neglected Alignment Strategy: Decision-Theoretic Self-Alignment via Simulation Uncertainty · Mental Maths Mentor · 2mo ago · 2 m read
Resource ID: d4ba07bca55cb4f3 | Stable ID: ZjEwZDUxND