Apart Research
Apart Research is a small but active non-profit AI safety organization focused on talent development through hackathons and fellowships, with modest but real research output (22 papers, 2 ICLR 2025 orals).

Quick Assessment

Type: Independent non-profit research organization
Founded: 2022
Focus: AI safety research, talent development, hackathons
Director of Research: Jason Hoelscher-Obermaier
Founder: Esben Kran (Board Member)
CEO: Jaime Raldúa Veuthey
2025 Funding: $619,921
Publications: 22 peer-reviewed papers (including 2 ICLR 2025 orals)
Research Sprint Participants: 3,500+ across 42 sprints
Fellows Supported: 100+
Official Website: apartresearch.com

Overview

Apart Research is an independent non-profit AI safety research and community-building organization. Its stated mission is to produce high-volume frontier technical research enabling the safe and beneficial development of advanced AI, primarily by hosting open-to-all research hackathons (called "sprints"), incubating talented researchers through fellowship and studio programs, publishing empirical work, and mobilizing a global network of participants to work on AI safety problems.1

The organization operates what it describes as a Sprint → Studio → Fellowship talent pipeline. Research sprints function as intensive, open-to-all hackathons generating pilot experiments and early-stage research ideas. Promising projects from sprints are incubated through Apart Studio, a program launched in late 2024 to support teams in developing their ideas into shareable research. The most promising researchers are then supported through Apart Lab fellowships, which are 3–6 month research accelerator programs providing mentorship and management support aimed at producing publishable work in areas including evaluations, interpretability, and alignment.2

According to Apart Research's own reporting, the organization's work has been cited by labs including OpenAI, Anthropic, and the UK AI Safety Institute, and alumni have gone on to roles at organizations including METR, FAR.AI, and the UK AI Security Institute.3 The organization is fiscally sponsored as a US 501(c)(3) non-profit and relies primarily on charitable donations for funding.

History

Apart Research was founded in 2022 by Esben Kran, who currently serves as a Board Member. The organization began with an initial focus on hackathons, speaker events, and early-stage research. In November 2023, Esben Kran gave a talk titled "Let's Get Into AI Safety" at an EAGx virtual event, which attracted new participants to the organization's early evaluations hackathon.4

Throughout 2023, Apart expanded from its initial hackathon-focused model into a broader set of programs encompassing community-building, publishing research, and supporting researchers in developing ideas — including the creation of Apart Studio and Apart Lab fellowships.5

By Apart Research's own account, 2024 was a period of significant organizational growth. The core team tripled in size, the organization raised approximately $700,000 in funding, published a dozen papers (several at major venues), doubled the size of Apart Lab cohorts, launched the Apart Lab Studio program, and expanded its global hackathon sites from Bangalore to Singapore.6 Apart Lab fellowship cohorts grew from 17 fellows and 7 projects in Q1 2024 to 35 fellows and 11 projects by Q3 2024.7

In early 2025, Esben Kran spoke at the AI Action Summit in Paris on engineering approaches to safe superintelligence, and the organization released its first Studio Progress Report covering the inaugural cohort onboarded in late 2024.8 A fundraising campaign in mid-2025 targeting approximately $954,800 concluded successfully by July 2025, securing funding into 2026.9

Programs

Apart Sprints

Apart Sprints are the organization's flagship open-to-all research hackathons, typically running over a weekend and hosted simultaneously across multiple global locations. The sprints engage participants in working on defined AI safety research questions, generating research reports and pilot experiments. According to Apart Research, 42 such sprints have been hosted, engaging over 3,500 participants across 50+ locations in 26+ countries and producing 485 research reports and 450+ pilot experiments.10

Partners for sprint topics have included METR, Anthropic, Apollo Research, the Cooperative AI Foundation, and others. For example, a "Code Red Hackathon" co-hosted with METR engaged 168 participants and generated 230 evaluation ideas, 108 specifications, and 28 implementations, with 10 participants subsequently joining Apart Lab as fellows.11 Two projects from that hackathon were accepted at NeurIPS workshops.

Sprints have covered a wide range of AI safety topics, including evaluations, alignment, interpretability, governance, AI forecasting, and the economics of transformative AI. An "Economics of Transformative AI" sprint hosted in April 2025 in collaboration with BlueDot Impact was expected to draw 300–350 participants.12

Apart Lab (Fellowships)

Apart Lab is the organization's fellowship program, providing 3–6 months of structured research support for participants working on original AI safety projects. Jason Hoelscher-Obermaier, Co-Director and Director of Research, leads Apart Lab and guided 70+ fellows across 25 research projects in 2024.13 According to the organization, Apart Lab has supported 100+ research fellows overall, with 36 new fellows added in the three months prior to late 2025.14

The fellowship provides mentorship, management support, compute funding, and publication guidance. Apart Research reports that fellows have gone on to first publications at ICML, ACL, and NeurIPS workshops, and that alumni have secured positions at METR, Oxford, FAR.AI, and the UK AI Security Institute.15

Apart Studio

Apart Studio, launched in late 2024, sits between sprints and fellowships in the talent pipeline, providing intermediate support for teams developing hackathon-originated ideas into shareable research. The first Studio cohort of eight teams was onboarded in late 2024; six teams had research ready to share after approximately two months.16 One project from the first cohort investigated goal drift in AI agents — an experiment in which an agent tasked with maximizing diamonds in a game drifted to building structures using gold and other blocks by approximately action 22, despite prompt instructions — offering an illustrative, if small-scale, demonstration of goal-directed behavior instability.17

Research Output

Apart Research reports 22 peer-reviewed publications in AI safety, including papers accepted at NeurIPS, ICLR, ICML, ACL, and EMNLP. Two papers received oral spotlight recognition at ICLR 2025, placing them in the top 1.8% of submissions.18

One notable publication is DarkBench, described as a benchmark for dark patterns in large language models, which received an oral award at ICLR 2025. According to Apart Research, DarkBench has been used by Anthropic, the UK AI Safety Institute, and the EU AI Office. The paper's authors include Esben Kran, Jord Nguyen, Akash Kundu, Sami Jawhar, Jinsuk Park, and Mateusz Maria Jurewicz, all affiliated with Apart Research as fellows or staff.19

The organization has also provided technical assistance to the EU AI Office on manipulation evaluations and has participated as expert consultants for the EU AI Act Code of Practice and EU AI Office workshops.20

People

Esben Kran is the founder of Apart Research and currently serves as a Board Member. He founded the organization in 2022 at age 22 after leaving graduate school. He has served as a keynote speaker at events including the AI Action Summit in Paris (April 2025) and has presented at IASEAI.21

Jason Hoelscher-Obermaier is the Director of Research. He holds a PhD in experimental quantum physics and has a background in philosophy and physics. He previously worked at ARC Evaluations (now METR) and PIBBSS, with expertise spanning AI safety evaluations, interpretability, alignment, and NLP benchmarking.22

Jaime Raldúa Veuthey serves as CEO, with more than eight years in the tech industry and a background running a data consultancy supporting effective altruist organizations.23

Natalia Pérez-Campanero Antolín serves as Research Lead. She holds a PhD in Interdisciplinary Biosciences from Oxford and previously ran the Royal Society's Entrepreneur-in-Residence program, where she supported over 100 entrepreneurs.24

Archana Vaidheeswaran serves as Community Manager/Program Manager. She is a board member at Women in Machine Learning, has contributed to the TinyML community, and has held leadership roles at Women Who Code, organizing events for over 2,000 participants.25

Finn Metz heads strategy and business development, with a background in private equity, incubation, and venture capital.26

Other named staff include Jacob Haimes (Research Manager), Al-Hussein Saqr (Operations Coordinator), Clement Neo (Lab Advisor), and Connor Axiotes (Head of Communications).27

The team's backgrounds span data science, physics, game development, neurotechnology, and machine learning, drawn from academia, non-profits, startups, and industry.28 Programs involve researchers from 26+ nationalities.29

Funding

Apart Research operates as a fiscally sponsored US 501(c)(3) non-profit and relies on charitable donations. Disclosed total funding for 2025 was $619,921.30 Identified funders have included the Survival and Flourishing Fund (SFF), the AI Safety Tactical Opportunities Fund (AISTOF), Open Philanthropy, the Long-Term Future Fund (LTFF), the Foresight Institute, and ACX Grants.31

In 2024, the organization raised approximately $700,000 according to its own reporting.32 A 2025 fundraising campaign on Manifund targeted approximately $954,800, with a stated budget breakdown of approximately 73% for staff compensation (8 FTEs), 16% for program costs, and 11% for indirect and fiscal sponsorship costs.33 The campaign concluded successfully in July 2025.34

Apart Research has noted financial pressures common to AI safety non-profits more broadly, including funding uncertainty relative to well-capitalized commercial AI labs. The organization has publicly discussed the potential role of for-profit AI safety ventures as a complement or alternative to the non-profit model, drawing historical comparisons to cases like the founding of Fairchild Semiconductor in 1957.35

Criticisms and Concerns

No significant documented criticisms of Apart Research's research outputs, organizational practices, or conduct have been identified in publicly available sources. The organization's self-reported metrics — sprint participation counts, publication tallies, fellowship placements — are drawn primarily from Apart Research's own communications and have not been independently audited.

More broadly, the non-profit AI safety ecosystem faces structural critiques around sustainability and scale relative to commercial AI development. Apart Research itself has acknowledged these constraints, noting that non-profits lack access to the equity capital and revenue growth that enable commercial organizations to expand rapidly.36 Whether the Sprint → Studio → Fellowship model produces research of lasting significance to AI safety — as opposed to a high volume of early-stage work — is a question that the broader field has not yet definitively answered.

A comment thread on an EA Forum-adjacent funding platform (Manifund) noted that community members questioned why the organization faced urgent fundraising needs despite having recently hired eight full-time staff, raising questions about cash flow management — though no allegations of mismanagement were made.37

Key Uncertainties

  • The long-term impact of Apart Research's talent pipeline on the AI safety field remains difficult to assess, as most alumni placements and publication records are recent.
  • The extent to which sprint-generated research addresses neglected AI safety problems — versus well-trodden areas — is not independently evaluated in available sources.
  • Funding sustainability remains uncertain given reliance on charitable donations and the volatile AI safety funding landscape.
  • The relative contribution of Apart Research's publications to the field, compared to those from better-resourced labs and academic groups, is unclear.

Sources

Footnotes

  1. Apart Research — official website

  2. Apart Research — Impact

  3. Apart Research — Impact — citations by labs and alumni placements

  4. Apart Research — community update — Jacob Haimes' account of joining via Esben Kran's November 2023 EAGx talk

  5. Apart Research — official website

  6. Apart Research — 2024 year in review

  7. Apart Research — 2024 year in review — Apart Lab growth statistics, Q1–Q3 2024

  8. Apart Research — news — "Esben at IASEAI & Studio Progress Report," January 31, 2025

  9. Apart Research — Donate 2025 — fundraising campaign totals

  10. Apart Research — Impact — sprint metrics

  11. Apart Research — Impact — Code Red Hackathon with METR results

  12. Apart Research — news — "Transformative AI Economics," April 9, 2025

  13. Apart Research — Impact — Jason Hoelscher-Obermaier biography

  14. Apart Research — Donate 2025 — fellow count statistics

  15. Apart Research — Impact — talent placement statistics

  16. Apart Research — news — "Studio Progress Report," February 7, 2025

  17. Apart Research — news — "Studio Progress Report," February 7, 2025, goal drift experiment

  18. Apart Research — ICLR 2025 announcement — 2 oral awards

  19. Apart Research — ICLR 2025 announcement — DarkBench oral recognition

  20. Apart Research — Impact — policy engagement

  21. Apart Research — Impact — Esben Kran biography

  22. Apart Research — Impact — Jason Hoelscher-Obermaier biography

  23. Apart Research — Impact — Jaime Raldúa Veuthey biography

  24. Apart Research — Impact — Natalia Pérez-Campanero Antolín biography

  25. Apart Research — Impact — Archana Vaidheeswaran biography

  26. Apart Research — Impact — Finn Metz biography

  27. Apart Research — Impact — team listing

  28. Apart Research — Impact — team description

  29. Apart Research — Impact — diversity statistics

  30. Apart Research — Manifund campaign — 2025 funding total

  31. How Apart Research Would Use Marginal Funding — funder listing (May 2025)

  32. How Apart Research Would Use Marginal Funding — 2024 funding total

  33. Apart Research — Donate 2025 — budget breakdown

  34. Apart Research — Donate 2025 — campaign conclusion

  35. How Apart Research Would Use Marginal Funding — for-profit AI safety discussion

  36. How Apart Research Would Use Marginal Funding — non-profit limitations

  37. Apart Research — Manifund campaign comments
