Longterm Wiki

MATS ML Alignment Theory Scholars program - Footnote 55

Confirmed — 100% confidence

1 evidence check

Last checked: 4/3/2026

Migrated from citation_quotes. Original verdict: accurate

Evidence — 1 source, 1 check

Confirmed (100%) — Haiku 4.5 · 4/3/2026
Found: "**Sam Bowman**: Leads a research group working on AI alignment and welfare at Anthropic, with a particular focus on evaluation; Associate Professor of Computer Science and Data Science at NYU (on le…"

Note: Migrated from citation_quotes accuracy check. Original verdict: accurate

Debug info

Record type: citation

Record ID: page:mats:fn55