Longterm Wiki
entity

MATS ML Alignment Theory Scholars program

Metadata

Source Table: entities
Source ID: mats
Entity Type: organization
Description: MATS is a well-documented 12-week fellowship program that has successfully trained 213 AI safety researchers with strong career outcomes (80% in alignment work) and research impact (160+ publications, 8000+ citations). The program provides comprehensive support ($27k per scholar) and has produced no…
Source URL: www.matsprogram.org/
Wiki ID: E548
Children: 1 total (1 fact)
Created: Apr 8, 2026, 9:52 PM
Updated: Apr 8, 2026, 9:52 PM
Synced: Apr 8, 2026, 9:52 PM

Record Data

id: mats
wikiId: E548
stableId: MATS ML Alignment Theory Scholars program (organization)
entityType: organization
title: MATS ML Alignment Theory Scholars program
description: MATS is a well-documented 12-week fellowship program that has successfully trained 213 AI safety researchers with strong career outcomes (80% in alignment work) and research impact (160+ publications, 8000+ citations). The program provides comprehensive support ($27k per scholar) and has produced no…
website: www.matsprogram.org/
tags:
clusters:
[
  "community",
  "ai-safety"
]
status:
lastUpdated: 2026-02
customFields:
relatedEntries:
metadata:
{
  "orgType": "safety-org",
  "summaryPage": "safety-orgs-overview"
}
Debug info

Thing ID: sid_yYtGSTsXuw

Entity Type: organization