entity
MATS ML Alignment Theory Scholars program
Metadata
| Source Table | entities |
| Source ID | mats |
| Entity Type | organization |
| Description | MATS is a well-documented 12-week fellowship program that has successfully trained 213 AI safety researchers with strong career outcomes (80% in alignment work) and research impact (160+ publications, 8000+ citations). The program provides comprehensive support ($27k per scholar) and has produced no… |
| Source URL | www.matsprogram.org/ |
| Wiki ID | E548 |
| Children | 1 total (1 fact) |
| Created | Apr 8, 2026, 9:52 PM |
| Updated | Apr 8, 2026, 9:52 PM |
| Synced | Apr 8, 2026, 9:52 PM |
Record Data
id | mats |
wikiId | E548 |
stableId | MATS ML Alignment Theory Scholars program(organization) |
entityType | organization |
title | MATS ML Alignment Theory Scholars program |
description | MATS is a well-documented 12-week fellowship program that has successfully trained 213 AI safety researchers with strong career outcomes (80% in alignment work) and research impact (160+ publications, 8000+ citations). The program provides comprehensive support ($27k per scholar) and has produced no… |
website | www.matsprogram.org/ |
tags | — |
clusters | [ "community", "ai-safety" ] |
status | — |
lastUpdated | 2026-02 |
customFields | — |
relatedEntries | — |
metadata | {
"orgType": "safety-org",
"summaryPage": "safety-orgs-overview"
} |
Debug info
Thing ID: sid_yYtGSTsXuw