Longterm Wiki

MATS Funding - Extruct

web

This Extruct-aggregated page covers funding details for the MATS program, relevant for researchers seeking financial support to enter or continue AI safety research careers.

Metadata

Importance: 35/100 · homepage · reference

Summary

This page provides information about funding available through the ML Alignment Theory Scholars (MATS) program, which supports researchers working on AI safety and alignment. It likely outlines stipends, grants, or financial support structures for program participants pursuing technical AI safety research.

Key Points

  • MATS (ML Alignment Theory Scholars) is a research program focused on AI safety and alignment
  • The page details funding mechanisms available to MATS scholars and participants
  • Financial support helps enable researchers to pursue AI safety work full-time or part-time
  • MATS is a pipeline program aimed at growing the AI safety research talent base

Cited by 1 page

| Page | Type | Quality |
|------|------|---------|
| MATS ML Alignment Theory Scholars program | Organization | 60.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 3 KB

# ML Alignment & Theory Scholars Analysis: $3M Raised

## What is ML Alignment & Theory Scholars?

ML Alignment & Theory Scholars (MATS) Program connects scholars with mentors in AI alignment, governance, and security. Their unique approach combines research with educational seminars and community networking. MATS empowers researchers to address the urgent challenge of unaligned artificial intelligence.

Employees

11-50

Founded

2021

Industry

EdTech, AI/ML

## Product Features & Capabilities

- Research and educational seminars in AI alignment
- Networking events with AI alignment community
- Workshops on research strategy
- Mentorship from leading AI alignment researchers
- Financial support for scholars

## Use Cases

- Conduct research on AI alignment challenges
- Attend workshops and seminars on AI governance
- Network with professionals in AI safety
- Collaborate with mentors on research projects
- Pursue independent research with funding support

## How much ML Alignment & Theory Scholars raised

### Grant - $1,008,127

April 2022

Lead Investor: Open Philanthropy

### Grant - $1,538,000

November 2022

Lead Investor: Open Philanthropy

### Grant - $428,942

June 2023

Lead Investor: Open Philanthropy

## Other Considerations

- Supported 357 scholars and 75 mentors since 2021
- Received funding from notable organizations like Open Philanthropy
- Alumni have co-founded AI safety organizations




### Platform Links

[Follow on LinkedIn](https://www.linkedin.com/company/mats-program) · [Official Website](https://matsprogram.org/)