Longterm Wiki

Ryan Kidd - TAIS 2024

web

Speaker profile from the TAIS 2024 conference, where Ryan Kidd presented on two years of AI safety field-building at MATS. TAIS conferences bring together technical AI safety researchers to share current work.

Metadata

Importance: 25/100 · conference paper · reference

Summary

This is the speaker profile page for Ryan Kidd at the Technical AI Safety (TAIS) 2024 conference. Kidd is Co-Director of the ML Alignment & Theory Scholars (MATS) Program and a Board Member and Co-Founder of the London Initiative for Safe AI. His talk, "Insights from two years of AI safety field-building at MATS", summarized MATS's experience selecting and developing AI safety research talent and its plans for future projects.

Key Points

  • Ryan Kidd spoke at TAIS 2024 (Technical AI Safety Conference) on Friday, April 5th, 10:00–10:30
  • His talk, "Insights from two years of AI safety field-building at MATS", covered lessons from selecting and developing AI safety research talent
  • Since early 2022, MATS has run five seasonal programs, supporting 213 scholars and 47 mentors; alumni have joined nearly every major AI safety initiative
  • Kidd is Co-Director of MATS and a Board Member and Co-Founder of the London Initiative for Safe AI; he previously completed a PhD in Physics at the University of Queensland

Cited by 1 page

| Page | Type | Quality |
| --- | --- | --- |
| MATS ML Alignment Theory Scholars program | Organization | 60.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 3 KB


# [Technical AI Safety Conference](https://tais2024.cc/)

# Ryan Kidd

## キッド・ライアン

### ML Alignment & Theory Scholars Program (MATS)

Ryan is Co-Director of the ML Alignment & Theory Scholars Program (since early 2022) and a Board Member and Co-Founder of the London Initiative for Safe AI (since early 2023). Previously, he completed a PhD in Physics at the University of Queensland.


Video: [Ryan Kidd — Insights from two years of AI safety field-building at MATS \[TAIS 2024\]](https://www.youtube.com/watch?v=tA9K8JqyhP4) (AI Safety 東京, 28:29)

## Insights from two years of AI safety field-building at MATS

#### Friday, April 5th, 10:00–10:30

The ML Alignment & Theory Scholars (MATS) Program is an educational seminar and independent research program that aims to provide talented scholars with talks, workshops, and research mentorship in the field of AI alignment and connect them with AI safety research communities in the SF Bay Area and London. Since early 2022, MATS has run five seasonal programs, supporting 213 scholars and 47 mentors, and alumni have joined nearly every major AI safety initiative (and founded several new ones). This talk will summarize our insights into selecting and developing AI safety research talent and our plans for future projects.


... (truncated, 3 KB total)
Resource ID: bf3e9e701a226d04 | Stable ID: NTkxNTZjYT