The Sequences by Eliezer Yudkowsky
A foundational collection of blog posts on rationality, cognitive biases, and AI alignment that shaped the rationalist movement and influenced effective altruism.
Related Wiki Pages
AI Alignment
Technical approaches to ensuring AI systems pursue intended goals and remain aligned with human values throughout training and deployment. Current ...
Instrumental Convergence
Instrumental convergence is the tendency for AI systems to develop dangerous subgoals like self-preservation and resource acquisition regardless of...
Robin Hanson
American economist known for pioneering prediction markets, proposing futarchy governance, and offering skeptical perspectives on AI existential risk.
LessWrong
A community blog and forum focused on rationality, cognitive biases, and artificial intelligence that has become a central hub for AI safety discou...
Machine Intelligence Research Institute (MIRI)
A pioneering AI safety research organization that shifted from technical alignment research to policy advocacy, founded by Eliezer Yudkowsky in 200...