Longterm Wiki

Credibility Rating

Good (3/5)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Cold Takes

This URL returns a 404; the original Cold Takes post likely summarized Ajeya Cotra's biological anchors report on transformative AI timelines. Readers should instead consult Cotra's original report on the Alignment Forum or AI Impacts.

Metadata

Importance: 20/100 · blog post · analysis

Summary

This page returns a 404 error, indicating the content is no longer available at this URL. The intended resource appears to be Holden Karnofsky's Cold Takes summary or discussion of Ajeya Cotra's influential biological anchors framework for forecasting transformative AI timelines.

Key Points

  • Page is currently inaccessible (404 error); original content cannot be verified.
  • Likely discussed Cotra's 'biological anchors' approach to estimating when transformative AI might arrive.
  • Cold Takes blog by Holden Karnofsky frequently covers AI timelines and their implications for safety prioritization.
  • Cotra's original report estimated meaningful probability of transformative AI within decades based on compute scaling.

Cited by 1 page

| Page | Type | Quality |
|------|------|---------|
| AI-Driven Concentration of Power | Risk | 65.0 |

Cached Content Preview

HTTP 200 · Fetched Mar 20, 2026 · 0 KB
# 404

## Page not found

[Go to the front page →](https://www.cold-takes.com/)
Resource ID: 78997e043e4a6184 | Stable ID: ZWNmNDc1OW