What Happened to the Future? - Founders Fund Manifesto
foundersfund.com/2017/01/manifesto/
This Founders Fund manifesto is a venture capital perspective on technological ambition and risk, relevant as background on how influential tech investors frame AI and existential risk debates, though it is not a technical AI safety resource.
Metadata
Importance: 28/100 · opinion piece · commentary
Summary
The Founders Fund manifesto critiques the stagnation of technological ambition, arguing that venture capital has shifted from funding transformative 'hard' technologies to incremental software and social media. It calls for a return to bold, civilization-scale technological bets. The page also previews Hereticon, a conference celebrating heterodox ideas and ambitious technologies.
Key Points
- Founders Fund argues that technological progress has slowed, with VC funding 'bits' (software) rather than 'atoms' (physical, transformative tech).
- The manifesto critiques both doomer safety absolutism and naive techno-optimism, arguing real progress entails genuine but worthwhile risks.
- Hereticon is framed as a space for 'thoughtcrime' — protecting heterodox ideas across science, technology, biology, and AI.
- The document suggests existential risks from technology are real but are the corollary of living in an age of transformative progress.
- The venture fund positions itself as explicitly pro-ambitious technology investment, influencing how AI and biotech funding is philosophically justified.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Founders Fund | Organization | 50.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 6 KB
# [Founders Fund](https://foundersfund.com/)

# Hereticon Apocalypse Ball
By Mike Solana
#### **The Apocalypse will take place from the evening of October 28th to the morning of October 31st.**
[Join us](https://forms.gle/Wv78ZTZSyRiQoBiJA) for a showcase and celebration of technologies and projects so ambitious they will fundamentally transform our world for the better — or destroy it.
Right at the peak of our last pandemic, Founders Fund hosted the first Hereticon, a “conference for thoughtcrime.” Our thinking was simple: dissent is worth protecting. Most new ideas are wrong, or useless. Some are even dangerous. But from science and technology to business and faith, progress is a history of persecuted weirdos, so that is where we stand, and that is what we celebrated. In conversation, no topic was off limits: genetic modification, natalism, parapsychology, artificial consciousness, defense, pharmacology, virology, sex, God. There were lectures, there were performances, there was, randomly and happily, a marriage. Two babies were conceived at the event, two more because of it (that we know of), and new ideas were shared in quantities at least as great as drinks consumed. Obviously, we’re doing it again. But this October, while our hope is still for guests to come and speak their mind — prepare your take, your talk, your robust defense of whatever heretical concept you believe a worthy cause — we’d like to introduce a frame for your consideration: the end of the world.
For years, Americans have been obsessed with the question of our impending doom, captured especially by the role of technology in humanity’s looming final chapter, though here perspectives tend to bifurcate. First, from our nascent class of celebrity safetyists, we’re told mankind’s technological destruction is certain unless we halt interrogation and development of every field from synthetic biology to advanced computing, reducing the imagined existential risk of novel technologies to something close to zero. Concurrently, from the accelerating techno-optimists, we’re told only good has ever come, and can ever come, of technological advance. Both positions are wrong. Amidst the lingering consequences of what was probably a lab-made pandemic, heightened political turmoil in a world of nuclear weapons, and, as mankind’s first synthetic mind comes closer by the day to waking up, the unknowable implications of general artificial intelligence, the truth is we are clearly at risk of cataclysm. But that is the corollary of life in an age of wonder, a precious gift impossible without the natural dangers inhe
... (truncated, 6 KB total)
Resource ID: 6d51e6a8a416a263 | Stable ID: MWZhYmUwNz