SFist - Debug My Brain
sfist.com · sfist.com/2016/01/14/debug_my_brain/
Skeptical journalistic critique of CFAR, an organization closely tied to the rationalist and AI safety communities; useful for understanding external perceptions and critiques of the rationalist movement's training programs.
Metadata
Importance: 18/100 · opinion piece · commentary
Summary
A critical SFist commentary on the Center for Applied Rationality (CFAR), arguing that its $3,900 seminars blend pseudoscience, self-help tropes, and quasi-religious assumptions under the guise of rationality training. The piece questions CFAR's scientific credibility and notes its connections to AI safety and existential risk communities in the Bay Area tech world.
Key Points
- CFAR charges $3,900 for 4-day residential seminars targeting tech professionals, with backing from organizations like the Thiel Fellowship and clients including Facebook.
- The article critiques CFAR's methods as a mashup of cognitive science, self-help, and religious assumptions rather than rigorous rationality training.
- CFAR's co-founder Julia Galef frames the program around Kahneman's System 1/System 2 model, aiming to harmonize emotional and analytical thinking.
- The piece highlights connections between CFAR's worldview and AI existential risk beliefs, noting quasi-religious themes of human improvement and immortality.
- A Lumosity executive attending the seminar is noted ironically, as Lumosity was fined $2M by the FTC for unfounded cognitive benefit claims around the same time.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Center for Applied Rationality | Organization | 62.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 15, 2026 · 6 KB
It's time for a bunch of self-proclaimed "rationalists" — a group of mostly technologists in their 20s — to start acting like it and stop paying the Berkeley-based [Center For Applied Rationality](http://rationality.org/). I say this because CFAR's $3,900 4-day seminars and their attendees are [the subject of a New York Times magazine article](http://www.nytimes.com/2016/01/17/magazine/the-happiness-code.html?_r=1) this week that will leave you slapping your forehead in front of an imagined group of conference-goers.
Lest you think this to be a kooky-Berkeley one-off spiritual center, allow me to note that these charlatan-sounding types have been hired by Facebook and the Thiel Fellowship and that $3,900 is not cheap, especially given the living conditions offered.
> \[The\] workshops... are run like a college-dorm cram session. Participants stay on-site for the entire time (typically four days and nights), often in bargain-basement conditions. In San Leandro, the organizers packed 48 people (36 participants, plus six staff members and six volunteers) into a single house, using twin mattresses scattered on the floor as extra beds. In the kitchen, I asked Matt O’Brien, a 30-year-old product manager who develops brain-training software for Lumosity, whether he minded the close quarters. He looked briefly puzzled, then explained that he already lives with 20 housemates in a shared house in San Francisco. Looking around the chaotic kitchen, he shrugged and said, ‘‘It’s not really all that different.’’
Yes, in a total coincidence, that would be Lumosity, the games app that must now fork over [$2 million to the FTC for "unfounded" claims of cognitive health benefits](https://sfist.com/2016/01/06/brain_drain_game_app_lumosity_will.php). Others at the seminar included Asher, a self-described "singing, freestyle rapping, former international Quidditch All-American turned software engineer.’’ A third was a gentleman who ended conversations with a bit of charm, saying ‘‘I will allow you to disengage.’’
The fun starts with a CoZE, or comfort-zone expansion, exercise. An organizer and CFAR founder, who says ‘‘We’re trying to invent parkour for the mind’’ (as if to shout "THIS IS A FAD"), first encourages attendees to, indeed, step outside their comfort zones. Naturally, one puts his hand in a pan of curry and another takes off his shirt and affixes a sign to himself that reads "touch me." We're off!
> ‘‘A lot of people think that rationality means acting like Spock and ignoring things like intuition and emotion,’’ \[co-founder Julia Galef\] said. ‘‘But we’ve found that that approach doesn’t actually work.’’ Instead, she said, the aim was to bring the emotional, instinctive parts of the brain (dubbed ‘‘System One’’ by Kahneman) into harmony with the more intellectual, goal-setting parts of the brain (‘‘System Two’’).
An elaboration on that:
> ‘‘The prefrontal cortex is like a monkey riding an elephant,’’ she told the group. ‘‘System One is the elephant. And y
... (truncated, 6 KB total)
Resource ID: 23242dc9ae9d7ea6 | Stable ID: NzMxYWUwZD