Nico Espinosa / M-A Chronicle

The Church of AI

“We won’t break out the taser today. We have guests with us,” Frank* said, referring to the undisclosed “punishment” for those who hadn’t completed the required reading before attending. Berkeley’s Rationalist reading groups take place weekly, with each event post still hinting at this mysterious penalty.

A Brief History of the Rationalists

Although not officially a church, the Rationalists began as an online community in 2006, centered on a blog called Overcoming Bias. In their early years, Rationalists strove to foster discussion on questions of philosophy, ethics, and technology. Within three years, the blog grew into an active network of hundreds of readers and contributors.

After those three years, growing interest in Rationalism prompted a wave of new blogs to spring from Overcoming Bias, forming what is now called the “Rationalist blogosphere.” One of those blogs—LessWrong—shifted its focus to deeper questions of how to think rationally and the future of technology. Since then, LessWrong has become the largest blog in the Rationalist blogosphere, boasting over 170,000 registered users.

LessWrong’s founder, Eliezer Yudkowsky, co-founded the Singularity Institute for Artificial Intelligence (SIAI), which later became the Machine Intelligence Research Institute (MIRI) in 2013. SIAI’s original goal was to develop artificial intelligence, but by 2003, “Yudkowsky realized that there would in fact be a problem of aligning smarter-than-human AI with human values,” according to MIRI’s website. MIRI is currently based in Berkeley as a nonprofit organization.

Yudkowsky is also the author of “The Sequences,” a nearly 2,400-page collection of essays detailing Rationalist philosophy. Although “The Sequences” covers a wide variety of subjects—for instance, abstract concepts like “superexponential conceptspace”—its primary goal is to teach readers how to “live life rationally.” According to Yudkowsky, rationality is a state of thinking free from cognitive biases.

Courtesy Time Magazine Eliezer Yudkowsky.

Lighthaven

Rationalists have seen meteoric growth alongside the recent rise of AI—so much so that they crowdfunded nearly $3 million in order to purchase a permanent property to serve as their de facto headquarters.

Courtesy Lighthaven Bayes House building, within the Lighthaven property.

Called “Lighthaven,” the property was formerly a hotel. It’s now managed by Lightcone Infrastructure—a nonprofit offshoot of the Rationalists. According to its website, Lightcone’s mission is to preserve the human race. “We might not even survive the century. To increase our odds, we build services and infrastructure for people who are helping humanity navigate this crucial period,” it says.

Courtesy Lighthaven Outside the Bayes House building.

Lighthaven also hosts a weekly reading club, where members discuss portions of “The Sequences” or “highlights” from LessWrong discussions. However, the required reading for October 28, 2025, wasn’t about AGI or the apocalypse—it was a “Guide to rationalist interior decorating.”

While the movement is known most for debating the future of humanity, many members apply Yudkowsky’s principles to ordinary problems, approaching 401(k) management or dieting with extensive statistical analysis.

The decorating guide goes into detail on how to optimize everything from the brightness of lightbulbs (1600+ lumens is a good start) to the brand of air purifier (Coway or Blueair are best). The writing is surprisingly blunt. “For the love of f*ck please take the plastic covers off of the filters before running it,” it says of air purifiers.

At meetings, attendees first break into small groups to discuss the reading.

“Bad lighting that most people can ignore really upsets me,” attendee Matt Kinkele, a network engineer, said. “I feel shitty spending the whole day working under crappy fluorescent lights. I don’t like going to the grocery store because the lighting there is bad […] I love going to rationalist houses because it’s nice.”

“A lot of my lighting in my home is designed with a very rationalist, critical approach,” Matt added. “In a lot of other social contexts, people will tell me, ‘Matt, you put too much thought into that.’ In this context, people say that it’s laudatory. People are like, ‘Wow, you put a lot of thought into that light bulb. Good job.’ That doesn’t happen in a lot of other places.”

Another group strayed from the topic quickly. One attendee argued that humanity should universally adopt “meat tubes,” and the rest of the group agreed. The ultimate goal, according to them, should be to pump every kind of food through pipes just like water, eliminating the inconvenience of driving to the store.

Ironically, the person who ordered dinner forgot to select the “delivery” option and had to drive to pick it up. The groups then converged in the kitchen for pizza and more discussion.

Vesta Kassayan / M-A Chronicle A dining space in the kitchen.

Rationalist beliefs vary widely in their extremity. “I’ve had Rationalist friends who had mental breakdowns over whether they were murdering phytoplankton,” Carson* said.

“[Some] Rationalists have extreme beliefs and they try to convince each other that they’re right,” attendee Elijah Ravitz-Campbell said. “There’s almost certainly someone at this reading group who has a P(doom) of 95% or more—they think the world is gonna end.” 

Matt has a red, circular lapel pin he carries with him everywhere. On it, in white lettering: “P(doom) > 25%.”

Elijah, who describes himself as “your stereotypical Redditor,” explained that, like many in the group, he doesn’t share such extreme views but still finds practical value in the philosophy while enjoying the socialization the discussions offer. “There are a lot of people who have extreme ideas here,” Elijah said. “Eliezer Yudkowsky is, and has been for over a decade, convinced that AI is very, very dangerous and probably going to kill us all.”
In addition to “The Sequences,” Yudkowsky wrote a book called “Harry Potter and the Methods of Rationality,” which tells the story of an alternate universe where Harry Potter is raised by an Oxford professor of biochemistry and becomes a wizard with a Rationalist worldview. Each of the 122 chapters is titled after the Rationalist lesson it introduces, such as “Multiple Hypothesis Testing” and “Comparing Reality To Its Alternatives.” “It’s very fun. I have to say—to my great shame—I like it a lot and I think it’s pretty good,” Elijah said.

“Rationalism broadly is probably 40% AI, 60% everything else,” Elijah said. “Rationalism largely is secular, maybe sometimes self-religiously posing itself as an alternative to that, like you don’t need to go to religion.”

“Maybe part of the religious thing is there’s a strong sense of, like, trying to save the world,” attendee Austin added.

“Effective altruists, yeah,” Elijah said.

“I think the diversity in personal ideology you see in the Rationalist community is more than any other group of individuals. There are people here that I have zero ideological and moral relations with, and we have fantastic debates,” Matt added.

Courtesy hpmor.com The cover of Harry Potter and the Methods of Rationality.

How to Survive Until the Apocalypse

While LessWrong has a wealth of articles on topics from sexual dimorphism to systems of control, one notable article covers a more central concern of the Rationalist movement: how to survive until AGI. AGI stands for “artificial general intelligence” and refers to AI that is as intelligent as or more intelligent than humans. The author estimates that AGI will arrive within the next 20 years and outlines simple strategies to maximize one’s chances of survival until then.

The article focuses on the number of “micromorts”—a unit of risk representing a one-in-a-million chance of death—involved in common activities. It advises against obvious hazards like hard drugs and mountaineering, and notes that paragliding carries 74 micromorts per launch, while skiing carries just 0.7 micromorts per day.

Polyamory

Other posts venture into topics such as polyamory, defined as being in multiple romantic relationships simultaneously, with one post addressing the question, “Why Are So Many Rationalists Polyamorous?” The author states that “anecdotally, the most common justifications [they] hear for monogamy are jealousy-related.” They added, “Jealousy is just an emotion, and rationalists have a tradition of distrusting emotions.”
They go on to compare the monogamy debate to the famous Prisoner’s Dilemma and associated game theory, saying that “monogamy is a zero-sum game. Each person gets one partner, and once that partner is taken, they are removed from the dating pool for everyone else. There is no sharing, coordination, or trading. There are no strategies that can be optimized. In other words, it’s not interesting to rationalists.”

Nonmonogamy, or polyamory, on the other hand, is described as a positive-sum game. “Nonmonogamy allows parties to, for example, have a date with one partner while their other partner is busy, spend time with multiple partners at the same time, and coordinate to compensate for imbalances in sex drive. Parties rarely want exactly the same thing from their partners, so there are usually large opportunities for emotional arbitrage.”

A comment on the post states that “Rationality started with a polyamorous founder.” Ben Pace, an organizer of the Berkeley reading group, replied to the comment, adding, “I would think it a bad approach to polyamory to be constantly feeling angry/jealous/threatened by what’s happening in your romantic relationships, but keeping practising ignoring it until you’re numb to that part of yourself.”

More posts and discussions can be found on lesswrong.com, covering topics such as effective altruism, artificial intelligence and its threat to humanity, transhumanism, artificial food replacements, open borders, and the Slate Star Codex.

Nico is a senior in his first year of journalism. In his free time he likes playing tennis, playing music, and getting food with friends. He enjoys writing and looks forward to covering stories on culture, news, and music.

Vesta is a junior in his first year of journalism. Aside from covering board meetings and local events, he enjoys swimming, playing water polo, and talking to friends.