Rationalist community

The rationalist community is a 21st-century philosophical movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten (the successor to Slate Star Codex). The movement initially gained prominence in the San Francisco Bay Area. Its members seek to use rationality to avoid cognitive biases. Common interests include probability, effective altruism, transhumanism, and mitigating existential risk from artificial general intelligence.


Description

Rationality

The rationalists are concerned with applying science and probability to various topics, [4] with special attention to Bayesian inference. [5] According to Bloomberg Businessweek journalist Ellen Huet, the rationalist community "aim[s] to keep their thinking unbiased, even when the conclusions are scary". [6]
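
As a minimal illustration of the term (a formulation not drawn from the cited sources), Bayesian inference updates confidence in a hypothesis H in light of evidence E via Bayes' theorem, combining the prior P(H) with the likelihood P(E | H) to yield the posterior P(H | E):

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$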

The early rationalist blogs LessWrong and Slate Star Codex attracted a STEM-oriented audience interested in self-improvement and suspicious of both the humanities and the ways emotion can cloud rational judgment. [7] The movement attracted the attention of the founder culture of Silicon Valley, leading to many shared cultural shibboleths and obsessions, especially optimism about the ability of intelligent capitalists and technocrats to create widespread prosperity. [8] [9]

Writing for The New Atlantis, Tara Isabella Burton describes rationalist culture as having a "technocratic focus on ameliorating the human condition through hyper-utilitarian goals", [10] with the "distinctly liberal optimism... that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so". [11] Burton writes that "Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is". [12]

AI safety

One of the main interests of the rationalist community is combating the existential risk posed by the emergence of an artificial superintelligence. [13] [14] Many members believe the community is one of the few with a chance of saving humanity from extinction. [15] [16] [17] The stress of this perceived responsibility has contributed to mental health crises among several rationalists. [18] [19]

Extreme values

Huet adds that the rationalist movement "valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior". [20]

Writing in The New Yorker, Gideon Lewis-Kraus argues that rationalists "have given safe harbor to some genuinely egregious ideas," such as scientific racism and neoreactionary views, and that "the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding." [21] Though this attitude is based on "the view that vile ideas should be countenanced and refuted rather than left to accrue the status of forbidden knowledge", [21] rationalists also hold the view that other ideas, referred to as information hazards, are dangerous and should be suppressed. [22] Roko's basilisk and the writings of Ziz LaSota are commonly cited information hazards among rationalists. [18]

Some members and former members of the community have said that aspects of the community are cult-like. [23] [24] In The New York Times, religious scholar Greg Epstein stated: "When you think about the billions at stake and the radical transformation of lives across the world because of the eccentric vision of this group, how much more cult-y does it have to be for this to be a cult? Not much." [25]

Lifestyle

While the movement has online origins, the community is also active and close-knit offline. The community is especially active in the San Francisco Bay Area, where many rationalists live in intentional communities and some engage in polyamorous relationships with other rationalists. [26] [27] [28]

History

LessWrong was founded in 2009, [29] though the community had previously existed on various blogs on the Internet, including Overcoming Bias (founded in 2006). Slate Star Codex was launched in 2013, and its successor blog, Astral Codex Ten, followed on January 21, 2021. [30] [31] [32]

Eliezer Yudkowsky created LessWrong and is regarded as a major figure within the movement. He also wrote the Harry Potter fanfiction Harry Potter and the Methods of Rationality, serialized from 2010 to 2015, which drew readers to LessWrong and the rationalist community. [33] [34] The fanfiction was highly popular and is well regarded within the community. [35] [36] Yudkowsky used the work to solicit donations for the Center for Applied Rationality, which teaches courses based on it, [37] [38] and a 2013 LessWrong survey found that a quarter of respondents had discovered the site through the fanfiction. [39]

In the 2010s, the rationalist community emerged as a major force in Silicon Valley. [40] [41] Silicon Valley founders such as Elon Musk, Peter Thiel, Vitalik Buterin, Dustin Moskovitz, and Jaan Tallinn have donated to rationalist-associated institutions or otherwise supported rationalist figures. [42] [43] [25] The movement has directed hundreds of millions of dollars towards companies, research labs, and think tanks aligned with its objectives, and was influential in the short-lived removal of Sam Altman from OpenAI. [25]

Bay Area organizations associated with the rationalist community include the Center for Applied Rationality, which teaches the techniques of rationality espoused by rationalists, and the Machine Intelligence Research Institute, which conducts research on AI safety. [44] [45] [46]

Overlapping movements and offshoots

The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups. [1] The rationalist community has a large overlap with effective altruism [47] [48] and transhumanism. [49] Critics such as computer scientist Timnit Gebru and philosopher Émile P. Torres link rationalists with other philosophies they collectively name TESCREAL: transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism. [50] Members who diverge from typical rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" (also known as "ingroup" and "TPOT", an acronym for "this part of Twitter" [2] ) or "EA-adjacent". [3]

Effective altruism

Effective altruism (EA) is a 21st-century philosophical and social movement that advocates impartially calculating benefits and prioritizing causes to provide the greatest good. It is motivated by "using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis". [51] [52] People who pursue the goals of effective altruism, who are sometimes called effective altruists, [53] follow a variety of approaches proposed by the movement, such as donating to selected charities and choosing careers with the aim of maximizing positive impact. The movement gained popularity outside academia, spurring the creation of research centers, advisory organizations, and charities, which collectively have donated several hundred million dollars. [54]

Effective altruists emphasize impartiality and the global equal consideration of interests when choosing beneficiaries. Popular cause priorities within effective altruism include global health and development, social and economic inequality, animal welfare, and long-term risks to the survival of humanity. Only a small portion of charities are affiliated with effective altruism, though the movement's influence is larger in niche areas such as farmed-animal welfare, AI safety, and biosecurity. [55]

The movement developed during the 2000s, and the name effective altruism was coined in 2011. Philosophers influential to the movement include Peter Singer, Toby Ord, and William MacAskill. What began as a set of evaluation techniques advocated by a diffuse coalition evolved into an identity. [56] Effective altruism has ties to elite universities in the United States and United Kingdom, and became associated with Silicon Valley's technology industry. [57]

The movement received mainstream attention and criticism following the 2022 bankruptcy of the cryptocurrency exchange FTX, whose founder, Sam Bankman-Fried, had been a major funder of effective altruism causes. [58] [59]

Postrationalists

The postrationalists are a loose group of one-time rationalists who became disillusioned with the rationalist community, which they came to perceive as "a little culty [and] dogmatic" [24] and as having lost focus on the less quantifiable elements of a well-lived human life. [10] This community also goes by the acronym TPOT, standing for This Part of Twitter. [60] [2] The term postrationalist is also used as a hedge by people associated with the rationalist community who have drifted from its orthodoxy. [3]

Zizians

The Zizians are a splinter group [61] with an ideological emphasis on veganism and anarchism, which became widely known in 2025 when members were suspected of involvement in four murders. [62] The Zizians originally formed around the Bay Area rationalist community but became disillusioned with established rationalist organizations and leaders, whom they accused of anti-transgender discrimination, of misusing donor funds to pay off a sexual misconduct accuser, and of failing to value animal welfare in plans for human-friendly AI. [63]

The group has been called radical or cult-like by publications such as The Independent, [64] the Associated Press, [65] SFGate, [66] and Reason. [67] The Boston Globe and The New York Times have compared the Zizians to the Manson Family. [68] [69] Similarly, Anna Salamon, the director of the Center for Applied Rationality, compared the Zizian belief system to that of a doomsday cult. [69]

References

  1. Huet 2023, The borders of any community this pedantic can be difficult to define. Some rationalists don't consider themselves effective altruists, and vice versa.
  2. Shugerman 2024, Members of the TPOT community are often referred to as "post-rationalists".
  3. Huet 2023, Many people who've drifted slightly from a particular orthodoxy hedge their precise beliefs with terms such as "post-rationalist" or "EA-adjacent."
  4. Metz 2021, The Rationalists saw themselves as people who applied scientific thought to almost any topic. This often involved "Bayesian reasoning," a way of using statistics and probability to inform beliefs.
  5. Frank, Sam (January 2015). "Come With Us If You Want to Live". Harper's Magazine. Archived from the original on 2025-02-02. Retrieved 2025-04-02. "Bayesian" has a special status in the rationalist community because it's the least imperfect way to think.
  6. Huet 2023, a community of people who call themselves rationalists and aim to keep their thinking unbiased, even when the conclusions are scary.
  7. Burton 2023, Both LessWrong and the similarly-focused Slate Star Codex... attracted not just passive readers but enthusiastic commenters, who were drawn to the promise of individual self-improvement as well as the potential to discuss philosophy, science, and technology with people as uncompromisingly devoted to the truth as they believed they were. These commenters — a mixture of the traditionally educated and autodidacts, generally STEM-focused and with a higher-than-average share of people who identified as being on the autism spectrum — tended to be suspicious not just of humanities as a discipline, but of all the ways in which human emotional response clouded practical judgment.
  8. Burton 2023, Rationalist culture — and its cultural shibboleths and obsessions — became inextricably intertwined with the founder culture of Silicon Valley as a whole, with its faith in intelligent creators who could figure out the tech, mental and physical alike, that could get us out of the mess of being human.
  9. Frank 2015, Thiel and Vassar and Yudkowsky, for all their far-out rhetoric, take it on faith that corporate capitalism, unchecked just a little longer, will bring about this era of widespread abundance.
  10. Burton 2023, To them, rationality culture's technocratic focus on ameliorating the human condition through hyper-utilitarian goals ... had come at the expense of taking seriously the less quantifiable elements of a well-lived human life.
  11. Burton 2023, You might call it the postrationalist turn... The chipper, distinctly liberal optimism of rationalist culture that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so — is giving way, not to pessimism, exactly, but to a kind of techno-apocalypticism.
  12. Burton 2023, Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is.
  13. Huet 2023, Since the early 2000s, Yudkowsky has argued that hostile artificial intelligence could destroy humanity within decades. This driving belief has made him an intellectual godfather in a community of people who call themselves rationalists.
  14. Burton 2023, They focused on big-picture, global-level issues, most notably and controversially Yudkowsky's pet concern: the "x-risk" ("x" for existential) that we will inadvertently create unfriendly artificial intelligence that will wipe out human life altogether.
  15. Huet 2023, Within the group, there was an unspoken sense of being the chosen people smart enough to see the truth and save the world, of being "cosmically significant."
  16. Frank 2015, I asked him about the rationalist community. Were they really going to save the world? From what? "Imagine there is a set of skills," he said. "There is a myth that they are possessed by the whole population, and there is a cynical myth that they're possessed by 10 percent of the population. They've actually been wiped out in all but about one person in three thousand."
  17. Burton 2023, For many, rationality culture had at least initially offered a thrilling sense of purpose: a chance to be part of a group of brilliant, committed young heroes capable of working together to save all humanity.
  18. Barba, Michael; Cassidy, Megan; Gafni, Matthias (2025-05-16). "Before killings linked to cultlike 'Zizians,' a string of psychiatric crises befell AI doomsdayers". San Francisco Chronicle. Archived from the original on 2025-05-16. Retrieved 2025-05-21.
  19. Huet 2023, The underlying ideology valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior.
  20. Lewis-Kraus, Gideon (2020-07-09). "Slate Star Codex and Silicon Valley's War Against the Media". The New Yorker. Archived from the original on 2025-02-28. Retrieved 2025-04-05.
  21. Barba, Cassidy & Gafni 2025, Adherents worried about “infohazards,” considered so dangerous that simply learning about them put a person at risk.
  22. Huet 2023, Several current and former members of the community say its dynamics can be "cult-like".
  23. Shugerman, Emily (2024-12-10). "This one internet subculture explains murder suspect Luigi Mangione's odd politics". The San Francisco Standard. Retrieved 2025-04-02. former adherents who became "disillusioned with that whole scene, because it's a little culty, it's a little dogmatic"
  24. Metz, Cade (2025-08-04). "The Rise of Silicon Valley's Techno-Religion". The New York Times. Archived from the original on August 6, 2025. Retrieved 2025-08-04.
  25. Huet 2023, Joseph moved to the Bay Area .... There, she realised the social scene that seemed so sprawling online was far more tight-knit in person. Many rationalists and effective altruists, who call themselves EAs, worked together, invested in one another's companies, lived in communal houses and socialised mainly with each other, sometimes in a web of polyamorous relationships.
  26. Burton 2023, There were commune-style rationalist group houses and polyamorous rationalist group houses devoted to modeling rational principles of good living.
  27. Metz 2021, The Rationalists held regular meet-ups around the world, from Silicon Valley to Amsterdam to Australia. Some lived in group houses. Some practiced polyamory.
  28. "History of LessWrong". www.lesswrong.com. 9 March 2021. Retrieved 2025-06-21.
  29. "Rationalist Movement – LessWrong". www.lesswrong.com. Archived from the original on 2023-06-17. Retrieved 2023-06-19.
  30. Metz, Cade (2021-02-13). "Silicon Valley's Safe Space". The New York Times. ISSN 0362-4331. Archived from the original on 2021-04-20. Retrieved 2023-06-19.
  31. The Rationalist's Guide to the Galaxy: Superintelligent AI and the Geeks Who Are Trying to Save Humanity's Future. Orion. 13 June 2019. ISBN 9781474608800. Archived from the original on 18 May 2023. Retrieved 23 June 2023.
  32. Whelan, David (March 2, 2015). "The Harry Potter Fan Fiction Author Who Wants to Make Everyone a Little More Rational". Vice. Retrieved 11 April 2025.
  33. Burton 2023, In his Harry Potter and the Methods of Rationality — perhaps old-school rationalists' most effective recruiting text — Eliezer Yudkowsky is clear that part of the appeal of rationality is the promise of self-overcoming, of becoming more than merely human.
  34. Frank 2015, The next year, Yudkowsky began publishing Harry Potter and the Methods of Rationality at fanfiction.net. The Harry Potter category is the site's most popular, with almost 700,000 stories; of these, HPMoR is the most reviewed and the second-most favorited.
  35. Koebler, Jason (20 November 2023). "New OpenAI CEO Was a Character in a Harry Potter Fanfic That's Wildly Popular With Effective Altruists". 404 Media. Retrieved 11 April 2025.
  36. Whelan, David (March 2, 2015). "The Harry Potter Fan Fiction Author Who Wants to Make Everyone a Little More Rational". Vice. Retrieved 23 December 2015.
  37. Huet 2023, Many of its serialised chapters directed readers to LessWrong posts about rationalist tenets, and some solicited donations to the Centre for Applied Rationality (CFAR), a Yudkowsky-affiliated institute in Berkeley.
  38. Frank 2015, Of the 1,636 people who responded to a 2013 survey of Less Wrong's readers, one quarter had found the site thanks to HPMoR, and many more had read the book.
  39. Tiku, Nitasha (2022-11-17). "The do-gooder movement that shielded Sam Bankman-Fried from scrutiny". The Washington Post. Retrieved 2022-11-25.
  40. Sargeant, Alexi (3 January 2018). "Simulating Religion". Plough. Retrieved 22 February 2024.
  41. Huet 2023, The movement's leaders have received support from some of the richest and most powerful people in tech, including Elon Musk, Peter Thiel and Ethereum creator Vitalik Buterin.
  42. Burton 2023, Investor Peter Thiel gave over $1 million to Yudkowsky's Machine Intelligence Research Institute. Elon Musk met his now-ex Grimes when the two bonded on Twitter over a rationalist meme.
  43. Frank 2015, Whereas MIRI aims to ensure human-friendly artificial intelligence, an associated program, the Center for Applied Rationality, helps humans optimize their own minds, in accordance with Bayes's Theorem.
  44. Metz 2021, Because the Rationalists believed A.I. could end up destroying the world — a not entirely novel fear to anyone who has seen science fiction movies — they wanted to guard against it. Many worked for and donated money to MIRI, an organization created by Mr. Yudkowsky whose stated mission was "A.I. safety."
  45. Ratliff 2025, One was an alumni gathering for a nonprofit called the Center for Applied Rationality. The Bay Area group ran workshops dedicated to "developing clear thinking for the sake of humanity's future," as they put it.... CFAR was itself an outgrowth of another organization, the Machine Intelligence Research Institute, devoted to the technical endeavor of creating artificial intelligence that wouldn't destroy the world.
  46. Metz 2021, Many Rationalists embraced "effective altruism," an effort to remake charity by calculating how many people would benefit from a given donation.
  47. Huet, Ellen (2023-03-07). "The Real-Life Consequences of Silicon Valley's AI Obsession". Bloomberg Businessweek. Archived from the original on 2025-03-01. Retrieved 2025-04-08. These distinct but overlapping groups developed in online forums.
  48. Burton, Tara Isabella (Spring 2023). "Rational Magic". The New Atlantis. Retrieved 2025-04-02. There were rationalist sister movements: the transhumanists, who believed in hacking and improving the "wetware" of the human body; and the effective altruists, who posited that the best way to make the world a better place is to abandon cheap sentiment entirely.
  49. "The Wide Angle: Understanding TESCREAL — Silicon Valley's Rightward Turn". May 2023. Archived from the original on 2023-06-06. Retrieved 2023-06-06.
  50. MacAskill, William (January 2017). "Effective altruism: introduction". Essays in Philosophy. 18 (1): eP1580:1–5. doi:10.7710/1526-0569.1580. ISSN 1526-0569. Archived from the original on 2019-08-07. Retrieved 2020-02-08.
  51. The quoted definition is endorsed by a number of organizations at: "CEA's Guiding Principles". Centre For Effective Altruism. Retrieved 2021-12-03.
  52. The term effective altruists is used to refer to people who embrace effective altruism in many published sources such as Oliver (2014), Singer (2015), and MacAskill (2017), though as Pummer & MacAskill (2020) noted, calling people "effective altruists" minimally means that they are engaged in the project of "using evidence and reason to try to find out how to do the most good, and on this basis trying to do the most good", not that they are perfectly effective nor even that they necessarily participate in the effective altruism community.
  53. Adams, Crary & Gruen 2023, p. xxii.
  54. Piper, Kelsey (2024-02-09). "Can effective altruism stay effective?". Vox. Retrieved 2025-05-31. But within various niche areas that ordinary Americans generally ignore and that have few other funders — like farmed-animal welfare, AI safety, and biosafety and biosecurity — the movement's influence has been much larger.
  55. Lewis-Kraus, Gideon (2022-08-08). "The Reluctant Prophet of Effective Altruism". The New Yorker. ISSN 0028-792X. Retrieved 2022-12-04.
  56. Tiku, Nitasha (2023-07-05). "How elite schools like Stanford became fixated on the AI apocalypse". The Washington Post. ISSN 0190-8286. Retrieved 2025-05-31.
  57. FTX (8 February 2021). "The FTX Foundation for Charitable Giving". ftx.medium.com. Archived from the original on 2021-02-08. Retrieved 2024-02-29.
  58. Howcroft, Elizabeth (6 April 2023). "Collapse of FTX deprives academics of grants, stokes fears of forced repayment". Reuters. Retrieved 2024-02-29.
  59. Burton 2023, the postrationalists — also known by the jokey endonym "this part of Twitter," or TPOT.
  60. Ratliff 2025, The Zizians came together over the next two years, splintering off one by one from the established rationalist and EA communities.
  61. Ratliff, Evan (February 21, 2025). "The Delirious, Violent, Impossible True Story of the Zizians". Wired. Archived from the original on February 26, 2025. Retrieved February 26, 2025.
  62. Ratliff 2025, They alleged that MIRI had "paid out blackmail (using donor funds)" to quash sexual misconduct accusations and that CFAR's leader "discriminates against trans women." ... expressed outrage that MIRI's efforts to create human-friendly AI didn't seem to include other animals in the equation.
  63. Dodda, Io (February 19, 2025). "Inside the 'Zizians': How a cultish group of radical vegans is now linked to 6 deaths". The Independent. Archived from the original on 2025-06-19. Retrieved 2025-03-21.
  64. "Police arrest apparent leader of Zizian group tied to killing of U.S. border agent near Canada". CBC. Associated Press. February 17, 2025. Archived from the original on 2025-04-02. Retrieved 2025-03-21.
  65. Chamings, Andrew; Dowd, Katie (February 13, 2025). "A 'death cult' on the run: The Bay Area fringe group terrorizing America". SFGate. Archived from the original on February 22, 2025. Retrieved March 1, 2025.
  66. Wolfe, Liz (January 31, 2025). "California's Rationalist-Linked Death Cult". Reason Magazine. Retrieved August 5, 2025.
  67. Cullen, Kevin (28 June 2025). "Government wants to execute Teresa Youngblut after Vermont shooting that killed border agent, her lawyer says". The Boston Globe. Retrieved 2025-07-08.
  68. Beam, Christopher (2025-07-06). "She Wanted to Save the World From A.I. Then the Killings Started". The New York Times. ISSN 0362-4331. Retrieved 2025-07-08.