Contemporary rationalists and contrarians

Cloth mother/wire mother for cognition

As a community

Attempt 1: A rationalist is a person who, upon reading about the social nature of human cognition, forms a social group designed to do cognition.

Attempt 2: A rationalist is a person who, upon being asked whether they read, responds "Oh yes, but I don't agree with 60% of what they say, of course."

Maybe I like attempt 2 better as a definition. It's a common mistake of neophyte rationalists to agree too much with prestigious figures; the whole red-queen signalling race here is one of people experimenting with robust norms of arguing. Agreeing too much is for n00bs.

Attempt 3: A rationalist is a person who is prepared to dedicate time to signalling that they have what the community agrees is rationality through a carefully calibrated balance of agreement and disagreement with prominent rationalists.

This is why I am interested in the output of this community/low-key social experiment. Even though it is dunderheaded and vexatious like all communities, it seems to occasionally get it right, batting somewhat above average in the proportion of its chatter that is useful and/or true and/or interesting.

To participate feels slightly like participating in academia, but with a ~~lower~~ different bar for entry. Plus: this avoids some false negatives. People who have non-academic careers can be involved and bring perspective from outside the ivory tower. It also leads to false positives. Many aspirational community members who are not as good at being clever as they are at adopting the trappings thereof remain chattering away in the name of free speech, openness, robustness of debate and so on. Academia does a better job of removing such persons, performance-managing them out of the ivory tower, or possibly shoving them out, or sticking them on a workplace safety committee with ill-defined scope in perpetuity.

This might be a feature rather than a bug for their community design goals. They do, after all, explicitly aim to include the viewpoints of people who are excluded from other online communities of discourse. And there are systems to surface higher-grade morsels from the thought gumbo. Vehicles such as LessWrong, for example, surface some quality content. LessWrong's bandwidth is still too high for me, though. I have too many journals and conferences to stay abreast of as it is, and I am confused about how people find time to engage productively with that vast body of semi-reviewed speculative writing. In this respect, the community very much resembles other scientific communities of practice.

Alternatively, if all else fails:

Attempt 4: A rationalist is someone who will explain to you that a big problem that no-one is sufficiently concerned about is smart machines taking over society.

As a cult

The rationalist path to reason has definite parallels with the induction process into a cult, although one where, notionally, the inner truths you unlock are distinguished in certain ways:

  1. They seem to be typically more empirically informed than, e.g., Scientology
  2. Not obviously intended to turn you into a foot soldier in a charismatic cult leader’s army

Nice. But back to the parallels: there is a sacred text and much commentary by the learned. This is not a criticism, per se. This community is all about knowingly sticking levers into the weird crevices of the human mind, including the religious ones.

This is not my spot to evangelise for the rationalist community, which I am too cool to consider myself part of, because I only join communities with meta- or post- as their prefixes. But the rationalists do come up with some fun stuff, which I agree with the regulation 40% of, as handed down embossed on gold plates by Eliezer Yudkowsky.

As a community of practice

Some interesting things going on here, but I am possibly not the one to write the social history of them.

It would also be interesting to document the fall of the original, and the role of neoreactionaries in that process.

As a discipline

Jacob Falkovich believes that rationalism is a skill set and discipline that makes its practitioners different. He can also turn a phrase, so I will let him speak for what rationalism as a discipline might be:

> Michael Vassar says that what Rationalists call "thinking" is treated by most people as a rare technical ability ("design thinking") that normal people can only pretend to do. What they call "thinking" we call "being depressed and anxious". This sounded crazy when I first heard it, but the more I mulled it over the more it made sense and explained much of what has been happening in the last year.
>
> Social reality is what is normal, accepted, cool, predictable, expected, rewarded, agreed upon. Physical reality is what is out there determining the outcomes of physical experiments, such as whether you get COVID or not if you wear a mask. When Rationalists say "thinking" they usually mean something like "using your effortful system 2 to determine something about physical reality". It's what I try to do when writing posts about COVID. Swimming in social reality is best done on feeling and intuition, not "thought".
>
> My experience is spending perhaps 97% of my time in social reality, swimming along with everyone else. 3% of the time I notice some confusion, an unexpected mismatch between my predictions and what physical reality hits me with, and try to think through a solution. 3% is enough to notice the difference between the two modes and to be able to switch between them on purpose.
>
> I don't think that this experience is typical.
>
> With Vassar in mind, my best guess of the typical experience is being in social reality 99.9% of the time. The 0.1% are extreme shocks, cases when physical reality kicks someone so far off-script they are forced to confront it directly. These experiences are extremely unpleasant, and processing them appears as "depression and anxiety". One looks for the first opportunity to dive back into the safety of social reality, in the form of a communal narrative that "makes sense" of what happened and suggests an appropriate course of action.

Julia Galef, in *The Scout Mindset*, makes a case that certain mindset habits can assist rationality.

Jacob Falkovich, in *Is Rationalist Self-Improvement Real?*, has ideas about the effectiveness of trying to be more rational (in more areas than theory of mind). TODO: question his apparent assumption that casual commenters on rationality blogs are measurably more committed to rationality than drive-by commenters on any other site. That rationality blogs attract or cultivate more rational individuals is a hypothesis to test.

In the internet of dunks

Exactly like every other community online, the rationalist community labours under the burden of being judged by the self-selected, most attention-grabbingly grating of the people who claim to be part of it. For all the vaunted claims that the rationalists have fostered healthy norms of robust intellectual debate and so on, their comments are a mess just like everyone else's. This is empirically verifiable. It is a recommended experience to try to contribute to the discussion in the comment threads dangling from, say, some Scott Alexander article. One with the typical halo of erudition, one that hits all the right notes of making you feel smart because you had that "a-HA" feeling and nodded along to it. "Oh!" you might think to yourself, "this intellectual ship would be propelled further out into the oceans of truth if I stuck my oar in, and other people who are, like me, elevated enough to read this blog will see my cleverness in turn, and we will together row ourselves onward, like brain-vikings in an intellectual longboat."

You might think that, possibly with a better metaphor if you are a superior person. But I'll lay you odds of ~~4:1~~ 5:2 against anything fun happening at the moment you put this to experimental test. More likely, upon sticking your oar in you will find yourself in the usual lightly-moderated internet dogpile of people straw-manning and talking past each other in their haste to enact the appearance of healthy norms of thoughtful, robust debate, mostly without the more onerous labour of doing thoughtful, robust debate. "Mostly" by which metric? For sure by word count, facile and vacuous verbiage predominates. By head count, maybe the situation is less grim? The thing is, producing thoughtless twaddle is cheaper and easier per word than finely honed reasoning, and the typically hardline open-door comment policy requires that twaddle be given the benefit of the doubt.

Bitter corollary: odds are not favourable that you are qualified to assess whether your own comment was twaddle.

Make no mistake, I think some useful and interesting debates have come out of card-carrying rationalists. Even, occasionally, from the comment threads. Just not often, amid all the facile value-signalling and people claiming other people's value systems are religions. I doubt that the bloggers who host these blogs would themselves argue otherwise, or even find it surprising, but you know, it still seems to startle neophytes and journalists. It is likely that the modest odds of a good debate are nonetheless better than the baseline extremely tiny odds elsewhere on the internet.

Occasionally I feel that rationalists set themselves up with a difficult task in this regard. The preponderance of reader surveys and comment engagement seems to indicate that rationalists are prone to regarding people who show up and fill out a survey as community members. This leads to a classic online social movement problem, which might be interpolated into Ben Sixsmith’s discussion of online-community-as-religion:

> Participation in online communities requires far less personal commitment than those of real life. And commitment has often cloaked hypocrisy. Men could play the role of God-fearing ~~family men~~ rationalists in public, for example, while ~~cheating on their wives and abusing their kids~~ failing to participate in prediction markets. Being a respectable member of their community depended, to a great extent, on being a ~~family~~ rational man, but being a respectable member of online ~~right-wing~~ rationalist communities depends only on endorsing the concept.

Long story short, the rationalist corner of the internet is still full of social climbing, facile virtue signalling, trolling and general foolishness. If we insist on judging communities en masse, though, which we do, the bar is presumably not whether they are all angels, or every community must be damned. We presumably care whether they do detectably better than the (toxic, atavistic) baseline. Perhaps this particular experiment attains the very best human beings can do on the internet. Perhaps this shambling nerdy quagmire has proportionally the highest admixture of quality thought possible in an open, self-policing community.

Rationalist dialect

There is a complicated vocabulary for compassion, sympathy and empathy. Is there evidence that this vocabulary leads to clearer thinking about morality? Or to more moral behaviour?

- *effective altruism*: marginally data-informed marginalist charity.
- *update my priors*: see Richard Ngo's essay for some discussion of this community's apparent adoption of strong Bayesianism.
- *crux*: a crucial ingredient in your belief. See double-cruxing. Stating your cruxes is regarded as a good-faith rhetorical strategy:

> I could be wrong. If you think so, tell me. I would like to believe true things

I kinda like this disclaimer, which appears on various blogs. It is a kind of compressed rationalist manifesto that certain blogs are trialling as a way to get the discussion started on a good foot. I wonder how effective it is.
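For the non-initiate, "updating one's priors" is, at its most literal, just Bayes' rule. A minimal sketch in Python; the hypothesis and all the numbers here are invented purely for illustration:

```python
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: return P(H|E) from P(H), P(E|H) and P(E|~H)."""
    p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_evidence

# Toy example: I give "rationalist self-improvement works" a prior of 0.3.
# A favourable study is 80% likely if it works, 40% likely if it doesn't.
posterior = update(prior=0.3, p_e_given_h=0.8, p_e_given_not_h=0.4)
print(round(posterior, 3))  # 0.462: the evidence nudges my belief upward
```

Note that the jargon usually gestures at something looser than this arithmetic; in practice "I've updated" often just means "that changed my mind a bit".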

Slate Star Codex kerfuffle

A concrete example, with actual names and events, is illustrative. Here are some warm takes on a high-profile happening.

There is some invective on the theme of renewing journalism more broadly:

The actual story here, IMO, if we ignore the particulars of the effects upon these protagonists for a moment, is that the internet of dunks grates against the internet of debate, if the latter is a thing anywhere outside our imaginations.

That said, N-Gate's burn was funny to me.

> just because the comments section of some asshole's blog happens to be a place where technolibertarians cross-pollinate with white supremacists, says Hackernews, doesn't mean it's fair to focus on that instead of on how smart that blog's readership has convinced itself it is. So smart, in fact, that to criticize them at all is tantamount to an admission that you're up to something. This sort of censorship, concludes Hackernews, should never have been allowed to be published.

Demography of the rationalists

Extremely interesting, to me at least. TODO: track down references to the make-up of the community in terms of the unusual prevalence of transgender people, polyamorous people, depressed people, ethnically (less often religiously) Jewish people… Also interesting: political diversity. Less interesting but notable: preponderance of males, prevalence of autism spectrum disorders.


Lewis-Kraus, Gideon. n.d. "Slate Star Codex and Silicon Valley's War Against the Media." *The New Yorker*. Accessed February 13, 2021.
Metz, Cade. 2021. "Silicon Valley's Safe Space." *The New York Times*, February 13, 2021, sec. Technology.
