Contemporary rationalists and contrarians

Cloth mother/wire mother for cognition


As a community

Attempt 1: A rationalist is a person who, upon reading about the social nature of human cognition, forms a social group designed to do cognition.

Attempt 2: A rationalist is a person who, upon being asked whether they read a given prominent rationalist, responds “Oh yes, but I don’t agree with 60% of what they say, of course.”

Maybe I like attempt 2 better as a definition. It is a common mistake of neophyte rationalists to agree too readily with prestigious figures; the whole red-queen signalling race here is people experimenting with robust norms of arguing. Agreeing too much is for n00bs.

Attempt 3: A rationalist is a person who is prepared to dedicate time to signalling that they have what the community agrees is rationality through a carefully calibrated balance of agreement and disagreement with prominent rationalists.

This is why I am interested in the output of this community/low-key social experiment. Even though it is dunderheaded and vexatious like all communities, it seems to occasionally get things right, and to bat somewhat above average in the proportion of its chatter that is useful and/or true and/or interesting.

To participate feels slightly like participating in academia, but with a ~~lower~~ different bar for entry. Plus: this avoids some false negatives. People who have non-academic careers can be involved and bring perspective from outside the ivory tower. It also leads to false positives. Many aspirational community members who are not as good at being clever as they are at adopting the trappings thereof remain chattering away in the name of free speech, openness, robustness of debate and so on. Academia does a better job at removing such persons, performance-managing them out of the ivory tower, or possibly shoving them out, or sticking them on a workplace safety committee with ill-defined scope in perpetuity.

This might be a feature rather than a bug for their community design goals. They do, after all, explicitly aim to include the viewpoints of people who are excluded from other online communities of discourse. And there are systems to surface higher-grade morsels from the thought gumbo. Vehicles such as LessWrong, for example, surface some quality content. LessWrong’s bandwidth is still too much, though. I have too many journals and conferences to stay abreast of as it is, and I am confused about how people find the time to engage productively with that vast body of semi-reviewed speculative writing. As such, this community very much resembles other scientific communities of practice.

Alternatively, if all else fails:

Attempt 4: A rationalist is someone who will explain to you that a big problem that no-one is sufficiently concerned about is smart machines taking over society.

As a cult

The rationalist path to reason has definite parallels with the induction process into a cult, although one wherein the inner truths you notionally unlock are distinguished in certain ways:

  1. They seem typically to be more empirically informed than, e.g., Scientology’s
  2. They are not obviously intended to turn you into a foot soldier in a charismatic cult leader’s army

Nice. But back to the parallels: there is a sacred text and much commentary by the learned. This is not a criticism, per se. This community is all about knowingly sticking levers into the weird crevices of the human mind, including the religious ones.

This is not my spot to evangelise for the rationalist community, which I am too cool to consider myself part of, because I only join communities with meta- or post- as their prefixes. But the rationalists do come up with some fun stuff, which I agree with the regulation 40% of, as handed down embossed on gold plates by Eliezer Yudkowsky.

As a fragment of the internet of dunks

Exactly like every other community online, the rationalist community labours under the burden of being judged by the self-selected, most attention-grabbingly grating of the people who claim to be part of it. For all the vaunted claims that the rationalists have fostered healthy norms of robust intellectual debate and so on, their comments are a mess just like everyone else’s. This is empirically verifiable. It is a recommended experience to try to contribute to the discussion in the comment threads dangling from, say, some Scott Alexander article. One with the typical halo of erudition, one that hits all the right notes of making you feel smart because you had that “a-HA” feeling and nodded along to it. “Oh!” you might think to yourself, “this intellectual ship would be propelled further out into the oceans of truth if I stuck my oar in, and other people who are, as I am, elevated enough to read this blog will see my cleverness in turn, and we will together row ourselves onward, like brain-vikings in an intellectual longboat.”

You might think that, possibly with a better metaphor if you are a superior person. But I’ll lay you odds of ~~4:1~~ 5:2 against anything fun happening at the moment you put this to experimental test. More likely thereafter you will find yourself in the usual lightly-moderated internet dogpile of people straw-manning and talking past each other in their haste to enact the appearance of healthy norms of thoughtful, robust debate, mostly without the more onerous labour of doing thoughtful, robust debate. “Mostly” by which metric? For sure by word count, facile and vacuous verbiage predominates. By head-count, maybe the situation is less grim? The thing is, producing thoughtless twaddle is cheaper and easier per word than finely honed reasoning.

Make no mistake, I think some useful and interesting debates have come from card-carrying rationalists. Even, occasionally, from the comment threads. Just not often, between all the facile value-signalling and people claiming other people’s value systems are religions. I doubt that the bloggers who host these blogs would themselves argue otherwise, or even find it surprising, but you know, it still seems to startle neophytes and journalists. It is likely that these modest odds of a good debate are nonetheless better than the extremely tiny baseline odds elsewhere on the internet.

Occasionally I feel that rationalists set themselves up with a difficult task in this regard. The preponderance of reader surveys and comment engagement suggests that rationalists are prone to regarding anyone who shows up and fills out a survey as a community member. This leads to a classic online social movement problem, which might be interpolated into Ben Sixsmith’s discussion of online-community-as-religion:

Participation in online communities requires far less personal commitment than those of real life. And commitment has often cloaked hypocrisy. Men could play the role of ~~God-fearing family men~~ rationalists in public, for example, while ~~cheating on their wives and abusing their kids~~ failing to participate in prediction markets. Being a respectable member of their community depended, to a great extent, on being a ~~family~~ rational man, but being a respectable member of online ~~right-wing~~ rationalist communities depends only on endorsing the concept.

Long story short, the rationalist corner of the internet is still full of social climbing, facile virtue signalling, trolling and general foolishness. If we insist on judging communities en masse, though, which we do, the bar is presumably not whether they are all angels, or every community must be damned. We presumably care whether they do detectably better than the (toxic, atavistic) baseline. Perhaps this particular experiment attains the very best human beings can do on the internet. Perhaps this shambling nerdy quagmire has proportionally the highest admixture of quality thought possible from an open, self-policing community.

Rationalist dialect

empathies
There is a complicated vocabulary for compassion, sympathy and empathy. Is there evidence that this vocabulary leads to clearer thinking about morality? Or more moral behaviour?
effective altruism
Marginally data-informed marginalist charity
update my priors
See Richard Ngo’s essay for some discussion of this community’s adoption of strong Bayesianism.
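For what the shibboleth cashes out to mechanically, here is a minimal sketch of a single Bayesian update; the scenario and all the numbers are invented for illustration:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# "Updating my priors" just means recomputing a belief after new evidence.

def update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    # Total probability of the evidence, marginalised over both hypotheses.
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

# Invented example: prior belief of 30% that a given blog post is worth reading.
# Evidence: a friend recommends it (80% likely if it is good, 20% likely if not).
posterior = update(0.3, 0.8, 0.2)
print(round(posterior, 3))  # 0.632
```

One recommendation moves the belief from 0.3 to roughly 0.63; whether anyone actually performs this arithmetic before saying “I have updated my priors” is left as an empirical question.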

Slate Star Codex kerfuffle

A concrete example, with actual names and events, is illustrative. Here are some warm takes on a high-profile happening.

There is some invective on the theme of renewing journalism more broadly.

The actual story here, IMO, if we ignore the particulars of the effects upon these protagonists for a moment, is that the internet of dunks grates against the internet of debate, if the latter is even a thing outside our imaginations.

That said, N-Gate’s burn raises a fun point:

just because the comments section of some asshole’s blog happens to be a place where technolibertarians cross-pollinate with white supremacists, says Hackernews, doesn’t mean it’s fair to focus on that instead of on how smart that blog’s readership has convinced itself it is. So smart, in fact, that to criticize them at all is tantamount to an admission that you’re up to something. This sort of censorship, concludes Hackernews, should never have been allowed to be published.

Demography of the rationalists

Extremely interesting, to me at least. TODO: track down references on the make-up of the community: the unusual prevalence of transgender people, polyamorous people, depressed people, ethnically (less often religiously) Jewish people… Also interesting: political diversity. Less interesting but notable: the preponderance of males and the prevalence of autism spectrum disorders.

References

Lewis-Kraus, Gideon. 2020. “Slate Star Codex and Silicon Valley’s War Against the Media.” The New Yorker, July 9, 2020. Accessed February 13, 2021. https://www.deccanherald.com/business/technology/why-slate-star-codex-is-silicon-valley-s-safe-space-950727.html.
Metz, Cade. 2021. “Silicon Valley’s Safe Space.” The New York Times: Technology, February 13, 2021. https://www.nytimes.com/2021/02/13/technology/slate-star-codex-rationalists.html.
