Contemporary rationalists

Cloth mother/wire mother for cognition

September 6, 2020 — August 9, 2022

crisis
culture
ethics
history
language
mind
wonk

Content warning:

Discussion of an open community that specifically aims to engage with arguments despite their edginess. You are only ever one click away from dialogues with people holding uncomfortable opinions on objectionable topics.


Related: decoupling. Usually opposed: conflict theory.

1 As a community

Attempt 1: A rationalist is a person who, upon reading about the social nature of human cognition, forms a social group designed to do cognition.

This is why I am interested in the output of this community/low-key social experiment. Even though it is dunderheaded and vexatious like all communities, it seems to occasionally get things right, batting somewhat above average in the proportion of its chatter that is useful and/or true and/or interesting.

Attempt 2: A rationalist is a person who is prepared to dedicate time to signalling that they have what the community agrees is rationality through a carefully calibrated balance of agreement and disagreement with prominent rationalists.

Maybe I like attempt 2 better as a definition. It’s a common mistake of neophyte rationalists to agree too readily with prestigious figures; the whole red-queen signalling race is people experimenting with robust norms of arguing. Agreeing too much is for n00bs.

To participate in this community feels slightly like participating in academia, but with a lower, or at least different, bar for entry.

Benefit: This avoids some false negatives. People who have non-academic careers can be involved and bring perspective from outside the ivory tower. It also leads to false positives, in that there is a lot of dross that one must sift through to find the helpful outsider perspectives. Many aspirational community members who are not as good at being clever as they are at adopting the trappings thereof remain chattering away in the name of free speech, openness, robustness of debate and so on. Academia does a somewhat better job at removing such persons, performance-managing them out of the ivory tower, or possibly shoving them out, or sticking them on a workplace safety committee with ill-defined scope, in perpetuity.

This differently-policed membership is a community design goal; they are explicit about aiming to include the viewpoints of people who are excluded from other online communities of discourse. And there are systems to surface higher-grade morsels from the thought gumbo. Vehicles such as LessWrong, for example, surface some quality content. Eventually. LessWrong’s bandwidth is still too much for me, though. I have too many journals and conferences to stay abreast of as it is, and I am confused about how people find time to engage productively with a whole ’nother vast body of semi-reviewed speculative writing. In this respect, the community very much resembles other scientific communities of practice.

Alternatively:

Attempt 3: A rationalist is someone who will explain to you that people are, on average, insufficiently concerned about smart machines taking over society.

2 As a cult

The rationalist path to reason has definite parallels with the induction process into a cult, although a cult wherein the inner truths you notionally unlock are distinguished in certain ways:

  1. They seem to be typically more empirically informed than, e.g., Scientology.
  2. They are less obviously intended to turn you into a foot soldier in a charismatic cult leader’s army.

Nice. But back to the parallels: there is a sacred text and much commentary by the learned. This is not a criticism, per se. This community is all about knowingly sticking levers into the weird crevices of the human mind, including the religious ones.

This is not my spot to evangelise for the rationalist community, which I am too cool to consider myself part of, because I only join communities with meta- or post- as their prefixes. But the rationalists do come up with some fun stuff, which I agree with the regulation 40% of, as handed down embossed on gold plates by Eliezer Yudkowsky.

3 As a community of practice

Some interesting things going on here, but I am possibly not the one to write the social history of rationalists.

It would also be interesting to investigate the fall of the original lesswrong.org, and the role of neoreactionaries in that process.

4 As a discipline

Jacob Falkovich believes that rationalism is a skillset/discipline that leads to its practitioners being different. He can also turn a phrase, so I will let him suppose what rationalism as a discipline might be:

Michael Vassar says that what Rationalists call “thinking” is treated by most people as a rare technical ability (“design thinking”) that normal people can only pretend to do. What they call “thinking” we call “being depressed and anxious”. This sounded crazy when I first heard it, but the more I mulled it over the more it made sense and explained much of what has been happening in the last year.

Social reality is what is normal, accepted, cool, predictable, expected, rewarded, agreed upon. Physical reality is what is out there determining the outcomes of physical experiments, such as whether you get COVID or not if you wear a mask. When Rationalists say “thinking” they usually mean something like “using your effortful system 2 to determine something about physical reality”. It’s what I try to do when writing posts about COVID. Swimming in social reality is best done on feeling and intuition, not “thought”.

My experience is spending perhaps 97% of my time in social reality, swimming along with everyone else. 3% of the time I notice some confusion, an unexpected mismatch between my predictions and what physical reality hits me with, and try to think through a solution. 3% is enough to notice the difference between the two modes and to be able to switch between them on purpose.

I don’t think that this experience is typical.

With Vassar in mind, my best guess of the typical experience is being in social reality 99.9% of the time. The 0.1% are extreme shocks, cases when physical reality kicks someone so far off-script they are forced to confront it directly. These experiences are extremely unpleasant, and processing them appears as “depression and anxiety”. One looks at the first opportunity to dive back into the safety of social reality, in the form of a communal narrative that “makes sense” of what happened and suggests an appropriate course of action.

I am not actually sold on this idea. Falkovich also argues, in Is Rationalist Self-Improvement Real?, for the effectiveness of trying to be more rational (in more areas than theory of mind). TODO: question his apparent assumption that casual commenters on rationality blogs are measurably more committed to rationality than drive-by commenters on any other site. That rationality blogs attract or cultivate more rational individuals is a hypothesis to test.

Julia Galef, in The Scout Mindset, makes a case that certain mindset habits can assist rationality; her version seems less crazy to me.

5 In the internet of dunks

Exactly like every other community online, the rationalist community labours under the burden of being judged by the self-selected, most attention-grabbingly grating of the people who claim to be part of it. For all the vaunted claims that the rationalists have fostered healthy norms of robust intellectual debate and so on, their comments are a mess just like everyone else’s. This is empirically verifiable. The degree of mess might be different, but the shining perfection that the internet expects from outgroups it is not.

It is a recommended experience to try to contribute to the discussion in the comment threads dangling from, say, some Scott Alexander article. One with the typical halo of erudition, one that hits all the right notes of making you feel smart because you had that “a-HA” feeling and nodded along to it. “Oh!” you might think to yourself, “this intellectual ship would be propelled further out into the oceans of truth if I stuck my oar in, and other people who are, as I, elevated enough to read this blog, they will see my cleverness in turn, and we will together row ourselves onward, like brains-vikings in an intellectual longboat.”

You might think that, possibly with a better metaphor if you are a superior person. But I’ll lay you odds of 5:2 (revised from 4:1) against anything fun happening the moment you put this to experimental test. More likely, upon sticking your oar in you will find yourself in the usual lightly-moderated internet dogpile of people straw-manning and talking past each other in their haste to enact the appearance of healthy norms of thoughtful, robust debate, mostly without the more onerous labour of doing thoughtful, robust debate. “Mostly” in which metric? For sure by word count, facile and vacuous verbiage predominates. By head-count, maybe the situation is less grim? The thing is, producing thoughtless twaddle is cheaper and easier per word than finely honed reasoning, and the typically hardline open-door comment policy requires that twaddle be given the benefit of the doubt.

Bitter corollary: Odds are not favourable that you are qualified to assess whether your own comment was twaddle.

Make no mistake, I think some useful and interesting debates have come out of card-carrying rationalists. Even, occasionally, from the comment threads, if you have time to surface the good ones amid all the facile value-signalling and people claiming other people’s value systems are religions. I doubt that the bloggers who host these blogs would themselves argue otherwise, or even find it surprising, but you know, it still seems to startle neophytes and journalists. The modest odds of a good debate are likely nonetheless better than the extremely tiny baseline odds elsewhere on the internet.

Figure 3: By Reza Farazmand

Occasionally I feel that rationalists set themselves up with a difficult task in this regard. The preponderance of reader surveys and comment engagement seems to indicate that rationalists are prone to regarding people who show up and fill out a survey as community members. This leads to a classic online social movement problem, which might be interpolated into Ben Sixsmith’s discussion of online-community-as-religion (substitutions in brackets):

Participation in online communities requires far less personal commitment than those of real life. And commitment has often cloaked hypocrisy. Men could play the role of God-fearing family men [rationalists] in public, for example, while cheating on their wives and abusing their kids [failing to participate in prediction markets]. Being a respectable member of their community depended, to a great extent, on being a family man [a rational man], but being a respectable member of online right-wing [rationalist] communities depends only on endorsing the concept.

Long story short, the rationalist corner of the internet is still full of social climbing, facile virtue signalling, trolling and general foolishness. If we insist on judging communities en masse though, which we do, the bar is presumably not whether they are all angels, or every community must be damned. We presumably care whether they do detectably better than the (toxic, atavistic) baseline. Perhaps this particular experiment attains the very best human beings can do on the internet. Perhaps this shambling nerdy quagmire has proportionally the highest admixture of quality thought possible from an open, self-policing community.

6 Rationalist dialect

effective altruism
Marginally data-informed marginalist charity.
update my priors

Ambiguous. Could mean

  1. I believe with high certainty that I am a strong Bayesian, or
  2. I am prepared to update my estimate of the plausibility of THIS argument in the light of new evidence, without holding to spurious certainty, or
  3. I am solving a problem in conditional probability and I just multiplied the prior distribution by the likelihood of the evidence and renormalized (rarely this one).
crux
A crux is a crucial ingredient in your belief. See Double-cruxing. Stating your cruxes is regarded as a good-faith rhetorical strategy.
empathies
There is a complicated vocabulary for compassion, sympathy and empathy to imply that you are being careful about it.
I could be wrong. If you think so, tell me. I would like to believe true things
I kinda like this disclaimer, variants of which appear on various blogs. It is a kind of compressed rationalist manifesto that some people are trialling as a way to get the discussion started on the right foot. I wonder how effective it is. It is probably more effective than claiming to be prepared to ‘update my priors’.
wamb
The opposite of nerd
cheems mindset
The mindset that leaves you scrambling for reasons why something just can’t be done (as opposed to an improving mentality).
Figure 4: SMBC on strict Bayesians

7 Casual use of Bayesian terminology

Figure 5: Two rationalists discovering the sigma-additivity axiom of probability measures

This used to annoy me. It has grown on me, insofar as the goal is to make explicit the idea of keeping multiple hypotheses around and acting under uncertainty. If people wish to imply that they are actually performing Bayesian inference, though, I am skeptical about most uses of this terminology. There is more to conducting Bayesian inference than invoking it, and it is also not enough merely to do Bayesian inference in an open world. I have my eye on you, aspiring rationalists, and I am wearing my language-police badge.
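For contrast with the colloquial usage, here is roughly what sense 3 of “update my priors” above commits you to: a minimal sketch of a discrete Bayes update over three hypotheses, with entirely made-up numbers.

```python
import numpy as np

# Three competing hypotheses with made-up prior plausibilities.
prior = np.array([0.5, 0.3, 0.2])

# Likelihood of the observed evidence under each hypothesis (also made up).
likelihood = np.array([0.1, 0.4, 0.7])

# Bayes' rule: multiply prior by likelihood, then renormalize.
posterior = prior * likelihood
posterior /= posterior.sum()

print(posterior.round(3))  # [0.161 0.387 0.452]
```

The posterior mass shifts toward whichever hypotheses predicted the evidence best; none of the colloquial usages above commit the speaker to this arithmetic, which is rather the point.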

8 Slate Star Codex kerfuffle


A concrete example, with actual names and events, is illustrative. Here are some warm takes on a high-profile happening.

There is some invective on the theme of renewing journalism more broadly.

The actual story, IMO, if we ignore the particulars of the effects upon these protagonists for a moment, is that the internet of dunks grates against the internet of debate, if the latter is even a thing outside our imaginations.

That said, N-Gate’s burn was funny to me.

just because the comments section of some asshole’s blog happens to be a place where technolibertarians cross-pollinate with white supremacists, says Hackernews, doesn’t mean it’s fair to focus on that instead of on how smart that blog’s readership has convinced itself it is. So smart, in fact, that to criticize them at all is tantamount to an admission that you’re up to something. This sort of censorship, concludes Hackernews, should never have been allowed to be published.

9 Demography of the rationalists

Extremely interesting, to me at least. TODO: track down references to the make-up of the community in terms of the unusual prevalence of transgender people, polyamorous people, depressed people, ethnically (less often religiously) Jewish people… Also interesting: political diversity. Less interesting but notable: preponderance of males, prevalence of autism spectrum disorders.

10 Fiction, especially fan fiction

Oh yes, that is a thing. Backstory here or here.

11 Effectiveness

From Chinese Businessmen: Superstition Doesn’t Count:

…there are two commonly understood forms of rationality, and LessWrong is mostly concerned with only one of them. The two forms are:

  • Epistemic rationality — how do you know that your beliefs are true?
  • Instrumental rationality — how do you make better decisions to achieve your goals?

Jonathan Baron calls the first form—epistemic rationality — “thinking about beliefs”. He calls the second “thinking about decisions”.

LessWrong has concentrated most of its efforts on epistemic rationality. The vast majority of writing on the site focuses its attention on common cognitive biases and failures of human thinking, and discusses methods for overcoming them. In other words, LessWrong’s community of rationality practitioners desire the ability to hold accurate and true beliefs about the world, and believe that doing so will enable them to achieve success in their lives and in pursuit of their goals.

… My current bet, however, is that this simply can’t be true. LessWrong’s decade of existence, and my experience with traditional Chinese businessmen, suggests to me that instrumental rationality is the thing that dominates when it comes to success in business and life. It suggests to me that if you’re instrumentally rational, you don’t need to optimise for correct and true beliefs to succeed. You merely need a small set of true beliefs, related to your field; these beliefs can be determined from trial and error itself.


12 In Australia

See rationalists in australia.

13 References

Bright. 2022. “Neo-Rationalism.”
Lewis-Kraus. n.d. “Slate Star Codex and Silicon Valley’s War Against the Media.” The New Yorker.
Metz. 2021. “Silicon Valley’s Safe Space.” The New York Times.