Anti-TESCREALists

Post-rationalism for non-post-rationalists

October 1, 2024 — November 29, 2024

adversarial
economics
faster pussycat
innovation
language
machine learning
mind
neural nets
NLP
security
technology
Figure 1: Go on, buy the sticker

Attention conservation notice: Coverage of a recent battle in the forever culture war. You might want to skip it unless you enjoy culture wars or are caught up in one of this one’s touch-points, such as AI risk or Effective Altruism; alternatively, read Ozy Brennan’s more tightly-argued The “TESCREAL” Bungle instead.

Since I wrote this, the article that the Torres piece was written about has been published (Gebru and Torres 2024). It has a clearer thesis than the Torres article, arguing that the central theme of the ‘bundle’ is not longtermism but rather eugenics. It also constructs the bundle in a rather different and, IMO, more coherent way, albeit one that still does not convince me.

The main highlight for me is an interesting argument about blame evasion which seems to me to be mostly independent of the TESCREAL definition, and which I am generally sympathetic to.

If I had to recommend one article, it would be that newer one; while I still find many things to disagree with, at least I can work out what they are.

An article by Émile P. Torres has gone viral in my circles recently, denouncing TESCREALism, i.e. Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism and Longtermism.

There is a lot going on in the article. In fact, there is too much going on. I am not really sure what the main thrust is, content-wise.

In terms of pragmatics, the article probably most successfully acts to found an anti-TESCREAList movement. In the course of doing so, it leans upon many vibes-based arguments.

I think the main thrust of criticism might be that some flavours of longtermism lead to unpalatable conclusions, including excessive worry about AI x-risk at the expense of the currently living. While making that argument, it frames several online communities that have entertained various longtermist ideas as a “bundle”, which I assume is meant to imply that these groups form a political bloc which encourages or enables accelerationist hypercapitalism. Is that definition the main thrust? Is it that longtermism is bad? Is it that accelerationist, hypercapitalist movements are coordinating? It’s not clear to me. Not that I am one to criticise from a superior position; my own blog is notoriously full of half-finished thoughts awaiting structuring into cogent argument. In my defence, I don’t claim those are position-pieces. Maybe I should think of the Torres piece as a notebook rather than an article.

Figure 2

The article lists some philosophical stances I also criticise. But the main deal seems to hinge upon an argument of Torres’ which I do not buy, one with a large serve of “all those others over there are alike in their malignancy”, the others in question being TESCREALists.

I am not a fan of TESCREALism as a term, in that I don’t think it is a useful or credible category of analysis. Despite the lumpy, gerrymandered feel, it does seem to have gotten modest traction in discourse.

That observation leads me to wonder how much mileage I myself can get out of lumping together all the movements that have rubbed me the wrong way in the past into a single acronym. (“Let me tell you about NIMBYs, coal magnates, liberal scolds, and three-chord punk bands, and how they are all part of the same bundle of malign patsy ideologies, which I call NICOLS3CPism.”)

With this in mind, you will probably not be surprised to hear me assert that Torres’ linked TESCREAL article leans on genealogical arguments, mostly guilt by association. Among the movements they name that I know well, there is no single view on the topic of AI x-risk, longtermism, or the future in general, nor do they share, say, a consistent utilitarian stance, nor a consistent interpretation of utilitarianism when they are utilitarian. We could draw a murder-board upon which key individuals in each of the philosophies are connected by red string, but it doesn’t seem to be a natural category in any strong sense to me, any more than Axis Of Evil or NICOLS3CP.

That said, just because the associations do not seem meaningful to me does not mean there is nothing to criticise in the various movements. Annoyingly, that article does not clearly identify which arguments in particular the author thinks are deficient in the ‘bundle’.

Like Torres, I would take issue with reasoning like this if I saw it in the wild [citation needed]:

the biggest tragedy of an AGI apocalypse wouldn’t be the 8 billion deaths of people now living. This would be bad, for sure, but much worse would be the nonbirth of trillions and trillions of future people who would have otherwise existed.

On the other hand, if the action to take to prevent an AI apocalypse is the same either way, do we care? Is taking AI risk seriously the bad bit? Or is it the details of the moral justification that we are supposed to be worried about?

The author seems generally exercised about longtermist themes, e.g. how to trade off the needs of people living now against those of people yet to be born. There are for sure some weird outcomes from some of the longtermist thought experiments.

If I disagree with some school of longtermism, why not just say I disagree with it, without bringing in this bundle? Better yet, why not mention which of the many longtermisms I am worried about, and rebut a specific argument they make?

The muddier strategy of the article, disagreeing-with-longtermism-plus-feeling-bad-vibes-about-various-other-movements-and-philosophies-that-have-a-diverse-range-of-sometimes-tenuous-relationships-with-longtermism, doesn’t feel to me like it makes the other half of the article, the half which invents TESCREALism, do useful work.

I saw this guilt-by-association play out in public discourse previously with “neoliberalism”, and probably the criticisms of the “woke” “movement” are doing the same work. Since reading that article, I have become worried that I am making the same mistake myself when talking about neoreactionaries. As such, I am grateful to the author for making me interrogate my own prejudices, although I suspect that, if anything, I have been shifted in the opposite direction to the one they intended.

Don’t get me wrong, it is important to note what uses are made of philosophies by movements. Further, movements are hijacked by bad actors all the time (which is to say, actors whose ends may have little to do with the stated goals of the movement), and it is important to be aware of that. But once again, what are we doing by lumping a bunch of movements together? Maybe I need to see the red string on this murder-board and I will be persuaded. Until then, gerrymandering them together seems suspect to me.

If “TESCREALists” are functioning as a bloc, then… by all means, analyse this. I think that some signatories to some components of the acronym do indeed function as a bloc from time to time (cf. rationalists and effective altruists).

Broadly, however, I am not convinced there is a movement to hijack in the acronym. Cosmism and Effective Altruism are not in correspondence with each other, not least because all the Cosmists are dead AFAIK.

To be clear, I think there are terribly dangerous philosophies in the various movements name-checked. Some flavours of longtermism (e.g. the ones that apparently allow an unlimited budget of suffering now against badly quantified futures) also seem undercooked to me.

That said, if the goal is to have a banner to rally under, complaining about TESCREALists is one, a Schelling point for the “anti-TESCREAL movement”. That might actually be a coherent thing, or at least a more coherent thing than the thing it is criticising. And why not? Uniting against a common enemy might be more important than the enemy existing. There are many implied demands that the anti-TESCREALists seem to make, about corporate accountability, about equity for workers involved in the AI industry, and against tech-accelerationism. Some of these demands are ones I make myself. I am not sure that any of them are dependent upon the TESCREAL bundle bundling, or shadowy cabals coordinating, or the existence of an underlying coherent opposed philosophy.

1 References