Dunning-Kruger theory of mind

On anthropomorphising humans

June 18, 2020 — February 24, 2024

bounded compute
cooperation
culture
how do science
incentive mechanisms
learning
mind
snarks
wonk

Content warning:

Complicated questions where we would all prefer simple answers

This piece’s title is a reference to the title of the influential paper Unskilled and unaware of it (Kruger and Dunning 1999). In common parlance, what I want to evoke is knowing just enough to be dangerous but not enough to know your limits, which is not quite the same thing that the body of the paper discusses. Interpretation of the Dunning-Kruger effect model is a subtle affair which goes somewhat too far into the weeds for our current purposes; but see below.

A collection of notes on our known-terrible ability to know our terrible inability to know about others and ourselves. Was that hard to parse? What I mean is that we seem to reliably

  1. think we know more about how people’s minds work than we do in fact know, and
  2. fail to learn that we do not know how people’s minds work.

To put it another way:

Being a smart, perceptive, reasonable person is exactly what being a prejudiced, sanctimonious ignoramus feels like from the inside.

I am especially interested in why we seem to be so terribly myopic at social perception despite being social creatures.

I do not attempt to document every way in which our intuitions about the world are defective (after all, were we born knowing all there was to know, we would be gods). Rather, I would like to collect under the rubric Dunning-Kruger theory of mind my favourite ways in which we reliably, regularly get other people and ourselves wrong, and fail to notice that we have got it wrong, in ways which are important to address for the ongoing health and survival of our civilisation.

Biases and failures which fit in this category are depressingly diverse: pluralistic ignorance, the out-group homogeneity effect, the idiosyncratic rater effect, and many other biases I do not yet know names for. I’ll add more as they seem relevant.

Related: Stuff we learn as kids that we fail to unlearn as adults.

Now, a catalogue of theory-of-mind failings that arise in life that I would like to take into account, with relevant picks from the Buster Benson summary, A comprehensive guide to cognitive biases. There are many, many more.

Recommended for a fun general introduction to this topic: David McRaney’s You Are Not So Smart (McRaney 2012), which has an excellent accompanying podcast with some updated material.

2 Dunning-Krugering Dunning-Kruger

Is the Dunning-Kruger effect itself real? Maybe. Dan Luu, Dunning-Kruger and Other Memes:

A pop-sci version of Dunning-Kruger, the most common one I see cited, is that, the less someone knows about a subject, the more they think they know. Another pop-sci version is that people who know little about something overestimate their expertise because their lack of knowledge fools them into thinking that they know more than they do. The actual claim Dunning and Kruger make is much weaker than the first pop-sci claim and, IMO, the evidence is weaker than the second claim.

The whole story is indeed more nuanced and contingent than the one you might pick up from the 10-word summary, and may be wrong, but that is a whole other thing.
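
Much of the fragility is statistical. Below is a minimal simulation sketch in Python (my own toy, not from Luu’s post; the noise scales are assumptions) in which test scores and self-assessments are two independently noisy readings of the same underlying skill. Binning by measured quartile then reproduces the famous “unskilled overestimate, skilled underestimate” plot through regression to the mean alone, with no metacognitive deficit anywhere in the model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy model: one latent skill, observed twice with independent noise.
skill = rng.normal(size=n)
test_score = skill + rng.normal(scale=1.0, size=n)  # what the quiz measures
self_view = skill + rng.normal(scale=1.0, size=n)   # what people report

def to_percentile(x):
    """Rank-transform to a 0-100 percentile, as in the original plots."""
    return x.argsort().argsort() / (len(x) - 1) * 100

test_pct = to_percentile(test_score)
self_pct = to_percentile(self_view)

# Mean self-assessment within each quartile of measured performance.
quartile = np.minimum(test_pct // 25, 3).astype(int)
for q in range(4):
    in_q = quartile == q
    print(f"quartile {q + 1}: actual {test_pct[in_q].mean():5.1f}, "
          f"self-assessed {self_pct[in_q].mean():5.1f}")
```

The bottom quartile “overestimates” and the top quartile “underestimates” even though every simulated agent is equally (un)self-aware; any metacognitive story has to explain what is left over after this artefact.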

This is an interesting case study in social psychology itself: does having social psychology to hand even help us with this stuff?

3 On evaluating others

The feedback fallacy:

The first problem with feedback is that humans are unreliable raters of other humans. Over the past 40 years psychometricians have shown in study after study that people don’t have the objectivity to hold in their heads a stable definition of an abstract quality, such as business acumen or assertiveness, and then accurately evaluate someone else on it. Our evaluations are deeply colored by our own understanding of what we’re rating others on, our own sense of what good looks like for a particular competency, our harshness or leniency as raters, and our own inherent and unconscious biases. This phenomenon is called the idiosyncratic rater effect, and it’s large (more than half of your rating of someone else reflects your characteristics, not hers) and resilient (no training can lessen it). In other words, the research shows that feedback is more distortion than truth.
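
To make the “more than half” claim concrete, here is a toy variance decomposition in Python. The variance split is my assumption, chosen to roughly match the quoted effect size rather than estimated from any real rating data: each rating is rater leniency plus ratee quality plus noise, and we ask how much of a rating’s spread is down to who is doing the rating:

```python
import numpy as np

rng = np.random.default_rng(1)
n_raters, n_ratees = 200, 200

# Assumed components of a single rating (scales are illustrative only).
rater_bias = rng.normal(scale=1.0, size=(n_raters, 1))     # harshness, leniency, pet theories
ratee_quality = rng.normal(scale=0.7, size=(1, n_ratees))  # the thing we wanted to measure
noise = rng.normal(scale=0.5, size=(n_raters, n_ratees))

ratings = rater_bias + ratee_quality + noise

# Averaging each rater's scores over all ratees washes out quality and noise,
# leaving (approximately) the rater's own idiosyncratic offset.
rater_component = ratings.mean(axis=1).var()
total = ratings.var()
print(f"share of rating variance due to the rater: {rater_component / total:.0%}")
```

With these made-up scales the rater accounts for just under 60% of the variance; the point of the quoted research is that real workplace ratings look like this.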

The real question is why individuals often try to avoid feedback that provides them with more accurate knowledge of themselves.

The Selective Laziness of Reasoning (Trouche et al. 2016):

many people will reject their own arguments—if they’re tricked into thinking that other people proposed them. […]

By “selective laziness”, Trouche et al. are referring to our tendency to only bother scrutinizing arguments coming from other people who we already disagree with. To show this, the authors first got Mturk volunteers to solve some logic puzzles (enthymematic syllogisms), and to write down their reasons (arguments) for picking the answer they did. Then, in the second phase of the experiment, the volunteers were shown a series of answers to the same puzzles, along with arguments supporting them. They were told that these were a previous participant’s responses, and were asked to decide whether or not the “other volunteer’s” arguments were valid. The trick was that one of the displayed answers was in fact one of the participant’s own responses that they had written earlier in the study. So the volunteers were led to believe that their own argument was someone else’s. It turned out that almost 60% of the time, the volunteers rejected their own argument, and declared that it was wrong. They were especially likely to reject it when they had, in fact, been wrong the first time.

This is a kind of hopeful one, when you think about it. It means that we are better at identifying the flaws in each other’s arguments than our own. If only we could normalise critical feedback, and not somehow turn it into blow-up flame wars, we might find that collective reasoning is more powerful than individual reasoning. Indeed, is this precisely how science works?
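
As a toy illustration of that hope (all the probabilities below are invented, though the shape of the model is loosely in the spirit of Trouche, Sander, and Mercier (2014) on reasoning groups): suppose individuals rarely solve a puzzle alone, but are good at recognising a correct argument once somebody else offers it.

```python
import numpy as np

rng = np.random.default_rng(2)
trials, group_size = 100_000, 4
p_solo = 0.4  # assumed chance an individual solves the puzzle alone
p_vet = 0.9   # assumed chance a correct argument is recognised when someone ELSE makes it

solo = rng.random(trials) < p_solo

# The group solves the puzzle if at least one member hits the right answer
# AND the others, scrutinising somebody else's argument, recognise it as valid.
someone_right = (rng.random((trials, group_size)) < p_solo).any(axis=1)
vetted = rng.random(trials) < p_vet
group = someone_right & vetted

print(f"individual accuracy: {solo.mean():.2f}")   # ~0.40
print(f"group accuracy:      {group.mean():.2f}")  # ~0.78
```

Selective laziness is a bug for the arguer and a feature for the audience.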

4 On erroneously thinking my experience is universal

A.k.a. denying the reality of lived experience.

TBC. How do we mediate between our own experience and the experience of others when both are faulty representations of reality? How do we maintain both humility and scepticism at once? For bonus points, how do we do that even across a cultural divide?

5 On erroneously thinking my experience is not universal

TODO: write about out-group homogeneity bias, and other biases that lead me to believe that other groups are unlike my group in spurious ways. The out-group homogeneity bias is the one that says, e.g., those people are all the same, not marvellously diverse like us.

This one seems ubiquitous in my personal experience, by which I mean both that I experience it all the time in my own thoughts, and that other people seem very ready to claim that some outgroup is not only like that but that they are all like that.

TBC.

6 On understanding how others think

TODO: raid the following for references: Why You’re Constantly Misunderstood on Slack (and How to Fix It).

A connection to the status literature: People of higher status are more likely to think that those who disagree with them are stupid or biased — even when their high status is the result of a random process (Brown-Iannuzzi et al. 2021).

The researchers first analyzed data from 2,374 individuals who participated in the 2016 American National Election Studies Time Series Survey, a nationally representative survey of U.S. citizens. As expected, liberals and conservatives were more likely to describe the opposing political party as uninformed, irrational, and/or biased compared to their own party.

Importantly, the researchers found that this was especially true among those with a higher socio-economic status. Among more liberal participants, higher status individuals displayed more naive realism toward Republicans. Among more conservative participants, higher status individuals displayed more naive realism toward Democrats.

In a follow-up experiment, the researchers experimentally manipulated people’s sense of status through an investment game. The study of 252 participants found that those who were randomly told they had performed “better than 89% of all players to date” were more likely to say that people who disagreed with their investment advice were biased and incompetent.

For ages my favourite go-to bias to think on here was the fundamental attribution error, which seems ubiquitous to me.

In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the tendency for people to under-emphasize situational explanations for an individual’s observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior. This effect has been described as “the tendency to believe that what people do reflects who they are”.

Apparently it has been called into question (Epstein and Teraspulsky 1986; Malle 2006). TODO: investigate.

Naive realism is also a phenomenon of note in this category (Gilovich and Ross 2015; Ross and Ward 1996). David McRaney’s summary:

[N]aive realism also leads you to believe you arrived at your opinions, political or otherwise, after careful, rational analysis through unmediated thoughts and perceptions. In other words, you think you have been mainlining pure reality for years, and like Gandalf studying ancient texts, your intense study of the bare facts is what has naturally led to your conclusions.

Ross says that since you believe you are in the really-real, true reality, you also believe that you have been extremely careful and devoted to sticking to the facts and thus are free from bias and impervious to persuasion. Anyone else who has read the things you have read or seen the things you have seen will naturally see things your way, given that they’ve pondered the matter as thoughtfully as you have. Therefore, you assume, anyone who disagrees with your political opinions probably just doesn’t have all the facts yet. If they had, they’d already be seeing the world like you do. This is why you continue to ineffectually copy and paste links from all our most trusted sources when arguing your points with those who seem misguided, crazy, uninformed, and just plain wrong. The problem is, this is exactly what the other side thinks will work on you.

7 If we were machines, would we understand other machines this badly?

TBC. Consider minds as machine learners and see what that would tell us about learners that learn to model learners. Possibly that means predictive coding.
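
As a stub for that thought, here is a classic minimal construction (fictitious play in matching pennies, a textbook game-theory example rather than anything from the predictive-coding literature): two learners each keep a running model of the other and best-respond to it. Because each learner’s model is of a system that is itself learning, the target never stops moving.

```python
import numpy as np

# Matching pennies: A wins by matching B's action, B wins by mismatching.
# Each player models the other via empirical counts and best-responds.
T = 2_000
counts_a = np.ones(2)  # A's model of B: pseudo-counts of B's past actions
counts_b = np.ones(2)  # B's model of A
matches = 0

for t in range(T):
    a = int(np.argmax(counts_a))  # A plays what it thinks B will play
    b = int(np.argmin(counts_b))  # B plays what it thinks A won't play
    matches += (a == b)
    counts_a[b] += 1              # each updates its model of the other
    counts_b[a] += 1

print("A's model of B:", counts_a / counts_a.sum())  # drifts toward 50/50
print("B's model of A:", counts_b / counts_b.sum())
print(f"A matched B {matches / T:.0%} of the time")  # ~50%: neither out-models the other
```

Neither learner ever holds a model that stays right for long, which feels like a decent cartoon of Dunning-Kruger theory of mind.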

8 Remedies

CFAR, putting money on the line, calibration. That example about set theory and cards. TBC.
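
Calibration, at least, is checkable. A minimal sketch in Python (all numbers invented): score probability forecasts with the Brier score, a proper scoring rule under which the honest-about-uncertainty forecaster beats the overconfident one, which is the logic behind putting money on the line:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
p_true = rng.uniform(size=n)      # true probability of each event (invented)
outcome = rng.random(n) < p_true  # what actually happened

calibrated = p_true                                 # reports honest probabilities
overconfident = np.where(p_true > 0.5, 0.95, 0.05)  # rounds everything to near-certainty

def brier(forecast, outcome):
    """Mean squared error of a probability forecast; lower is better."""
    return np.mean((forecast - outcome) ** 2)

print(f"calibrated:    {brier(calibrated, outcome):.3f}")    # ~0.167
print(f"overconfident: {brier(overconfident, outcome):.3f}") # ~0.23
```

Because the scoring rule is proper, no amount of bravado improves the score; the only way to do better is to actually know more.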

9 On evaluating ourselves

I cannot even.

10 Moral wetware

See moral wetware.

11 Incoming

12 References

Acemoglu, Chernozhukov, and Yildiz. 2006. “Learning and Disagreement in an Uncertain World.” Working Paper 12648.
Argyle. 2013. Lay Theories: Everyday Understanding of Problems in the Social Sciences.
Atir, Wald, and Epley. 2022. “Talking with Strangers Is Surprisingly Informative.” Proceedings of the National Academy of Sciences.
Baldassarri, and Grossman. 2013. “The Effect of Group Attachment and Social Position on Prosocial Behavior. Evidence from Lab-in-the-Field Experiments.” Edited by Angel Sánchez. PLoS ONE.
Belmi, and Laurin. 2016. “Who Wants to Get to the Top? Class and Lay Theories about Power.” Journal of Personality and Social Psychology.
Brewer. 1993. “Social Identity, Distinctiveness, and In-Group Homogeneity.” Social Cognition.
Brown-Iannuzzi, Lundberg, Kay, et al. 2021. “A Privileged Point of View: Effects of Subjective Socioeconomic Status on Naïve Realism and Political Division.” Personality and Social Psychology Bulletin.
Bursztyn, and Yang. 2022. “Misperceptions About Others.” Annual Review of Economics.
Byron. 2008. “Carrying Too Heavy a Load? The Communication and Miscommunication of Emotion by Email.” The Academy of Management Review.
Couzin, Ioannou, Demirel, et al. 2011. “Uninformed Individuals Promote Democratic Consensus in Animal Groups.” Science.
Epley, and Schroeder. 2014. “Mistakenly Seeking Solitude.” Journal of Experimental Psychology: General.
Epstein, and Teraspulsky. 1986. “Perception of Cross-Situational Consistency.” Journal of Personality and Social Psychology.
Gilovich, and Ross. 2015. The Wisest One in the Room: How You Can Benefit from Social Psychology’s Most Powerful Insights.
Guerra-Pujol. 2014. “Domestic Violence, Strategic Behavior, and Ideological Rent-Seeking.” SSRN Scholarly Paper ID 915929.
Jecker, and Landy. 1969. “Liking a Person as a Function of Doing Him a Favour.” Human Relations.
Kruger, and Dunning. 1999. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology.
Kruger, Epley, Parker, et al. 2005. “Egocentrism over E-Mail: Can We Communicate as Well as We Think?” Journal of Personality and Social Psychology.
Lambert, Kahn, and Apple. 2003. “Pluralistic Ignorance and Hooking Up.” The Journal of Sex Research.
Laubert, and Parlamis. 2019. “Are You Angry (Happy, Sad) or Aren’t You? Emotion Detection Difficulty in Email Negotiation.” Group Decision and Negotiation.
Levy, Chiu, and Hong. 2006. “Lay Theories and Intergroup Relations.” Group Processes & Intergroup Relations.
Lilienfeld, Ammirati, and David. 2012. “Distinguishing Science from Pseudoscience in School Psychology: Science and Scientific Thinking as Safeguards Against Human Error.” Journal of School Psychology.
Malle. 2006. “The Actor-Observer Asymmetry in Attribution: A (Surprising) Meta-Analysis.” Psychological Bulletin.
McRaney. 2012. You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself.
Moral Sentiments and Material Interests: The Foundations of Cooperation in Economic Life. 2006.
Murphy. 2020. Markets Against Modernity: Ecological Irrationality, Public and Private. Capitalist Thought: Studies in Philosophy, Politics, and Economics.
O’Gorman, and Garry. 1976. “Pluralistic Ignorance—A Replication and Extension.” Public Opinion Quarterly.
Prentice, and Miller. 1996. “Pluralistic Ignorance and the Perpetuation of Social Norms by Unwitting Actors.” In Advances in Experimental Social Psychology.
Richerson, Boyd, and Henrich. 2003. “Cultural Evolution of Human Cooperation.” Genetic and Cultural Evolution of Cooperation.
Ross, and Ward. 1996. “Naive Realism in Everyday Life: Implications for Social Conflict and Misunderstanding.” In Values and Knowledge.
Simonovits, Kézdi, and Kardos. 2017. “Seeing the World Through the Other’s Eye: An Online Intervention Reducing Ethnic Prejudice.” American Political Science Review.
Stephens-Davidowitz. 2022. Don’t Trust Your Gut: Using Data to Get What You Really Want in Life.
Trouche, Johansson, Hall, et al. 2016. “The Selective Laziness of Reasoning.” Cognitive Science.
Trouche, Sander, and Mercier. 2014. “Arguments, More Than Confidence, Explain the Good Performance of Reasoning Groups.” SSRN Scholarly Paper ID 2431710.
Watts. 2011. Everything Is Obvious: Once You Know the Answer.
———. 2014. “Common Sense and Sociological Explanations.” American Journal of Sociology.
Willer, Kuwabara, and Macy. 2009. “The False Enforcement of Unpopular Norms.” American Journal of Sociology.
Wilson, and English. 2017. “The Motivated Fluidity of Lay Theories of Change.” In The Science of Lay Theories: How Beliefs Shape Our Cognition, Behavior, and Health.
Wolfe, and Williams. 2018. “Poor Metacognitive Awareness of Belief Change.” Quarterly Journal of Experimental Psychology.
Zedelius, Müller, and Schooler. 2017. The Science of Lay Theories: How Beliefs Shape Our Cognition, Behavior, and Health.