Dunning-Kruger theory of mind

On anthropomorphising humans



A collection of notes on our known-terrible ability to know our terrible inability to know about others and ourselves. Was that hard to parse? What I mean is that we seem to reliably

  1. think we know more about how people’s minds work than we do in fact know, and
  2. fail to learn that we do not know how people’s minds work.

To put it another way:

Being a smart, perceptive, reasonable person is exactly what being a prejudiced, sanctimonious ignoramus feels like from the inside.

Or another way:

Are we terrible at seeing the social element of our brains?

I do not attempt to document every way in which our intuitions about the world are defective (after all, were we born knowing all there was to know, we would be gods). Rather, I would like to collect under the rubric Dunning-Kruger theory of mind my favourite ways in which we reliably, regularly get other people and ourselves wrong, and fail to notice that we have got it wrong, in ways which are important to address for the ongoing health and survival of our civilisation.

Biases and failures which fit in this category are depressingly diverse: pluralistic ignorance, the out-group homogeneity effect, the idiosyncratic rater effect, and many other biases I do not yet know names for. I’ll add more as they seem relevant.

Related: Stuff we learn as kids that we fail to unlearn as adults.

Relevant picks from the Buster Benson summary, A comprehensive guide to cognitive biases:

Terminology note: The title here is a reference to the title of the influential paper Unskilled and unaware of it (Kruger and Dunning 1999). In common parlance, what I want to evoke is knowing just enough to be dangerous but not enough to know your limits.1

Here is a catalogue of theory-of-mind failings that arise in life that I would like to take into account. There are many, many more.

A list of popular cognitive biases. Note that not all of these are theory-of-mind biases. Don’t even get me started on all the other handicaps we have when it comes to facing reality. Design: John Manoogian III; categories and descriptions: Buster Benson; implementation: TilmannR; CC BY-SA 4.0, via Wikimedia Commons. Get a printed version at Design Hacks.

For a fun general introduction to this topic I recommend David McRaney’s You Are Not So Smart (McRaney 2012), which has an excellent accompanying podcast with some updated material.

On evaluating others

The feedback fallacy:

The first problem with feedback is that humans are unreliable raters of other humans. Over the past 40 years psychometricians have shown in study after study that people don’t have the objectivity to hold in their heads a stable definition of an abstract quality, such as business acumen or assertiveness, and then accurately evaluate someone else on it. Our evaluations are deeply colored by our own understanding of what we’re rating others on, our own sense of what good looks like for a particular competency, our harshness or leniency as raters, and our own inherent and unconscious biases. This phenomenon is called the idiosyncratic rater effect, and it’s large (more than half of your rating of someone else reflects your characteristics, not hers) and resilient (no training can lessen it). In other words, the research shows that feedback is more distortion than truth.

The real question is why individuals often try to avoid feedback that provides them with more accurate knowledge of themselves.
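The size claim in the quoted passage (more than half of a rating reflects the rater, not the ratee) is easy to get a feel for with a toy simulation. Everything here is assumed for illustration — the effect-size numbers are made up, not taken from the psychometric literature — but the variance decomposition is the real mechanism: if rater idiosyncrasy contributes more variance than ratee quality, most of a rating is about the rater.

```python
import random

random.seed(0)

n_raters, n_ratees = 200, 200

# Latent "true" quality of each ratee, plus each rater's idiosyncratic
# offset (their private definition of the trait, harshness, etc.).
# The standard deviations are assumptions chosen to make the rater
# term dominate, per the idiosyncratic-rater-effect claim.
quality = [random.gauss(0, 1.0) for _ in range(n_ratees)]
rater_bias = [random.gauss(0, 1.2) for _ in range(n_raters)]

# Every rater scores every ratee.
ratings = [[quality[j] + rater_bias[i] + random.gauss(0, 0.5)
            for j in range(n_ratees)] for i in range(n_raters)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

flat = [r for row in ratings for r in row]
total_var = variance(flat)

# Variance explained by who is being rated: variance of ratee means.
ratee_means = [sum(ratings[i][j] for i in range(n_raters)) / n_raters
               for j in range(n_ratees)]
share_ratee = variance(ratee_means) / total_var
print(f"share of rating variance due to the ratee: {share_ratee:.0%}")
# With these assumed effect sizes, well under half of the rating
# variance reflects the person being rated.
```

Averaging over many raters recovers the ratee signal, which is why a single piece of feedback is mostly noise about you and partly signal about them.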

The Selective Laziness of Reasoning (Trouche et al. 2016):

many people will reject their own arguments—if they’re tricked into thinking that other people proposed them. [...]

By “selective laziness”, Trouche et al. are referring to our tendency to only bother scrutinizing arguments coming from other people who we already disagree with. To show this, the authors first got Mturk volunteers to solve some logic puzzles (enthymematic syllogisms), and to write down their reasons (arguments) for picking the answer they did. Then, in the second phase of the experiment, the volunteers were shown a series of answers to the same puzzles, along with arguments supporting them. They were told that these were a previous participant’s responses, and were asked to decide whether or not the “other volunteer’s” arguments were valid. The trick was that one of the displayed answers was in fact one of the participant’s own responses that they had written earlier in the study. So the volunteers were led to believe that their own argument was someone else’s. It turned out that almost 60% of the time, the volunteers rejected their own argument, and declared that it was wrong. They were especially likely to reject it when they had, in fact, been wrong the first time.

This is a kind of hopeful one, when you think about it. It means that we are better at identifying the flaws in each other’s arguments than in our own. If only we could normalise critical feedback without it somehow blowing up into flame wars, we might find that collective reasoning is more powerful than individual reasoning. Indeed, is this precisely how science works?
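That hopeful reading can be made concrete with a toy model. This is my own sketch, not a model from Trouche et al.: assume each person solves a puzzle correctly 60% of the time (a made-up number), and assume, optimistically, that when a pair disagrees the correct argument wins the exchange (Trouche, Sander, and Mercier 2014 suggest good arguments do tend to win in groups).

```python
import random

random.seed(1)

P_CORRECT = 0.6   # assumed individual accuracy on a logic puzzle
TRIALS = 100_000

solo_hits = pair_hits = 0
for _ in range(TRIALS):
    a = random.random() < P_CORRECT
    b = random.random() < P_CORRECT
    solo_hits += a
    # Optimistic pair rule: if either member is right,
    # cross-examination of the other's argument lets the
    # correct answer carry the day.
    pair_hits += (a or b)

print(f"solo accuracy: {solo_hits / TRIALS:.2f}")
print(f"pair accuracy: {pair_hits / TRIALS:.2f}")
```

Under these assumptions a pair is right about 84% of the time versus 60% for individuals; the whole gain comes from scrutinising someone else’s reasoning, which is exactly the faculty selective laziness reserves for other people.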

On erroneously thinking my experience is universal

A.k.a. denying the reality of lived experience.

TBC. How do we mediate between our own experience and the experience of others when both are faulty representations of reality? How do we maintain both humility and scepticism at once? For bonus points, how do we do that even across a cultural divide?

On erroneously thinking my experience is not universal

TBD. Out-group effects.

On understanding how others think

Todo: raid this for references: Why You’re Constantly Misunderstood on Slack (and How to Fix It).

A connection to the status literature: People of higher status are more likely to think that those who disagree with them are stupid or biased — even when their high status is the result of a random process. (Brown-Iannuzzi et al. 2020)

The researchers first analyzed data from 2,374 individuals who participated in the 2016 American National Election Studies Time Series Survey, a nationally representative survey of U.S. citizens. As expected, liberals and conservatives were more likely to describe the opposing political party as uninformed, irrational, and/or biased compared to their own party.

Importantly, the researchers found that this was especially true among those with a higher socio-economic status. Among more liberal participants, higher status individuals displayed more naive realism toward Republicans. Among more conservative participants, higher status individuals displayed more naive realism toward Democrats.

In a follow-up experiment, the researchers experimentally manipulated people’s sense of status through an investment game. The study of 252 participants found that those who were randomly told they had performed “better than 89% of all players to date” were more likely to say that people who disagreed with their investment advice were biased and incompetent.

For ages my favourite go-to bias to think on here was the fundamental attribution error, which seems ubiquitous to me.

In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the tendency for people to under-emphasize situational explanations for an individual’s observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior. This effect has been described as “the tendency to believe that what people do reflects who they are”.

Apparently it has been put in doubt, though? (Epstein and Teraspulsky 1986; Malle 2006) TODO: investigate.

Naive realism is also a phenomenon of note in this category (Gilovich and Ross 2015; Ross and Ward 1996). David McRaney’s summary:

[N]aive realism also leads you to believe you arrived at your opinions, political or otherwise, after careful, rational analysis through unmediated thoughts and perceptions. In other words, you think you have been mainlining pure reality for years, and like Gandalf studying ancient texts, your intense study of the bare facts is what has naturally led to your conclusions.

Ross says that since you believe you are in the really-real, true reality, you also believe that you have been extremely careful and devoted to sticking to the facts and thus are free from bias and impervious to persuasion. Anyone else who has read the things you have read or seen the things you have seen will naturally see things your way, given that they’ve pondered the matter as thoughtfully as you have. Therefore, you assume, anyone who disagrees with your political opinions probably just doesn’t have all the facts yet. If they had, they’d already be seeing the world like you do. This is why you continue to ineffectually copy and paste links from all our most trusted sources when arguing your points with those who seem misguided, crazy, uninformed, and just plain wrong. The problem is, this is exactly what the other side thinks will work on you.

If we were machines, would we understand other machines this badly?

TBD. Consider minds as machine-learning systems and ask what that would tell us about learners that learn to model other learners. Possibly that leads to predictive coding.
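A minimal sketch of the moving-target problem, under cartoonish assumptions: both agents are one-parameter error-driven learners (the simplest caricature of predictive coding), and agent A can only see the world through agent B’s predictions. None of this is a serious model of minds; it just illustrates why modelling a modeller compounds lag.

```python
import random

random.seed(2)

# A drifting "world" signal. Agent B tracks it by prediction-error
# updates; agent A tries to model B, using the same rule, and never
# observes the world directly.
world = b_pred = a_pred = 0.0
LR = 0.2
STEPS = 5_000
b_err_total = a_err_total = 0.0

for _ in range(STEPS):
    world += random.gauss(0, 0.1)       # the world drifts
    b_pred += LR * (world - b_pred)     # B updates on its error about the world
    a_pred += LR * (b_pred - a_pred)    # A updates on its error about B
    b_err_total += abs(world - b_pred)
    a_err_total += abs(world - a_pred)

print(f"B's mean error about the world: {b_err_total / STEPS:.3f}")
print(f"A's mean error about the world: {a_err_total / STEPS:.3f}")
# A's picture of the world is doubly smoothed and doubly lagged:
# it is chasing a target that is itself chasing a moving target.
```

Even with identical learning rules and no noise in the channel between them, the second-order learner ends up further from reality than the first-order one, which is at least suggestive about theory of mind.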

Remedies

CFAR, putting money on the line, calibration. That example about set theory and cards. TBC.
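Calibration, at least, is easy to operationalise: attach a probability to every prediction you make, record the outcomes, and score yourself with a proper scoring rule such as the Brier score. A minimal sketch with made-up forecasters:

```python
import random

random.seed(3)

def brier(forecasts):
    """Mean squared gap between stated probability and the 0/1 outcome."""
    return sum((p, o) == () or (p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Events that actually occur 70% of the time.
outcomes = [1 if random.random() < 0.7 else 0 for _ in range(10_000)]

calibrated    = [(0.70, o) for o in outcomes]  # says 70%, reality is 70%
overconfident = [(0.99, o) for o in outcomes]  # says 99%, reality is 70%

print(f"calibrated Brier score:    {brier(calibrated):.3f}")
print(f"overconfident Brier score: {brier(overconfident):.3f}")
# Lower is better: the overconfident forecaster pays dearly for
# every miss, so the score rewards honest uncertainty.
```

Because the Brier score is proper, the only way to minimise it is to report your true belief, which is what makes it (and betting real money, its close cousin) a remedy rather than just a measurement.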

On evaluating ourselves

References

Acemoglu, Daron, Victor Chernozhukov, and Muhamet Yildiz. 2006. Learning and Disagreement in an Uncertain World. Working Paper 12648. National Bureau of Economic Research.
Argyle, Michael. 2013. Lay Theories: Everyday Understanding of Problems in the Social Sciences. Elsevier.
Baldassarri, Delia, and Guy Grossman. 2013. The Effect of Group Attachment and Social Position on Prosocial Behavior. Evidence from Lab-in-the-Field Experiments. Edited by Angel Sánchez. PLoS ONE 8 (3): e58750.
Belmi, Peter, and Kristin Laurin. 2016. Who wants to get to the top? Class and lay theories about power. Journal of Personality and Social Psychology 111 (4): 505–29.
Brewer, Marilynn B. 1993. Social Identity, Distinctiveness, and In-Group Homogeneity. Social Cognition 11 (1): 150–64.
Brown-Iannuzzi, Jazmin L., Kristjen B. Lundberg, Aaron C. Kay, and B. Keith Payne. 2020. A Privileged Point of View: Effects of Subjective Socioeconomic Status on Naïve Realism and Political Division. Personality and Social Psychology Bulletin, May, 0146167220921043.
Byron, Kristin. 2008. Carrying Too Heavy a Load? The Communication and Miscommunication of Emotion by Email. The Academy of Management Review 33 (2): 309–27.
Couzin, Iain D., Christos C. Ioannou, Güven Demirel, Thilo Gross, Colin J. Torney, Andrew Hartnett, Larissa Conradt, Simon A. Levin, and Naomi E. Leonard. 2011. Uninformed Individuals Promote Democratic Consensus in Animal Groups. Science 334 (6062): 1578–80.
Epstein, Seymour, and Laurie Teraspulsky. 1986. Perception of Cross-Situational Consistency. Journal of Personality and Social Psychology 50 (6): 1152–60.
Gilovich, Thomas, and Lee Ross. 2015. The Wisest One in the Room: How You Can Benefit from Social Psychology’s Most Powerful Insights. Illustrated edition. New York: Free Press.
Guerra-Pujol, F. E. 2014. Domestic Violence, Strategic Behavior, and Ideological Rent-Seeking. SSRN Scholarly Paper ID 915929. Rochester, NY: Social Science Research Network.
Jecker, Jon, and David Landy. 1969. Liking a Person as a Function of Doing Him a Favour. Human Relations 22 (4): 371–78.
Kruger, Justin, and David Dunning. 1999. Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology 77: 1121–34.
Kruger, Justin, Nicholas Epley, Jason Parker, and Zhi-Wen Ng. 2005. Egocentrism over e-Mail: Can We Communicate as Well as We Think? Journal of Personality and Social Psychology 89 (6): 925–36.
Lambert, Tracy A., Arnold S. Kahn, and Kevin J. Apple. 2003. Pluralistic Ignorance and Hooking up. The Journal of Sex Research 40 (2): 129–33.
Laubert, Christoph, and Jennifer Parlamis. 2019. Are You Angry (Happy, Sad) or Aren’t You? Emotion Detection Difficulty in Email Negotiation. Group Decision and Negotiation 28 (2): 377–413.
Levy, Sheri R., Chi-yue Chiu, and Ying-yi Hong. 2006. Lay Theories and Intergroup Relations. Group Processes & Intergroup Relations 9 (1): 5–24.
Lilienfeld, Scott, Rachel Ammirati, and Michal David. 2012. Distinguishing Science from Pseudoscience in School Psychology: Science and Scientific Thinking as Safeguards Against Human Error. Journal of School Psychology 50 (February): 7–36.
Malle, Bertram F. 2006. The actor-observer asymmetry in attribution: a (surprising) meta-analysis. Psychological Bulletin 132 (6): 895–919.
McRaney, David. 2012. You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself. Reprint edition. New York: Avery.
Moral Sentiments and Material Interests: The Foundations of Cooperation in Economic Life. 2006. The MIT Press.
Murphy, Ryan H. 2020. Markets Against Modernity: Ecological Irrationality, Public and Private. Capitalist Thought: Studies in Philosophy, Politics, and Economics. Lanham, Maryland: Lexington Books.
O’Gorman, Hubert J., and Stephen L. Garry. 1976. Pluralistic Ignorance—A Replication and Extension. Public Opinion Quarterly 40 (4): 449–58.
Prentice, Deborah A., and Dale T. Miller. 1996. Pluralistic Ignorance and the Perpetuation of Social Norms by Unwitting Actors. In Advances in Experimental Social Psychology, edited by Mark P. Zanna, 28:161–209. Academic Press.
Richerson, Peter J, Robert T Boyd, and Joseph Henrich. 2003. Cultural Evolution of Human Cooperation. Genetic and Cultural Evolution of Cooperation, 357.
Ross, Lee, and Andrew Ward. 1996. Naive Realism in Everyday Life: Implications for Social Conflict and Misunderstanding. In Values and Knowledge. Psychology Press.
Simonovits, Gábor, Gábor Kézdi, and Péter Kardos. 2017. Seeing the World Through the Other’s Eye: An Online Intervention Reducing Ethnic Prejudice. American Political Science Review, November, 1–8.
Trouche, Emmanuel, Petter Johansson, Lars Hall, and Hugo Mercier. 2016. The Selective Laziness of Reasoning. Cognitive Science 40 (8): 2122–36.
Trouche, Emmanuel, Emmanuel Sander, and Hugo Mercier. 2014. Arguments, More Than Confidence, Explain the Good Performance of Reasoning Groups. SSRN Scholarly Paper ID 2431710. Rochester, NY: Social Science Research Network.
Watts, Duncan J. 2011. Everything Is Obvious: Once You Know the Answer. 1st ed. New York: Crown Business.
———. 2014. Common Sense and Sociological Explanations. American Journal of Sociology 120 (2): 313–51.
Willer, Robb, Ko Kuwabara, and Michael W. Macy. 2009. The False Enforcement of Unpopular Norms. American Journal of Sociology 115 (2): 451–90.
Wilson, Anne E., and Jaslyn A. English. 2017. The Motivated Fluidity of Lay Theories of Change. In The Science of Lay Theories: How Beliefs Shape Our Cognition, Behavior, and Health, edited by Claire M. Zedelius, Barbara C. N. Müller, and Jonathan W. Schooler, 17–43. Cham: Springer International Publishing.
Wolfe, Michael B., and Todd J. Williams. 2018. Poor metacognitive awareness of belief change. Quarterly Journal of Experimental Psychology (2006) 71 (9): 1898–1910.
Zedelius, Claire M., Barbara C. N. Müller, and Jonathan W. Schooler. 2017. The Science of Lay Theories: How Beliefs Shape Our Cognition, Behavior, and Health. Springer.

  1. Interpretation of the Dunning-Kruger effect model is a subtle affair which is somewhat in the weeds for our current purposes. tl;dr: their actual model is not that great and probably arises, ironically, from sampling bias.↩︎
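The sampling-bias point is easy to demonstrate with a toy simulation (all numbers assumed): give everyone one latent ability, measure it twice with independent noise — once as a test score, once as a self-assessment — and bin people by test-score quartile, as Kruger and Dunning did. The classic plot appears with no metacognitive deficit anywhere in the model, purely from regression to the mean.

```python
import random

random.seed(4)

N = 10_000
# One latent ability per person; the test score and the self-assessment
# are each just noisy measurements of it -- nobody in this model is
# actually worse at judging themselves than anyone else.
ability = [random.gauss(50, 10) for _ in range(N)]
score   = [a + random.gauss(0, 10) for a in ability]
selfest = [a + random.gauss(0, 10) for a in ability]

# Bin people by measured test-score quartile.
order = sorted(range(N), key=lambda i: score[i])
quartiles = [order[i * N // 4:(i + 1) * N // 4] for i in range(4)]

for q, idx in enumerate(quartiles, 1):
    mean_score = sum(score[i] for i in idx) / len(idx)
    mean_self = sum(selfest[i] for i in idx) / len(idx)
    print(f"quartile {q}: score {mean_score:5.1f}, self-estimate {mean_self:5.1f}")
# The bottom quartile "overestimates" itself and the top quartile
# "underestimates", because extreme measured scores are partly noise
# and the independent self-estimate regresses toward the mean.
```

This does not prove the effect is an artefact, only that the signature plot is not evidence on its own.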

