Dunning-Kruger theory of mind

On anthropomorphising humans



A collection of notes on our known-terrible ability to know our terrible inability to know about others. Was that hard to parse? What I mean is that we seem to reliably

  1. think we know more about others than we do, and
  2. fail to learn that we do not.

This manifests in, e.g., pluralistic ignorance, the out-group homogeneity effect, the idiosyncratic rater effect, and many other biases I do not yet know names for.

Terminology note: The title here is a reference to the title of the influential paper Unskilled and unaware of it (Kruger and Dunning 1999). In the common parlance, what I want to evoke is knowing just enough to be dangerous but not enough to know your limits.1

Here is a catalogue of theory-of-mind failings that arise in life that I would like to take into account.

On evaluating others

The feedback fallacy:

The first problem with feedback is that humans are unreliable raters of other humans. Over the past 40 years psychometricians have shown in study after study that people don’t have the objectivity to hold in their heads a stable definition of an abstract quality, such as business acumen or assertiveness, and then accurately evaluate someone else on it. Our evaluations are deeply colored by our own understanding of what we’re rating others on, our own sense of what good looks like for a particular competency, our harshness or leniency as raters, and our own inherent and unconscious biases. This phenomenon is called the idiosyncratic rater effect, and it’s large (more than half of your rating of someone else reflects your characteristics, not hers) and resilient (no training can lessen it). In other words, the research shows that feedback is more distortion than truth.

The real question is why individuals often try to avoid feedback that provides them with more accurate knowledge of themselves.
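
The "more than half of your rating of someone else reflects your characteristics" claim is a statement about variance components, which is easy to sanity-check in simulation. Here is a minimal sketch of my own construction, not taken from the article; the 60/30/10 split between rater, ratee, and noise variance is an assumption chosen to echo the "more than half" claim, not measured data.

```python
# A minimal simulation (my construction, not from the article) of the
# idiosyncratic rater effect as a variance-components problem. Each rating
# mixes the ratee's "true" quality with the rater's own idiosyncrasy; the
# 60/30/10 variance split is an assumption echoing the "more than half" claim.
import numpy as np

rng = np.random.default_rng(42)
n_raters, n_ratees = 200, 50

rater_bias = rng.normal(0, np.sqrt(0.6), n_raters)     # rater idiosyncrasy
ratee_quality = rng.normal(0, np.sqrt(0.3), n_ratees)  # genuine signal
noise = rng.normal(0, np.sqrt(0.1), (n_raters, n_ratees))

ratings = rater_bias[:, None] + ratee_quality[None, :] + noise

# Crude method-of-moments decomposition: the variance of per-rater means
# estimates the rater component; per-ratee means estimate the ratee component.
total = ratings.var()
print(f"variance share due to the rater: {ratings.mean(axis=1).var() / total:.0%}")
print(f"variance share due to the ratee: {ratings.mean(axis=0).var() / total:.0%}")
```

Under those assumptions the rater share comes out near 60%, i.e. the rating tells you more about the rater than about the rated, which is the "more distortion than truth" conclusion in miniature.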

On erroneously thinking my experience is universal

A.k.a. denying the reality of lived experience.

TBC. How do we mediate between our own experience and the experience of others when both are faulty? How do we maintain both humility and scepticism at once? For bonus points, how do we do that even across a cultural divide?
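
One toy framing, entirely my own gloss rather than anything settled: treat my experience and yours as noisy measurements of the same latent quantity and combine them by precision weighting. All the noise variances below are assumptions, and the cultural-divide version of the problem is precisely that we do not agree on them.

```python
# A toy model (my gloss, not from the post): my experience and yours are
# noisy Gaussian measurements of the same latent quantity, combined by
# precision weighting. All noise variances here are assumptions.
def precision_weighted(mine: float, yours: float,
                       my_noise_var: float, your_noise_var: float) -> float:
    """Posterior mean of a shared latent value under Gaussian noise."""
    w_mine, w_yours = 1.0 / my_noise_var, 1.0 / your_noise_var
    return (w_mine * mine + w_yours * yours) / (w_mine + w_yours)

# Humility is admitting my channel is noisy; scepticism is remembering
# that yours is too. Equal noise splits the difference ...
print(precision_weighted(mine=0.9, yours=0.2, my_noise_var=1.0, your_noise_var=1.0))
# ... while distrusting my own perception pulls me toward your report.
print(precision_weighted(mine=0.9, yours=0.2, my_noise_var=4.0, your_noise_var=1.0))
```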

On erroneously thinking my experience is not universal

TBD. Out-group effects.

On understanding how others think

Todo: raid this for references: Why You’re Constantly Misunderstood on Slack (and How to Fix It).

Here is a connection to the status literature. People of higher status are more likely to think that those who disagree with them are stupid or biased — even when their high status is the result of a random process. (Brown-Iannuzzi et al. 2020)

The researchers first analyzed data from 2,374 individuals who participated in the 2016 American National Election Studies Time Series Survey, a nationally representative survey of U.S. citizens. As expected, liberals and conservatives were more likely to describe the opposing political party as uninformed, irrational, and/or biased compared to their own party.

Importantly, the researchers found that this was especially true among those with a higher socio-economic status. Among more liberal participants, higher status individuals displayed more naive realism toward Republicans. Among more conservative participants, higher status individuals displayed more naive realism toward Democrats.

In a follow-up experiment, the researchers experimentally manipulated people’s sense of status through an investment game. The study of 252 participants found that those who were randomly told they had performed “better than 89% of all players to date” were more likely to say that people who disagreed with their investment advice were biased and incompetent.

For ages my favourite go-to bias here was the fundamental attribution error, which seems ubiquitous to me.

In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the tendency for people to under-emphasize situational explanations for an individual’s observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior. This effect has been described as “the tendency to believe that what people do reflects who they are”.

Apparently it has been put in doubt, though? (Epstein and Teraspulsky 1986; Malle 2006) TODO: investigate.

If we were machines, would we understand other machines this badly?

TBD. Consider minds as machine learners and see what that tells us about learners that learn to model other learners. Possibly that means predictive coding.
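
As a down payment, here is a minimal sketch under made-up assumptions: agent B fits a scalar weight by gradient descent on streaming data, while agent A predicts B's outputs from an occasionally-refreshed snapshot, i.e. A models a moving, situation-driven learner as a fixed disposition. A's misprediction is a crude analogue of the fundamental attribution error.

```python
# A minimal sketch (my construction, not from any cited paper) of a learner
# modelling a learner. Agent B fits a scalar weight by gradient descent on
# streaming data; agent A predicts B's outputs from a stale snapshot of B,
# refreshing it only occasionally. All rates and sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
true_w, lr = 2.0, 0.1
w_b = 0.0          # B's evolving weight
w_a_belief = 0.0   # A's frozen "theory of B's mind"

errors = []
for step in range(50):
    x = rng.normal()
    a_prediction = w_a_belief * x   # A treats B as a fixed disposition
    b_output = w_b * x              # but B is a moving target
    errors.append(abs(a_prediction - b_output))
    w_b -= lr * (w_b * x - true_w * x) * x   # B learns from the world
    if step % 10 == 0:
        w_a_belief = w_b            # A re-observes B only now and then

print(f"A's mean misprediction of B: {np.mean(errors):.3f}")
```

A predictive-coding flavour of this would have A update continuously from its own prediction error rather than from periodic snapshots, which is roughly the direction the "minds as machine learners" framing points.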

References

Acemoglu, Daron, Victor Chernozhukov, and Muhamet Yildiz. 2006. “Learning and Disagreement in an Uncertain World.” Working Paper 12648. National Bureau of Economic Research. https://doi.org/10.3386/w12648.
Brewer, Marilynn B. 1993. “Social Identity, Distinctiveness, and In-Group Homogeneity.” Social Cognition 11 (1): 150–64. https://doi.org/10.1521/soco.1993.11.1.150.
Brown-Iannuzzi, Jazmin L., Kristjen B. Lundberg, Aaron C. Kay, and B. Keith Payne. 2020. “A Privileged Point of View: Effects of Subjective Socioeconomic Status on Naïve Realism and Political Division.” Personality and Social Psychology Bulletin, May, 0146167220921043. https://doi.org/10.1177/0146167220921043.
Byron, Kristin. 2008. “Carrying Too Heavy a Load? The Communication and Miscommunication of Emotion by Email.” The Academy of Management Review 33 (2): 309–27. https://doi.org/10.2307/20159399.
Epstein, Seymour, and Laurie Teraspulsky. 1986. “Perception of Cross-Situational Consistency.” Journal of Personality and Social Psychology 50 (6): 1152–60. https://doi.org/10.1037/0022-3514.50.6.1152.
Kruger, Justin, and David Dunning. 1999. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology 77: 1121–34. https://doi.org/10.1037/0022-3514.77.6.1121.
Kruger, Justin, Nicholas Epley, Jason Parker, and Zhi-Wen Ng. 2005. “Egocentrism over e-Mail: Can We Communicate as Well as We Think?” Journal of Personality and Social Psychology 89 (6): 925–36. https://doi.org/10.1037/0022-3514.89.6.925.
Lambert, Tracy A., Arnold S. Kahn, and Kevin J. Apple. 2003. “Pluralistic Ignorance and Hooking Up.” The Journal of Sex Research 40 (2): 129–33. https://doi.org/10.1080/00224490309552174.
Laubert, Christoph, and Jennifer Parlamis. 2019. “Are You Angry (Happy, Sad) or Aren’t You? Emotion Detection Difficulty in Email Negotiation.” Group Decision and Negotiation 28 (2): 377–413. https://doi.org/10.1007/s10726-018-09611-4.
Malle, Bertram F. 2006. “The actor-observer asymmetry in attribution: a (surprising) meta-analysis.” Psychological Bulletin 132 (6): 895–919. https://doi.org/10.1037/0033-2909.132.6.895.
O’Gorman, Hubert J., and Stephen L. Garry. 1976. “Pluralistic Ignorance—A Replication and Extension.” Public Opinion Quarterly 40 (4): 449–58. https://doi.org/10.1086/268331.
Prentice, Deborah A., and Dale T. Miller. 1996. “Pluralistic Ignorance and the Perpetuation of Social Norms by Unwitting Actors.” In Advances in Experimental Social Psychology, edited by Mark P. Zanna, 28:161–209. Academic Press. https://doi.org/10.1016/S0065-2601(08)60238-5.
Simonovits, Gábor, Gábor Kézdi, and Péter Kardos. 2017. “Seeing the World Through the Other’s Eye: An Online Intervention Reducing Ethnic Prejudice.” American Political Science Review, November, 1–8. https://doi.org/10.1017/S0003055417000478.
Willer, Robb, Ko Kuwabara, and Michael W. Macy. 2009. “The False Enforcement of Unpopular Norms.” American Journal of Sociology 115 (2): 451–90. https://doi.org/10.1086/599250.

  1. Interpretation of the Dunning-Kruger effect model is a subtle affair which is somewhat in the weeds for our current purposes. tl;dr: their actual model is not that great and probably arises, ironically enough, from sampling bias.↩︎

