On our known-terrible ability to know our terrible inabilities to know about others. Our constant and reliable tendency to 1) think we know more about others than we do and 2) never notice it. Manifesting in, e.g., pluralistic ignorance, the out-group homogeneity effect, the idiosyncratic rater effect, and many other biases I do not yet know names for.
Terminology note: The title here is a reference to the title of the influential paper (Kruger and Dunning 1999), “Unskilled and unaware of it”, rather than to the model itself. In the common parlance, knowing just enough to be dangerous but not enough to know your limits.
I cannot shake the suspicion that there is some insight in the learning process of our minds which explains how our capacities for interpersonal understanding are biased.
On evaluating others
The first problem with feedback is that humans are unreliable raters of other humans. Over the past 40 years psychometricians have shown in study after study that people don’t have the objectivity to hold in their heads a stable definition of an abstract quality, such as business acumen or assertiveness, and then accurately evaluate someone else on it. Our evaluations are deeply colored by our own understanding of what we’re rating others on, our own sense of what good looks like for a particular competency, our harshness or leniency as raters, and our own inherent and unconscious biases. This phenomenon is called the idiosyncratic rater effect, and it’s large (more than half of your rating of someone else reflects your characteristics, not hers) and resilient (no training can lessen it). In other words, the research shows that feedback is more distortion than truth.
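The variance claim above (more than half of a rating reflects the rater, not the ratee) can be illustrated with a toy simulation. This is a minimal sketch under an assumed additive model — rating = ratee's true quality + the rater's idiosyncratic offset + noise — with variance shares chosen purely for illustration, not taken from any study's data:

```python
import random
import statistics

random.seed(0)

N_RATEES, N_RATERS = 50, 50

# Hypothetical additive model. The standard deviations are illustrative,
# picked so the rater-driven share lands above 50%, mimicking the
# idiosyncratic rater effect described in the text.
true_quality = [random.gauss(0, 1.0) for _ in range(N_RATEES)]  # ratee signal
rater_offset = [random.gauss(0, 1.3) for _ in range(N_RATERS)]  # rater idiosyncrasy

ratings = [
    [true_quality[i] + rater_offset[j] + random.gauss(0, 0.5)
     for j in range(N_RATERS)]
    for i in range(N_RATEES)
]

# Total spread of all ratings pooled together.
all_ratings = [r for row in ratings for r in row]
total_var = statistics.pvariance(all_ratings)

# Between-rater variance: how much each rater's average rating differs
# from other raters' averages, i.e. variance attributable to the rater.
rater_means = [statistics.mean(ratings[i][j] for i in range(N_RATEES))
               for j in range(N_RATERS)]
rater_var = statistics.pvariance(rater_means)

rater_share = rater_var / total_var
print(f"share of rating variance due to the rater: {rater_share:.0%}")
```

The point of the sketch: even when every rater observes the same people, a modest idiosyncratic offset per rater is enough to swamp the signal about the person being rated.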
On the alternate realities inhabited by pundits
Another tendency of interest was put nicely by Tanner Greer: our think pieces about how the world works are likely to come from pundits with a public profile, that being what a public profile entails, and the world inhabited by such people is different from the world that the majority inhabit. We are getting our models of the world from people in a special bubble of experiential bias.
This is the first difficulty that comes with a growing follower count on twitter. As the count grows, the number of different communities you are projecting to grows as well. Soon, large numbers of people start to follow because they see you as a representative of a certain strain of thought, or as a key voice in a particular conversation they care about. They are not all sympathetic to your ideas or even merely intellectually interested in them; instead they follow you to keep tabs on what you and people like you are saying. Many actually despise you and your ideas to their core (in twitterese, they are a “hate follow”).
My friend Matthew Stinson described this shift as that point where “interactions stop being inquisitive and start getting accusatory. ‘Points for my side-ism’ becomes a real thing.” Twitter’s retweet mechanism makes this problem far worse. All one needs is a snarky RT for these people to take a thought they dislike and BOOM!, project it into communities it was never intended for as the perfect example of what they all should be hating at that moment.
Thus if you have a large follower count your experience on twitter goes like this: you share a thought optimized for Group X. Members of Group Y, Group Z, and Group V automatically start sharing it as the textbook example of why Group X deserves crucifixion.
On understanding how others think
Todo: raid this for references: Why You’re Constantly Misunderstood on Slack (and How to Fix It).
The researchers first analyzed data from 2,374 individuals who participated in the 2016 American National Election Studies Time Series Survey, a nationally representative survey of U.S. citizens. As expected, liberals and conservatives were more likely to describe the opposing political party as uninformed, irrational, and/or biased compared to their own party.
Importantly, the researchers found that this was especially true among those with a higher socio-economic status. Among more liberal participants, higher status individuals displayed more naive realism toward Republicans. Among more conservative participants, higher status individuals displayed more naive realism toward Democrats.
In a follow-up experiment, the researchers experimentally manipulated people’s sense of status through an investment game. The study of 252 participants found that those who were randomly told they had performed “better than 89% of all players to date” were more likely to say that people who disagreed with their investment advice were biased and incompetent.
For ages my favourite go-to bias to think on here was the fundamental attribution error, which seems ubiquitous to me.
In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the tendency for people to under-emphasize situational explanations for an individual’s observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior. This effect has been described as “the tendency to believe that what people do reflects who they are”.
Jacob Falkovich’s Is Rationalist Self-Improvement Real? has a lot of ideas about the effectiveness of trying to be more rational (in more areas than theory of mind). TBC: question the assumption that casual commenters on rationality blogs are any more committed to rationality than drive-by commenters on any other site; that rationality blogs attract or cultivate more rational individuals is a hypothesis to test. There will be obvious sampling bias problems …