Doing complicated things the obvious way

and not researching alternatives when the obvious way turns out to be wrong



A problem that I have is spending too much time researching things before diving in. George, in Do, Then Think, identifies times when this is pathological. I am working on trying to direct my compulsive autodidact tendencies better.

One of my blockers in improving upon that is that I observe a counterposed pathology out there: people who dive in without making sufficient effort to muster the best research and ideas about what they are doing, even when there is a lot available; who do not proceed to do that research even once they have ample evidence that their first attempt was bad; and who continue to avoid that research for years at a time.

In the best case you might imagine this would lead to maverick outsiders productively disrupting the status quo. What I often observe in practice is far from that best case. I see people trying stuff that is unlikely to work, stuff that many other people have also tried for the same naïve reasons, which runs into common difficulties, even when those difficulties have been well researched and have well understood solutions. Often the protagonist’s search for alternative solutions is perfunctory. This is especially common for people trying to act politically, or in business, or in other complicated domains, and in places with emotionally loaded past experiences. It is often not a contrarian stance so much as a “surely it must be easy” stance, one that is never updated when the evidence arrives that this project is actually hard. That is the pathological bit: not using the new evidence to research how to do it better, even when such research is available.

Examples:

  1. A local cohousing group confessing that after ten years of gruelling, unsatisfactory, amateurish meeting process and defective governance, they were finally researching how to improve their meeting process and governance, now that the most pressing governance challenges are over. This is a group of people with a major project that is consuming essentially their entire life savings and all their free time, and which is mostly about governance.

  2. Friends distressed by the state of some Bad Thing, e.g. global climate risk, who attempt to address this by replicating some Greenpeace media stunt they saw one time, without pausing to ask if

    1. it was effective even back then, or rather an idea that Greenpeace tried and discarded for being ineffectual, and
    2. it would be effective now, even if it was back then, and
    3. if it would be effective now, whether perhaps Greenpeace already has this one in the bag and there are other niches that would be more effective to fill.

    When their media stunts fail to make a discernible improvement in fixing the Bad Thing, they repeat them again and again, eventually becoming discouraged and/or burnt out, without ever dipping into the massive body of research about effective policy change.

  3. Terrible aspirational business ideas, where the business model turns out upon testing to have no path to profitability or even sustainability, and yet the plan is not revised. Delicacy prevents me from naming names or details, but suffice it to say, I know people who, in order to keep the fun and spontaneity in their businesses, run said businesses so badly that they become a grind devoid of fun, spontaneity or even cashflow.

  4. Presumably I do this too, but have not noticed it, and am doing various things in an easily-avoided wrongheaded fashion.

A common factor seems to be taking some underinformed model of society or of group organising as given and never interrogating it. Another factor seems to be doing things that feel intuitively right, rather than attempting to revisit preconceptions and do better.

I have been noticing this pop up often recently because I am an extroverted generalist and I talk with people about their pet projects. Hesitant as I am to dansplain to someone their own pet project, I often nonetheless find myself dumping a body of foundational literature from the related field into the hands of my passionate friend, literature that they probably would have benefited from reading earlier in their odyssey into whatever thing is not working, for what as far as I am concerned are widely understood reasons. Often they seem grateful and/or surprised to find that anyone could put time into systematically working out better ways to do a project, even though clearly many other people throughout history might have tried to do a very similar project. Sometimes my ideas are unwelcome, which is fine, but I am occasionally confused when someone who has done no basic diligence is convinced that there is no possibility that anyone could have done any research in the area.

HOW OFTEN DO I DO THIS MYSELF? Which projects of mine are ignorant of good practices? Which background reading am I failing to do? Should I drop into analysis paralysis even more often? Should I be soliciting way more advice?

Why do we do this? How can we detect it? Is it that some of these errors simply look more obvious to me in particular because I have done a lot of social research?

The phenomenon I have outlined does resemble the implied mechanism of the lede example in the original (Kruger and Dunning 1999) paper, which talks about a bank robber who was crap at researching bank robberies before trying them out, despite the high cost of failure. The later parts of that paper have come under methodological attack for being, essentially, a sampling bias epiphenomenon. But does that first anecdatum reflect something analogous to what happens when trying complicated things?

I am indebted to Miriam Lyons for an alternative explanation, which is that we are slow to update what was once a good idea. The garish media stunt may have been effective for Greenpeace in the 80s. Freestyle group organising may have been great in the first few months. The terrible business model may have been great for the first gig and merely failed to scale up past that one, etc. We are, in this version, bad at continuing to learn.

That theme has come up a few times, so how about I make a new notebook about failure to update beliefs?

References

Kruger, Justin, and David Dunning. 1999. “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Journal of Personality and Social Psychology 77: 1121–34.
Watts, Duncan J. 2011. Everything Is Obvious: Once You Know the Answer. 1st ed. New York: Crown Business.
———. 2014. “Common Sense and Sociological Explanations.” American Journal of Sociology 120 (2): 313–51.
