Risk perception

![survival by arrows (David Spiegelhalter)](/images/breast-200-tree.jpg)

Heuristics and biases. Micromorts.

Risk, perception of. How humans make judgments under uncertainty, as actors able to form beliefs about the future.

Trying to work out how people make decisions, and whether that process is

  • probabilistic

  • rational

  • optimal

  • manipulable

and what definitions of each of those words would be required to make this work.

People really want this to be simple, in the sense of mathematically simple, and not simple in the evolutionarily plausible sense, which is that it is just messy: our messy, lumpy Pleistocene brains worry about great white sharks more than loan sharks because the former have teeth.

Assigned reading 1: The Beast Upon Your Shoulder, The Price Upon Your Head.

Assigned reading 2: Visualising risk in the NHS Breast cancer screening leaflet

Health information leaflets are designed for those with a reading age of 11, and similar levels of numeracy. But there is some reasonable evidence that people of low reading age and numeracy are less likely to take advantage of options for informed choice and shared decision making. So we are left with an uncomfortable conclusion:

Health information leaflets are designed for people who do not want to read the leaflets.

David Spiegelhalter discusses microlives:

many risks we take don’t kill you straight away: think of all the lifestyle frailties we get warned about, such as smoking, drinking, eating badly, not exercising and so on. The microlife aims to make all these chronic risks comparable by showing how much life we lose on average when we’re exposed to them:

a microlife is 30 minutes of your life expectancy
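The unit comes from the observation that a young adult has roughly a million half-hours of life expectancy remaining, so one microlife is one millionth of an adult life. A quick sanity check of that arithmetic (the ~57 remaining years is Spiegelhalter's illustrative figure):

```python
# One microlife = 30 minutes, chosen so that a typical young adult's
# remaining life expectancy is about one million microlives.
MINUTES_PER_MICROLIFE = 30

remaining_years = 1_000_000 * MINUTES_PER_MICROLIFE / (60 * 24 * 365.25)
print(f"{remaining_years:.1f} years")  # ≈ 57.0 years

# So a habit costing, say, 2 microlives per day costs about an hour of
# life expectancy for each day of the habit:
minutes_per_day = 2 * MINUTES_PER_MICROLIFE
assert minutes_per_day == 60
```

The "2 microlives per day" habit is a made-up example; Spiegelhalter's articles tabulate estimated microlife costs for specific habits.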

The Ellsberg paradox

In the standard setup, Urn 1 contains 50 black and 50 white balls, while Urn 2 contains 100 black and white balls in unknown proportion. You bet on the colour of a ball drawn at random and win $100 if you are right: B1 and W1 are bets on black and white from Urn 1, and B2 and W2 the corresponding bets on Urn 2.

If you’re like most people, you don’t have a preference between B1 and W1, nor between B2 and W2. But most people prefer B1 to B2 and W1 to W2. That is, they prefer “the devil they know”: They’d rather choose the urn with the measurable risk than the one with unmeasurable risk.

This is surprising. The expected payoff from Urn 1 is $50. The fact that most people favor B1 over B2 implies that they believe that Urn 2 contains fewer black balls than Urn 1. But these people most often also favor W1 over W2, implying that they believe that Urn 2 also contains fewer white balls, a contradiction.
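Both claims can be checked numerically, assuming the standard setup (Urn 1: 50 black and 50 white balls; Urn 2: 100 balls in unknown proportion; a correct colour bet pays $100):

```python
# Ellsberg urns: a correct colour guess pays $100.
payoff = 100

# Expected payoff of betting on either colour from Urn 1 is $50:
ev_urn1 = 0.5 * payoff
assert ev_urn1 == 50

# For every possible composition k of Urn 2, the two colour probabilities
# sum to 1, so Urn 2 cannot have both fewer black balls AND fewer white
# balls than Urn 1 -- hence the beliefs implied by the common choices
# are jointly inconsistent.
for k in range(101):
    p_black, p_white = k / 100, (100 - k) / 100
    assert not (p_black < 0.5 and p_white < 0.5)
```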

Ellsberg offered this as evidence of “ambiguity aversion,” a preference in general for known risks over unknown risks. Why people exhibit this preference isn’t clear. Perhaps they associate ambiguity with ignorance, incompetence, or deceit, or possibly they judge that Urn 1 would serve them better over a series of repeated draws.

The paradox was introduced by RAND Corporation economist Daniel Ellsberg, of Pentagon Papers fame.


Using expected utility to explain anything more than economically negligible risk aversion over moderate stakes such as $10, $100, and even $1,000 requires a utility-of-wealth function that predicts absurdly severe risk aversion over very large stakes. Conventional expected utility theory is simply not a plausible explanation for many instances of risk aversion that economists study.
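Rabin's calibration point can be illustrated with a toy sketch (my own example, not from the paper, and using CARA rather than the theorem's general concave utility): pick a risk-aversion coefficient just large enough that the agent turns down a 50-50 lose-$100/gain-$110 gamble, and the same agent then turns down a 50-50 lose-$1,000 gamble no matter how large the upside.

```python
import math

def rejects(a, lose, gain, p=0.5):
    """True if a CARA agent with u(x) = 1 - exp(-a*x) turns down the
    gamble (win `gain` with prob p, lose `lose` otherwise) vs doing nothing."""
    u = lambda x: 1 - math.exp(-a * x)
    return p * u(gain) + (1 - p) * u(-lose) < u(0)

a = 0.001  # just enough curvature to reject the small gamble
assert rejects(a, lose=100, gain=110)

# The same agent rejects a 50-50 lose-$1,000 gamble for ANY finite upside:
# u is bounded above by 1, while u(-1000) = 1 - e is already below -1.7.
for gain in (10_000, 1_000_000, 10**12):
    assert rejects(a, lose=1000, gain=gain)
```

The absurdity is the point: modest small-stakes risk aversion, if explained by curvature of a utility-of-wealth function alone, forces pathological large-stakes behaviour.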

Case, Donald O., James E. Andrews, J. David Johnson, and Suzanne L. Allard. 2005. “Avoiding Versus Seeking: The Relationship of Information Seeking to Avoidance, Blunting, Coping, Dissonance, and Related Concepts.” Journal of the Medical Library Association 93 (3): 353–62. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1175801/.

Enke, Benjamin, and Thomas Graeber. 2019. “Cognitive Uncertainty.” Working Paper 26518. National Bureau of Economic Research. https://doi.org/10.3386/w26518.

Gigerenzer, Gerd, and Daniel G. Goldstein. 1996. “Reasoning the Fast and Frugal Way: Models of Bounded Rationality.” Psychological Review 103 (4): 650–69. https://doi.org/10.1037/0033-295X.103.4.650.

Gigerenzer, Gerd, and Ulrich Hoffrage. 1995. “How to Improve Bayesian Reasoning Without Instruction: Frequency Formats.” Psychological Review 102 (4): 684–704. https://doi.org/10.1037/0033-295X.102.4.684.

Golman, Russell, David Hagmann, and George Loewenstein. 2017. “Information Avoidance.” Journal of Economic Literature 55 (1): 96–135. https://doi.org/10.1257/jel.20151245.

Golman, Russell, and George Loewenstein. 2015. “Curiosity, Information Gaps, and the Utility of Knowledge.” SSRN Scholarly Paper ID 2149362. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=2149362.

Howell, Jennifer L., and James A. Shepperd. 2012. “Behavioral Obligation and Information Avoidance.” Annals of Behavioral Medicine 45 (2): 258–63. https://doi.org/10.1007/s12160-012-9451-9.

Kahneman, Daniel. 2003a. “A Psychological Perspective on Economics.” The American Economic Review 93 (2): 162–68. https://doi.org/10.1257/000282803321946985.

———. 2003b. “Maps of Bounded Rationality: Psychology for Behavioral Economics.” The American Economic Review 93 (5): 1449–75.

Kahneman, Daniel, and Amos Tversky. 1979. “Prospect Theory: An Analysis of Decision Under Risk.” Econometrica 47 (2): 263–92. http://www.nolle.se/wp-content/2011/11/Kahneman-Tversky-Prospect-theory79.pdf.

———. 1984. “Choices, Values, and Frames.” American Psychologist 39 (4): 341–50. https://doi.org/10.1037/0003-066X.39.4.341.

Köszegi, Botond, and Matthew Rabin. 2006. “A Model of Reference-Dependent Preferences.” Quarterly Journal of Economics 121 (4): 1133–65. https://doi.org/10.1162/qjec.121.4.1133.

Loomes, Graham, and Robert Sugden. 1982. “Regret Theory: An Alternative Theory of Rational Choice Under Uncertainty.” Economic Journal 92 (368): 805–24.

Rabin, Matthew, and Richard H. Thaler. 2001. “Anomalies: Risk Aversion.” The Journal of Economic Perspectives 15 (1): 219–32. https://doi.org/10.2307/2696549.

Samuelson, Paul A. 1938. “A Note on the Pure Theory of Consumer’s Behaviour.” Economica 51. https://doi.org/10.2307/2548836.

Sedlmeier, Peter, and Gerd Gigerenzer. 2001. “Teaching Bayesian Reasoning in Less Than Two Hours.” Journal of Experimental Psychology: General 130: 380–400.

Sharpe, Keiran. 2015. “On the Ellsberg Paradox and Its Extension by Machina.” SSRN Scholarly Paper ID 2630471. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=2630471.

Simon, Herbert A. 1956. “Rational Choice and the Structure of the Environment.” Psychological Review 63 (2): 129–38. https://doi.org/10.1037/h0042769.

———. 1958. “‘The Decision-Making Schema’: A Reply.” Public Administration Review 18 (1): 60–63. https://doi.org/10.2307/973736.

———. 1975. “Style in Design.” Spatial Synthesis in Computer-Aided Building Design 9: 287–309. http://edra.org/sites/default/files/publications/EDRA02-Simon-1-10_0.pdf.

Sloman, Steven A., and Philip Fernbach. 2017. The Knowledge Illusion: Why We Never Think Alone. New York: Riverhead Books.

Tversky, A., and D. Kahneman. 1981. “The Framing of Decisions and the Psychology of Choice.” Science 211 (4481): 453–58. https://doi.org/10.1126/science.7455683.

Tversky, Amos, and Itamar Gati. 1982. “Similarity, Separability, and the Triangle Inequality.” Psychological Review 89 (2): 123–54. https://doi.org/10.1037/0033-295X.89.2.123.

Tversky, Amos, and Daniel Kahneman. 1973. “Availability: A Heuristic for Judging Frequency and Probability.” Cognitive Psychology 5 (2): 207–32. https://doi.org/10.1016/0010-0285(73)90033-9.

———. 1974. “Judgment Under Uncertainty: Heuristics and Biases.” Science 185 (4157): 1124–31. https://doi.org/10.1126/science.185.4157.1124.

Wolpert, Daniel M, R Chris Miall, and Mitsuo Kawato. 1998. “Internal Models in the Cerebellum.” Trends in Cognitive Sciences 2: 338–47. https://doi.org/10.1016/S1364-6613(98)01221-2.

Wolpert, David H. 2010. “Why Income Comparison Is Rational.” Ecological Economics 69 (2): 458–74. https://doi.org/10.1016/j.geb.2009.12.001.