Thinking, Fast and Slow

by

Daniel Kahneman


Thinking, Fast and Slow: Part 4, Chapter 30 Summary & Analysis

Kahneman visited Israel several times during a period in which suicide bombings became a concern for bus riders. There were 23 bombings on buses between 2001 and 2004, which caused 236 fatalities. The number of daily bus riders was 1.3 million at the time. The risks were tiny, but that was not how the public felt about them—they avoided buses as much as possible. Emotion and vividness influence availability and thus judgments of probability.
The judgment of the risk of riding a bus in this example recalls the earlier example of plane crashes used to demonstrate the availability bias. We mistakenly overestimate the likelihood of events we have recently witnessed or heard about, particularly if they evoke visceral images or have dominated the media.
Themes: Intuition, Deliberation, and Laziness
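As a rough check on the claim that the risk was tiny, the figures quoted in the summary can be turned into an approximate per-ride probability. The sketch below is only illustrative; it assumes roughly three years of 1.3 million daily rides, since the text gives only the 2001-2004 span.

```python
# Rough estimate of the per-ride fatality risk from the figures in the text.
# Assumption: ~3 years of 1.3 million bus rides per day (the text gives only
# the 2001-2004 span and the totals of 23 bombings and 236 fatalities).
fatalities = 236
daily_riders = 1_300_000
days = 3 * 365  # assumed span

total_rides = daily_riders * days
risk_per_ride = fatalities / total_rides
print(f"Approximate fatality risk per ride: {risk_per_ride:.1e}")
# Roughly 1.7e-07, i.e. about one in six million rides.
```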
The probability of a rare event is especially likely to be overestimated when the alternative is not fully specified. A psychologist recruited basketball fans and asked them to estimate the probability that each of eight playoff teams would win, focusing on one team at a time. The estimates for the eight teams should have summed to 100%, but instead they summed to 240%. With each question, a different team became the focus of attention, and that team's chances were overweighted.
In this example, people place too much confidence in each team’s success because they do not fully consider the alternatives (recalling the principle of “what you see is all there is”). This leads to gross mistakes as people’s estimates defy the logic of probability.
Themes: Human Fallibility and Overconfidence
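To see why the fans' answers violate elementary probability, note that exactly one of the eight teams can win, so the eight estimates must sum to 100%. The per-team numbers below are purely hypothetical; the study reported only that the estimates totaled about 240%.

```python
# Hypothetical estimates from a fan who judges one team at a time.
# Each number can seem plausible in isolation, but jointly they are incoherent.
estimates = [0.45, 0.40, 0.35, 0.30, 0.30, 0.25, 0.20, 0.15]  # illustrative only

total = sum(estimates)
print(f"Sum of estimates: {total:.2f}")  # 2.40 rather than 1.00

# A coherent judge would be forced to scale the estimates down so they sum to 1.
coherent = [round(e / total, 3) for e in estimates]
print(coherent)  # these now sum to 1
```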
Utility theory and prospect theory differ on this point: utility theory treats decision weights and probabilities as identical, while prospect theory holds that people's decision weights are correlated with probabilities but do not match them exactly. Psychologists at the University of Chicago found that decision weights in gambles were even less closely tied to probability when the fictitious outcomes were emotional ("meeting and kissing your favorite movie star" or "getting a painful, but not dangerous, electric shock").
It makes sense that decision weights become less tied to probability when outcomes are emotionally charged, because emotions are processed largely by System 1. People pay even less attention to the actual probabilities of such outcomes and focus instead on their desire for, or aversion to, the outcome itself.
Themes: Intuition, Deliberation, and Laziness
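The chapter describes decision weights only qualitatively. For a concrete form, the sketch below uses the probability weighting function and the gamma of roughly 0.61 estimated in Tversky and Kahneman's 1992 cumulative prospect theory paper; it is a standard way to model the pattern the summary describes, not something derived in this chapter.

```python
# Probability weighting function from cumulative prospect theory
# (Tversky & Kahneman, 1992): w(p) = p^g / (p^g + (1 - p)^g)^(1/g).
# With g < 1, small probabilities are overweighted and large ones underweighted.

def decision_weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> weight ~ {decision_weight(p):.3f}")
# p = 0.01 gets a weight near 0.055, while p = 0.99 gets only about 0.91,
# matching the overweighting/underweighting pattern described in the book.
```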
Vividness and ease of imagining can change people's decisions, as in this example: Urn A contains 10 marbles, 1 of which is red; Urn B contains 100 marbles, 8 of which are red. Drawing a red marble wins a prize. Which do you choose? About 30-40% of students choose the urn with the larger number of red marbles because of what Kahneman calls "denominator neglect." A single red marble against a vaguely represented background of white marbles seems to offer a lower chance than eight red marbles against a similar background, even though Urn A actually offers the better odds.
Calculating the probability of drawing a red marble from each urn is easy, but visualizing the marbles is even easier. When we rely on System 1's vivid image instead of mobilizing System 2 to do the simple arithmetic, we are more prone to make mistakes.
Themes: Intuition, Deliberation, and Laziness; Stories and Subjectivity vs. Statistics and Objectivity
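The arithmetic that denominator neglect causes people to skip is trivial once the totals are restored. A minimal check, using the numbers from the example:

```python
# Denominator neglect: the vivid count of red marbles crowds out the totals.
urn_a = {"red": 1, "total": 10}    # Urn A
urn_b = {"red": 8, "total": 100}   # Urn B

p_a = urn_a["red"] / urn_a["total"]
p_b = urn_b["red"] / urn_b["total"]
print(f"Urn A: {p_a:.0%}, Urn B: {p_b:.0%}")
# Urn A is the better bet (10% vs. 8%), yet 30-40% of students choose Urn B.
```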
Denominator neglect also explains why the way a risk is communicated matters so much. A vaccine that carries a 0.001% risk of permanent disability seems much safer than one described this way: "One of 100,000 vaccinated children will be permanently disabled." People have a hard time translating between percentages and frequencies, and the different framings create opportunities for opinions to be manipulated.
Denominator neglect once again serves as an example of our difficulty with statistics—when we think of the vaccine in terms of statistics, we view it as safe. But when we think of it in terms of individuals, we conjure a story in our mind of a disabled child and rate it as much riskier.
Themes: Stories and Subjectivity vs. Statistics and Objectivity
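The two vaccine descriptions are mathematically identical; only the format differs. A quick conversion makes the equivalence explicit:

```python
# The same risk stated two ways: as a percentage and as a frequency.
risk_percent = 0.001                   # "a 0.001% risk of permanent disability"
risk_fraction = risk_percent / 100     # 0.00001
one_in_n = 1 / risk_fraction

print(f"0.001% = 1 in {one_in_n:,.0f} vaccinated children")   # 1 in 100,000
# Identical numbers, but the frequency format conjures the image of a single
# disabled child, so the risk is judged to be larger.
```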
Kahneman gives an example of a study that refutes part of prospect theory. Instead of receiving descriptions of gambles, people choose between two buttons, each tied to a gamble. The expected values of the two gambles are approximately the same, but one is riskier than the other (for example, one button offers a 5% chance to win $10 while the other offers a 50% chance to win $1). When participants press a button, an outcome is drawn according to the gamble's odds. They are given many trials and thus learn the consequences of pressing each button. In these "choice from experience" situations, overweighting of the rare event is never observed, and underweighting is common. A possible explanation for this effect is that many participants never actually experience the rare event.
Even though this experiment appears to rely mostly on intuition, it satisfies some of the conditions Kahneman laid out earlier for acquiring genuine expertise. Because participants face a regular environment, receive immediate feedback, and have ample trials for practice, they come to know the outcomes of the two buttons and gain some predictive skill. This lessens overconfidence, and with it mistakes like overweighting.
Themes: Intuition, Deliberation, and Laziness; Human Fallibility and Overconfidence
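A small simulation shows why the rare event tends to be underweighted when the odds are learned by pressing buttons rather than read from a description. The payoffs below are the illustrative ones quoted in the summary; the number of trials is an assumption.

```python
import random

# Two buttons with roughly equal expected value (~$0.50 per press):
#   Button A: 5% chance of $10    Button B: 50% chance of $1
def press(prob: float, prize: float) -> float:
    return prize if random.random() < prob else 0.0

random.seed(1)
trials = 40  # assumed number of learning trials
rare_wins = sum(1 for _ in range(trials) if press(0.05, 10.0) > 0)

print(f"Rare $10 win seen {rare_wins} time(s) in {trials} presses of button A")
# With only a few dozen presses, many simulated participants see the rare win
# once or not at all, so experience leads them to underweight it.
```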
For comparison, Kahneman gives an example of two different people from whom a person may want advice. Adele is consistent and helpful, but not exceptional. Brian is not quite as friendly or helpful most of the time, but on some occasions he has been extremely generous. Adele is closer to a sure thing, and people generally prefer her because of their global representation of her (thus they do not overweight rare events).
This example, evaluating two people's personalities, is comparable to the participants' experience with the two buttons. People form a global view of each person and understand that the rare event is just that: rare. Thus, they prefer Adele's consistency over the small chance of Brian's exceptional generosity.
Themes: Choices, Losses, and Gains
This evidence is distinct from "choice from description": when a gamble is described as a 99% chance to win $1,000 and a 1% chance to win nothing, our attention is drawn to the rare event, and we give it more weight than we otherwise would.
Another contrast between a described gamble and the button experiment above is that with a description, people never get to experience the odds directly. It is hard for them to fully grasp that 99 times out of 100 they would win $1,000; they focus instead on the possibility of not winning.
Themes: Choices, Losses, and Gains
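To make the contrast concrete: in a described gamble the 1% is printed on the page and attracts attention, whereas in an experienced gamble a rare outcome may never appear at all in a modest number of plays. A rough calculation (the 20-play figure is an assumption):

```python
# Described gamble: 99% chance to win $1,000, 1% chance to win nothing.
expected_value = 0.99 * 1000 + 0.01 * 0
print(f"Expected value: ${expected_value:.0f}")   # $990

# In choice from experience, how often would the 1% outcome never be seen
# in 20 plays? (20 is an illustrative number of trials.)
p_never_seen = 0.99 ** 20
print(f"Chance of never experiencing the rare outcome in 20 plays: {p_never_seen:.0%}")
# Roughly 82%, which is why experience tends to underweight the rare event
# that description makes salient.
```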