Thinking, Fast and Slow
by Daniel Kahneman

Part 3, Chapter 19 Summary & Analysis


Summary: Trader-philosopher-statistician Nassim Taleb introduced the notion of a “narrative fallacy” to describe how flawed stories of the past shape our views of the world and our expectations for the future. We search for simplicity in the world, assign more weight to talent and stupidity than to luck, and focus on events that do happen rather than on those that don’t.
Analysis: The third part of Kahneman’s book focuses on human overconfidence and the mistakes we make because of it. Taleb’s points show that we believe our actions determine outcomes more than they actually do.
Summary: Compelling narratives foster an illusion of inevitability, as in the story of Google. Two creative graduate students at Stanford come up with a superior way of searching for information on the internet, obtain funding to start a company, and within a few years that company is one of the most valuable stocks in America. In this story, every decision the founders made was a good one and contributed to the company’s success.
Analysis: The story of Google is an example of the emphasis we place on our own actions. Although the narrative is true insofar as the founders made many good decisions, they were also extremely lucky, a factor that we often leave out of narratives about success.
Summary: This narrative tells only part of the story, because no story can include the myriad events that could have caused a different outcome. Bad luck could have disrupted any one of the successful steps; instead, the founders enjoyed a great deal of good luck.
Analysis: It is difficult to incorporate luck into success stories because nonevents (things that don’t happen) are hard to conceptualize, particularly because they are in some sense infinite.
Summary: Many people, Kahneman writes, claim they knew that the 2008 financial crisis was inevitable. But Kahneman explains that they could not have known it; they could only have thought it would happen and then been proven correct. We know and understand the past far less than we believe we do.
Analysis: People become particularly overconfident in hindsight. The financial-crisis example shows that people overstate their ability to know and understand the past.
Summary: When an unpredicted event occurs, we adjust our view of the world so that the surprise makes sense. We have a hard time reconstructing past states of knowledge or belief: once we adopt a new view of the world, we immediately lose much of our ability to recall what we used to believe. This leads us to underestimate how surprised we were by past events, an effect known as “hindsight bias.”
Analysis: Hindsight bias is another example of overconfidence. The brain essentially rewrites what we think our prior beliefs were, making it seem that we were more correct about an event than we actually were.
Summary: In 1972, Baruch Fischhoff conducted a survey just before President Nixon traveled to China to meet with Mao Zedong. Respondents assigned probabilities to different possible outcomes of the meeting. After Nixon’s return, they were asked to recall the probabilities they had assigned. For outcomes that had actually occurred, respondents exaggerated the probability they had originally given.
Analysis: Fischhoff’s experiment illustrates the hindsight bias Kahneman describes, along with people’s overconfidence that their answers had been correct all along.
Summary: Hindsight bias leads us to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad. We blame decision makers for good decisions that worked out badly and give them too little credit for successful moves that appear obvious only in retrospect, a tendency called “outcome bias.” The worse the consequence, the greater the hindsight bias, as in the harsh judgment of the CIA after 9/11 for failing to anticipate the attack. Hindsight bias and outcome bias usually foster risk aversion, but they also bring undeserved rewards to irresponsible people who take risky gambles and win.
Analysis: As Kahneman illustrates here, these forms of overconfidence lead us to judge people unfairly. After something has happened, we believe the outcome was more obvious than it actually was. We thus judge people using knowledge that neither they nor we had at the time, producing the cruel kind of feedback Kahneman describes.
Summary: In discussing recipes for success, Kahneman cites a study of the correlation between the quality of a CEO and the success of their firm. In a perfectly predictable world, the correlation would be 1. Instead, a generous estimate puts the correlation at .30, which Kahneman glosses as a 30% overlap of shared factors. This means that, given a pair of CEOs and a pair of firms, the stronger CEO would lead the stronger firm about 60% of the time, only 10 percentage points better than random guessing.
Analysis: Overconfidence also yields misleading narratives about success. We like to believe that successful CEOs run successful firms, but mathematically this isn’t necessarily the case, as Kahneman demonstrates by showing that the correlation is positive but not particularly strong.
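The 60% figure follows from the arithmetic of correlation, and a quick simulation makes it concrete. The sketch below is not from the book: it assumes CEO quality and firm success can be modeled as jointly normal scores with correlation .30, and the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.30  # Kahneman's "generous estimate" of the CEO-firm correlation

# Draw hypothetical (CEO quality, firm success) scores from a bivariate
# normal distribution with correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
samples = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
quality, success = samples[:, 0], samples[:, 1]

# Split the draws into random pairs of firms and ask: how often does the
# firm with the stronger CEO also turn out to be the more successful one?
q_a, q_b = quality[0::2], quality[1::2]
s_a, s_b = success[0::2], success[1::2]
concordant = (q_a > q_b) == (s_a > s_b)

print(f"stronger CEO leads stronger firm: {concordant.mean():.1%}")
# Closed form for jointly normal variables: 1/2 + arcsin(rho)/pi
print(f"analytic value: {0.5 + np.arcsin(rho) / np.pi:.1%}")
```

Both the simulated and analytic values come out near 59.7%, i.e. roughly 60%, only modestly better than the 50% expected from pure chance.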
Summary: Yet even given this statistic, people like to believe that CEOs have a great impact on the success or failure of a firm, and entire genres of literature are devoted to analyzing the success or failure of individuals and companies. But as the business school professor Philip Rosenzweig shows in his book The Halo Effect, knowing whether a company succeeded or failed strongly colors how we view its CEO. The CEO of a successful company is called flexible, methodical, and decisive; the same CEO, if things later go sour, might be called confused, rigid, and authoritarian.
Analysis: The overconfidence we place in successful CEOs plays into another bias Kahneman has already discussed: the halo effect, which is also aptly the title of Rosenzweig’s book. Under the halo effect, we tend to like (or dislike) everything about a person, and the same happens when we consider the success of a firm: it would be strange to apply the negative adjectives to a CEO while the firm is succeeding.
Summary: Many books are devoted to good managerial practices, which their authors argue lead to good results. But this ignores the fact that firms that are more or less successful could also simply be described as more or less lucky. Stories to the contrary maintain an “illusion of understanding” by supplying causal explanations for what are largely random events.
Analysis: Kahneman finishes the chapter by reiterating that many events involve a great deal of randomness, and that the success of any person or company requires luck. Still, we choose to search for and believe narratives that imply concrete causes and effects.