Thinking, Fast and Slow
by Daniel Kahneman

Thinking, Fast and Slow: Part 3, Chapter 23 Summary & Analysis

Summary: A few years after beginning his work with Tversky, Kahneman convinced officials in the Israeli Ministry of Education of the need for a textbook on decision making and judgment. After about a year, the team Kahneman had assembled had constructed a detailed outline, written a few chapters, and run a few sample lessons. Kahneman asked each member of the group to individually estimate how long it would take to submit a finished draft. The estimates ranged from 1.5 to 2.5 years.
Analysis: Even while writing a textbook about these very topics, Kahneman fell victim to overconfidence himself, which underscores how pervasive the bias is and how difficult it is to overcome. When he polled his colleagues, each of them gave a best-case time frame for finishing the textbook.
Themes: Human Fallibility and Overconfidence
Summary: Then Kahneman asked Seymour Fox, an expert in curriculum development, whether he could think of teams similar to theirs that had developed curricula, and how long they had taken. Fox realized that many such teams never finished at all, and those that did took around seven to ten years. Until Kahneman prompted him, Fox's own estimate had been in the same range as everyone else's; he had not drawn on the prior knowledge he possessed.
Analysis: Fox, like the rest of the team, also fell victim to overconfidence, but his case was unique because he could easily have recalled information that would have disproven his intuition. Instead, he let System 1's quick impression prevail over System 2's deliberation.
Themes: Intuition, Deliberation, and Laziness; Human Fallibility and Overconfidence
Summary: The statistics that Fox provided, Kahneman writes, should have dissuaded the team from continuing the project. It took them eight years to finish it, and by that time Kahneman was neither living in Israel nor still part of the team. Enthusiasm for the textbook within the Ministry of Education had waned, and it was never used.
Analysis: The overconfidence that Kahneman and his team exhibited had real-life consequences: despite eight years of time and effort spent creating the textbook, their hard work never paid off.
Themes: Human Fallibility and Overconfidence
Summary: Kahneman learned three lessons from this incident. The first is the distinction between two methods of forecasting, which he labels the inside view and the outside view. The inside view is what the team had initially adopted to estimate their remaining time. But they erred by basing that estimate on the work they had already done: the first chapters they wrote were likely the easiest, and their commitment had been at its peak.
Analysis: The inside view and the outside view could just as easily be labeled the "System 1 view" and the "System 2 view." The inside view had essentially substituted the simpler question of how long their work so far had taken for the harder question of how much work remained.
Themes: Intuition, Deliberation, and Laziness; Human Fallibility and Overconfidence
Summary: The outside view directed Fox's attention to a class of cases similar to their own. This allowed him to come up with a base rate, which gave a better sense of the range of possibilities and showed that the group's inside-view forecasts were not even close.
Analysis: The outside view, by contrast, consisted of System 2 using a base rate. That base rate came not from a biased view of the team's own work but from the outcomes of other, comparable teams.
Themes: Intuition, Deliberation, and Laziness; Human Fallibility and Overconfidence
Summary: The second lesson is that Kahneman and his team produced a best-case scenario rather than a realistic assessment. Even though the rest of the group did not have Fox's outside information, they did not feel they needed it. They felt entirely comfortable making predictions from their individual case rather than seeking information about other groups, and they assumed they would do better than others who had similarly tried and failed.
Analysis: Even though Kahneman and the rest of the group lacked the base rate, they still fell victim to overconfidence. This is a clear case of WYSIATI ("what you see is all there is"): the group made its calculations using only the biased information available to it.
Themes: Human Fallibility and Overconfidence
Summary: The third lesson is that they should have given up on the project. This parallels the experiment that suggested the futility of teaching psychology: learning about general cases did not alter students' assessments of the individual people they were introduced to.
Analysis: The team's inability to abandon the project also suggests overconfidence: they believed that the principles that applied to other teams would not apply to them.
Themes: Human Fallibility and Overconfidence
Summary: Kahneman and Tversky coined the term "planning fallacy" to describe plans like this one: unrealistically close to best-case scenarios, and improvable by consulting the statistics of similar cases. Examples of the planning fallacy can be found in government projects, business plans, and home renovations. People begin from an overly optimistic place and end up spending more than they would have had they started with a more expensive but realistic plan.
Analysis: Kahneman then broadens his analysis of overconfidence to examples that may apply to his readers. He discusses areas in which they, too, might commit the planning fallacy, in the hope that they will take measures to avoid it.
Themes: Human Fallibility and Overconfidence
Summary: The outside view, Kahneman and Tversky found, is the cure for the planning fallacy. The method is now called reference class forecasting: using information from similar past ventures to predict how much something might cost or how long it might take. It is also important for organizations to recognize overly optimistic plans and to reward planners for precise execution instead.
Analysis: In keeping with one of the goals of the book, Kahneman lays out ways in which people can avoid the planning fallacy (a small numerical sketch of the idea follows below). Instead of taking the easy, optimistic route, finding a deliberate, realistic path often saves time and effort in the end.
Themes: Intuition, Deliberation, and Laziness; Human Fallibility and Overconfidence
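To make the contrast between the two views concrete, here is a minimal sketch of reference class forecasting in Python. The durations are invented for illustration; they merely echo the seven-to-ten-year range Fox recalled and are not data from the book.

```python
# A toy illustration of reference class forecasting (the "outside view"):
# instead of extrapolating from your own project, forecast from the
# distribution of outcomes in a class of similar past projects.
from statistics import median, quantiles

# Hypothetical completion times (in years) for comparable curriculum
# projects. Note that teams that never finished are excluded, which
# makes even this outside-view estimate optimistic.
reference_class = [7.0, 7.5, 8.0, 8.5, 9.0, 9.5, 10.0]

inside_view = 2.0  # the team's best-case guess (their range: 1.5-2.5 years)

base_rate = median(reference_class)          # typical outcome in the class
q1, _, q3 = quantiles(reference_class, n=4)  # interquartile range

print(f"Inside view:  {inside_view:.1f} years")
print(f"Outside view: ~{base_rate:.1f} years (middle half: {q1:.1f}-{q3:.1f})")
```

The gap between the two printed numbers is the planning fallacy in miniature: the inside view reflects the team's plan for itself, while the base rate reflects what actually happened to teams like theirs.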
Summary: Kahneman realizes that not only did the team commit the planning fallacy, but he was particularly at fault because he did not have an accurate baseline prediction when they started. If they had had one, they surely would not have begun the project. And because they had already invested effort, it was hard to give up at that point. In the future, he writes, he hopes he would begin with the outside view.
Analysis: Kahneman's point here touches on the "sunk cost fallacy," another example of human fallibility discussed later in the book, in which people have a difficult time letting go of projects in which they have already invested time and effort.
Themes: Human Fallibility and Overconfidence