Daniel Kahneman’s primary aim in Thinking, Fast and Slow is to explain human problem-solving, decision-making, and behavioral economics for readers without psychology degrees. To do that, Kahneman first introduces two ways in which people think, which he calls “System 1” and “System 2.” System 1 handles involuntary, automatic processing and is often associated with intuition. People’s intuitions are often right, but in certain circumstances System 1 makes systematic errors of judgment or is easily manipulated. System 2, on the other hand, is used in scenarios that require more deliberate effort (such as calculating 17 × 24). While System 2 can be more accurate than System 1, it too can sometimes be fooled by simple manipulations, because the brain tries to use the least amount of energy possible when confronted with something that cannot be solved automatically. Kahneman argues that the brain is lazy by nature, and that people should learn to recognize situations in which mistakes or manipulations are likely and attempt to avoid those missteps.
Kahneman demonstrates that the brain naturally tends toward System 1 because it requires less effort, but that system is prone to mistakes because it processes things extremely quickly and automatically. Thinking fast takes little energy, unlike thinking slow (for example, people will naturally stop walking if they are asked to complete a difficult mental task). Because of this, people tend to let System 1 take over. As an example, Kahneman describes a common puzzle: a bat and ball cost $1.10 together, and the bat costs one dollar more than the ball. How much does the ball cost? The intuitive answer is 10 cents, but engaging System 2 reveals that the correct answer is 5 cents. People could easily calculate this, but they tend to let their automatic thinking handle the work and therefore make mistakes. Kahneman compares these cognitive illusions to visual illusions: he brings them up so that people can recognize the value of putting in just slightly more cognitive effort.
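A quick worked solution makes the error concrete (a minimal derivation; the variable b is our notation for the ball’s price, not Kahneman’s). The puzzle’s two conditions give:

\[
\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{aligned}
\]

If the ball really cost 10 cents, the bat would cost $1.10 and the pair would total $1.20, contradicting the stated price. The intuitive answer fails a check that takes only a few seconds of deliberate effort, which is exactly Kahneman’s point.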
System 1 is a way of learning patterns. It sorts things into categories and typical examples to help complete certain tasks, like immediately recognizing that two towers are the same height, or determining the approximate average length of a set of lines. However, although System 1 does well with comparisons and averages, it does poorly with sums. Participants in a study were asked about their willingness to pay to help save birds after an oil spill. Different groups of participants stated their willingness to pay to save 2,000 birds, 20,000 birds, or 200,000 birds; though the number of birds was vastly different, the answers from the different groups were nearly identical. The emotional response to suffering birds does not scale with their number. Thus, fast thinking can lead to judgments that make no sense when compared with other responses. The goal, then, is to broaden the context in which people make decisions.

System 1 is also easily manipulated by outside factors, an effect Kahneman calls “priming.” For example, people who read words associated with money, or who are exposed to the idea of money in other ways, unconsciously become more selfish and less altruistic. People’s automatic associations, then, can change their behavior and actions in ways that they do not realize. They do not intentionally choose to be less altruistic; their System 1 makes that choice for them. Primes can be found in many places, and Kahneman’s point is that awareness of them helps ensure that people are not intentionally manipulated.
While System 1 has blind spots because it is automatic, System 2 also has issues even when people put more deliberate effort into making decisions. System 2 is devoted to tasks that require attention and effort, like counting the instances of the letter “a” on a page, or picking out a relative in a crowd. However, with more complicated questions and decisions, System 2 can be easily fooled because it can be preoccupied with other thoughts and is often lazy. For example, participants in a study were asked to watch a video of a basketball game and count the number of passes made by the team wearing white. The participants were so focused on the task that many failed to notice a woman in a gorilla costume walk into the game, pound her chest, and walk out. This illustrates the blind spots people can have when they are concentrating on something else.

The laziness of System 2 also makes people prone to simplify complicated questions in order to work less hard. When people are asked how successful a candidate might be in politics, they often substitute a far simpler question, like whether that candidate looks like a political winner. The issue with these substitutions is that the actual question often requires far more information and analysis, yet people form important opinions, make decisions, and even give money based on the easier question. Another shortcut is to use any available information as a guide, even when that information is not actually useful, an effect Kahneman calls anchoring. If a person is asked whether Gandhi was more than 114 years old when he died, they will give a much higher estimate of his age at death than if they had first been asked whether he was more than 35. System 2 is still engaged, but it leans on whatever information is available to make its decisions. It is important to recognize when that information is obviously uninformative and to avoid being swayed by it.
System 1 and System 2 are both modes of thinking that help people answer questions and make judgments. In Thinking, Fast and Slow, Kahneman demonstrates that people cannot always rely on their automatic responses, but also that even when people put in extra effort, they are prone to errors because they rely on mental shortcuts (which Kahneman calls heuristics) that can produce faulty reasoning. The goal, then, is to recognize when those heuristics are at work and attempt to correct for them in order to become more rational and accurate thinkers.
Intuition, Deliberation, and Laziness Quotes in Thinking, Fast and Slow
The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
The bat-and-ball problem is our first encounter with an observation that will be a recurrent theme of this book: many people are overconfident, prone to place too much faith in their intuitions.
The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true. More important, you must accept that they are true about you.
Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.
We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is.
We are far too willing to reject the belief that much of what we see in life is random.
The explanation is a simple availability bias: both spouses remember their own individual efforts and contributions much more clearly than those of the other, and the difference in availability leads to a difference in judged frequency.
The lesson is clear: estimates of causes of death are warped by media coverage. The coverage is itself biased toward novelty and poignancy.
People without training in statistics are quite capable of using base rates in predictions under some conditions. […] However, concern for base rates evidently disappears as soon as Tom W’s personality is described.
The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed.
For most people, the fear of losing $100 is more intense than the hope of gaining $150. We concluded from many such observations that “losses loom larger than gains” and that people are loss averse.
You read that “a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability.” The risk appears small. Now consider another description of the same risk: “One of 100,000 vaccinated children will be permanently disabled.” The second statement does something to your mind that the first does not.
People will more readily forgo a discount than pay a surcharge. The two may be economically equivalent, but they are not emotionally equivalent.
Saving lives with certainty is good, deaths are bad. Most people find that their System 2 has no moral intuitions of its own to answer the question.
The use of time is one of the areas of life over which people have some control. Few individuals can will themselves to have a sunnier disposition, but some may be able to arrange their lives to spend less of their day commuting, and more time doing things they enjoy with people they like.
The investment of attention improves performance in numerous activities—think of the risks of driving through a narrow space while your mind is wandering—and is essential to some tasks, including comparison, choice, and ordered reasoning. However, System 2 is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access.