Thinking, Fast and Slow (Part 1)
Daniel Kahneman is a psychologist who shared a Nobel Prize in Economics for the work he did with Amos Tversky on the processes people use to make decisions. This book explains their view of how we see the world and react to it. This is a big book with some big ideas, so I’ve decided to present the content in sections. This first section explains the idea that we have two processes for perceiving and reacting to our experiences. They are called System 1 and System 2 as a way to contrast them, but it is critical to understand that there are not literally two systems in the brain processing inputs. It just seems that way.
One of the origins of this work was the question: are people intuitive statisticians? There are a number of aspects of life that are intuitive (grammar is learned intuitively), but statistics turns out not to be one of them. In other words, people have a poor grasp of probability and its consequences, so they often make decisions inconsistent with statistical logic. These are not always reflexive decisions either; often there is no time pressure pushing aside logical thinking. Careful examination also showed that these decisions were not the result of emotion dominating logic; they were the result of systematic cognitive “errors”.
Intuition is an extremely important cognitive function, and System 1 presumably evolved to increase our survival prospects. One of the early realizations was that intuition is a form of recognition. People with greater expertise have greater stores of experience to use in recognition. Herbert Simon noted that “The situation has provided a cue; this cue has given the expert access to information stored in memory, and that information provides the answer. Intuition is nothing more and nothing less than recognition”. Intuition is very useful when it works, but a problem when it leads to actions that don’t work – or that cause harm.
One feature of System 1 is that it is automatic in the sense that it is always on (it operates whether you are awake or asleep). It does not require any “effort” to engage. System 2 is on when you are awake, but operates minimally until engaged. The phrase “pay attention” expresses the call to switch from System 1 alone to using System 2 as well. System 2 requires considerable effort, so it is difficult to pay attention to more than one thing at a time or to anything for extended periods. System 2 is an energy drain, and consequently we use it when we have to – but no more than that. In the end, everybody is lazy (this is biologically sensible – save your energy for when you really need it). There are many demonstrations of the energy drain that paying attention causes. One of the best known is the selective-attention (“invisible gorilla”) test, in which viewers counting basketball passes fail to notice a person in a gorilla suit walking through the scene. When we focus enough attention on something, we may become blind or deaf to our surroundings. Paying attention to one thing distracts us from others. The other observation is that we are unaware that we are blind under these conditions.
In general, their research showed that many of our decisions are made by System 1, and System 2 usually goes along with that decision. In other words, many of our decisions are intuitive and we don’t really think about them. This exposes us to a set of risks that result from the effects of cognitive conflicts, illusions, and useful fictions.
Cognitive conflicts arise when we are asked to examine things which are out of place. For example, if you are asked to read aloud a list of color words printed in colored ink, you will read the word RED printed in red much faster than the word RED printed in green. There is no conflict when the color and the word are coherent, but there is a conflict when they are not. This is an example of familiarity – we can see and understand what we are familiar with much more easily than what is new to us. System 1 prefers to examine what we are familiar with and ignore what we are not. Illusions are perceptions that are not overturned by information. One common example is the Müller-Lyer illusion, shown below, in which two lines of equal length are capped with fins pointing in opposite directions.
The upper line looks longer, but it is not. Even when you know the two lines are the same length, the upper one still looks longer. That is an illusion: System 1 sees and acts on the illusion, while System 2 knows there is an illusion and acts on the basis of reality. Useful fictions are used to enable expression. For example, the book and this summary treat Systems 1 and 2 as if they were things (or people), but they do not literally exist. It is a useful fiction to treat them this way because it makes the concepts easier to understand. We create useful fictions both in learning about and understanding events and in expressing them. Sometimes the analogy is close enough that decisions based on it work, but sometimes the analogy is essentially false, and decisions based on that false picture of circumstances are faulty.
Central to understanding the interplay between Systems 1 and 2 is the energy required to pay attention. The author suggests a simple test: go on a walk with a friend and, while walking, ask them to multiply two numbers (like 23 and 51) in their head right then. Almost certainly, they will stop walking to carry out the task. The brain does not have enough capacity to walk and multiply two-digit numbers at the same time. When people are under heavy System 2 loading, they begin to make more “selfish” choices. Research suggests that there is a single pool of mental energy which is depleted by thinking, physical effort, and emotional effort. As the pool of mental energy is drained, a person begins to cut corners to preserve what remains. The energy referred to here is real metabolic energy – not some sort of metaphoric energy. The brain’s primary fuel is glucose, and the brain is one of the body’s major consumers of glucose and generators of heat. A bout of heavy thinking can measurably decrease the concentration of glucose in the blood. Because System 2 creates this drain, it is very efficient to defer as much as possible to System 1’s automatic associative thinking in order to save energy.
System 1 is an association-driven approach to decision making, where the definition of decision is very broad. Associations between ideas are not treated by System 1 as ideas but as realities. If a person is shown two “disgusting” words, they grimace as if experiencing the reality of disgust. Two “happy” words induce a smile (these gestures may not be visible to the eye, but can be detected with instruments). Over a person’s life, they build up a huge network of associations. This network enables a phenomenon called “priming”. Priming occurs when an input triggers an association network such that the next bit of information leads to a decision. The book gives the example of the fragment SO_P. If you are shown the word EAT first, you complete it as SOUP. But if you are shown the word CLEAN first, you complete it as SOAP. The flood of information we experience is constantly priming us for decisions to come. On most occasions, the priming is useful and we make decisions automatically. Driving a car is mostly an automatic experience, where events prepare us for decisions that we will need to make: we knew that person was going to change lanes without signaling.
Such priming acts in many less obvious ways. If you make somebody ashamed, they are more likely to complete the fragment S__P as SOAP than as SOUP. And the effect can be very specific. If you make them tell a lie by email, they will subsequently prefer to buy soap over mouthwash; if they deliver the lie over the phone, they prefer the mouthwash. This is one of many such studies, each of which shows that we are easily influenced by our history and circumstances into taking predictable actions. Our self-image is of independent thinkers who take each instance as it comes and make decisions based on the relevant information. That self-image is not what is observed in these experiments.
What’s more, people are unaware of these tendencies. The idea you should focus on, however, is that disbelief is not an option. The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true. More important, you must accept that they are true about you.
In a practical demonstration of the impact of priming on people, an office kitchen had a box for people to put money into, along with a price list for coffee, tea, etc. The money collected did not always cover the consumption observed. Then a small banner photo was placed directly above the price list; the image was changed each week, some weeks showing a pair of eyes and other weeks showing flowers. Collections were much higher in the weeks showing eyes than in the weeks showing flowers. The suggestion that someone was watching was sufficient to trigger payment.
If association is one main driver of decision making, the second main influence is “cognitive ease”. In a way this is the inverse of cognitive dissonance. If something makes sense, we are at ease; a false statement creates a feeling of unease that cues us to look more closely. The determination of ease is made by System 1. Information that is easily and quickly processed is viewed as more valid than information that is difficult to process. A statement that is difficult to read due to font, contrast or color is judged less true than one that is easier to read. Thus short, simple sentences are more believable (bulleted lists might be even better). Ease can derive from feelings of familiarity, truth or effortlessness. These feelings can be based on experience and lead to good decisions. They can also be based on inappropriate analogies with past instances and mislead. For example, a repeated false statement becomes familiar; when probed later, you might recognize the statement as familiar and accept it as true. Combining a true statement with an untrue one leads you to attribute “more” truth to the whole, creating an illusion of truth. Expressed differently, humans prefer repetition to novelty, so a repeated falsehood may be preferred over a novel truth.
Anytime that System 2 is engaged, cognitive ease drops. Thus, anytime we really need to think about something, we are less comfortable. But this has an interesting corollary. There are a variety of cognitive puzzles that can be presented to people. One classic is:
In a lake, there is a patch of lily pads. Each day, the patch doubles in size. If it takes 48 days for the patch to cover the whole lake, how long would it take to cover half the lake? 24 days or 47 days?
A group of Princeton students was given a test containing puzzles like this one (the Cognitive Reflection Test, or CRT). Half saw the puzzles in crisp, clear print and half saw them in a faint but legible print. 90% of the students who saw the CRT in the normal font made at least one mistake on the test, but the proportion dropped to 35% when the font was barely legible. You read this correctly: performance was better with the bad font. Cognitive strain, whatever its source, mobilizes System 2, which is more likely to reject the intuitive answer suggested by System 1 (the correct answer to the puzzle is 47 days).
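The answer follows from working the doubling backwards: if the patch doubles every day and covers the whole lake on day 48, it must have covered half the lake exactly one day earlier. Here is a minimal sketch (my own illustration, not from the book) that makes that reasoning concrete:

```python
# Minimal sketch (not from the book): work backwards from the day the
# lily-pad patch covers the whole lake. Because the patch doubles every
# day, undoing one doubling gives the day on which it covered half.

FULL_COVERAGE_DAY = 48   # day on which the patch covers the entire lake

coverage = 1.0           # fraction of the lake covered on that day
day = FULL_COVERAGE_DAY

while coverage > 0.5:    # step back until the patch covers half the lake
    coverage /= 2        # undo one daily doubling
    day -= 1

print(day)               # prints 47 -- half coverage is one day before full
```

The same logic shows why the intuitive answer of 24 days is wrong: halving the time does not halve exponential growth.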
One interesting application of cognitive ease relates to a “sense of knowing”. In effect, people often know whether a problem can be solved even before they actually know the solution. People in a good mood are more accurate in this kind of intuitive perception than people in a bad mood: when in a good mood, people become more intuitive and more creative, but also less vigilant and more prone to logical errors.
The importance of familiarity is also shown by the speed with which abnormal events are detected. System 1 is constantly monitoring the environment and making predictions about what will happen next. Events that are consistent with the predictions cause ease, but events contrary to prediction cause surprise. Such surprise has been shown to arise in tenths of a second. This is part of a cycle of cause and effect. While standard logical thinking holds that causes induce effects, the brain treats the relationship as more symmetric: causes have effects, and effects have causes. Presented with an effect and a list of coincident events, System 1 tries to assemble the information into cause and effect. Nassim Taleb’s book, The Black Swan, describes reporting on Bloomberg News on the day that Saddam Hussein was captured. Early in the day, prices rose and the headline was U.S. TREASURIES RISE; HUSSEIN’S CAPTURE MAY NOT CURB TERRORISM. Prices fell later that day and the headline was U.S. TREASURIES FALL; HUSSEIN’S CAPTURE BOOSTS ALLURE OF RISKY ASSETS. One event “caused” opposite outcomes. More likely, there was no connection between Hussein’s capture and the price changes, but there is a need to believe that there must be a cause, and the most prominent coincident event must be it – especially if it precedes the event of interest. People are very adept at creating cause-and-effect meaning from their experiences. This expectation of causation prevents us from seeing events in a probabilistic context. System 1 seeks causation; it is System 2 that uses statistical reasoning. The expectation of causality, combined with System 1’s constant prediction, creates a wonderful ability to jump to conclusions.
One expression of cognitive ease is exaggerated emotional coherence, better known as the halo effect. The effect arises when we use the answer to one question (Is Ann a good speaker?) to answer another question for which no information is available (Is Ann generous?). For System 1, this is a quick way to solve problems based on the assumption that all things associated with something (Ann) are correlated. Assigning this overarching status to Ann eliminates inconsistency in my “feelings” about her and puts my mind at ease. If I am seeking to maintain valid opinions about a topic, I need to decorrelate my information. This may require that I deliberately create “independent” thoughts about the topic, which is a function of System 2. One common way to improve decision making is to ask for other opinions; the goal is to get an independent (uncorrelated) perspective on a situation. This is a potential benefit of engaging a group in a decision. But group discussion builds correlation within the group and degrades independence of thought. A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a brief summary of their position….The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.
Making judgments rests on a set of basic assessments. Basic assessment is a broad topic, but two specific aspects of interest are “sets and prototypes” and “intensity matching”. One of the basic assessments that System 1 makes is where to categorize a situation. This is a sophisticated version of pigeonholing: almost anything can be pigeonholed into an existing category based on previous experience. Intensity matching is a process where the answer to one question is obtained by answering a different question involving an emotional element. For example, the answer to the question “how much would you contribute to save Honduran frogs?” would be answered by considering the question “how much do I care about frogs?”. This process matches the intensity of my commitment to frogs with my willingness to contribute and supplies a quick answer.
Most judgments are made using heuristics, which are technically simplified procedures, but in this context are substitutions of one question for another. The major heuristics are:
- The 3D heuristic – This heuristic interprets a situation based on a perceptual illusion. For example, you look at a photograph and interpret it in 3D, even though it is in 2D. This is an illusion that can’t be overcome with information.
- The mood heuristic for happiness – This heuristic substitutes an evaluation of recent events for the answer to the question “Am I happy?”. The book cites an experiment in which students’ answers to the following two questions were essentially uncorrelated when asked in this order:
  - How happy are you these days?
  - How many dates have you had in the last month?
But asked in the opposite order, the answers were highly correlated. In the latter case, feelings about recent romantic experience substituted for general happiness. The first answer primed System 1 for the second question.
- The affect heuristic – People let their likes and dislikes determine their beliefs about the world. For example, if you don’t like red meat, you probably don’t think it is safe.
All of this description of System 1 makes it seem like a danger, but it is a vital part of good functioning. System 1 is the result of a constant learning process, guided by System 2. It handles many routine functions with minimal effort while monitoring the environment for threats. When you react to something “before” you are even aware of it, that is System 1’s work. Most of the decisions made by System 1 are justifiable, useful and effective. But on occasion, the results are objectively wrong. That will be the subject of later summaries.
Comments & interpretations
- System 1 might be an important part of associative creativity. This sort of “instant” connection may be the source of ideas that seem to come “out of the blue”.
- It is common advice to focus on a task. If a task is simple or complicated (a complicated task being really just a long series of simple tasks), focus reduces the error rate. But when we focus on a complex problem, we are more likely to miss important unexpected cues that emerge from the work. There is probably an optimal level of attention for any given piece of work, and it is not always intense.
- The idea that a person has a single energy pool for all activities is not a surprise. What is interesting is the impact of depletion on subsequent decisions. For example, if I spend a lot of time thinking hard about something, I lose interest in eating healthy foods, exercise, and being social. But the same must be true in reverse. If I must expend a lot of energy being social, then I am less likely to have the energy to eat well or exercise properly. We know that the US population has been gaining weight for two decades. Could this be a consequence of our increasingly complex social and work environment which requires greater cognitive loading and results in diminished control?
- Almost everybody hates PowerPoint presentations. I wonder how much of that can be traced to our intuitive understanding that PowerPoint corrupts communication and understanding. On the one hand, bulleted lists create an illusion of factuality and logic because they are easy to read. On the other hand, we can load up a slide with lots of detail and know that nobody will actually read it, because it is too much work. Presenters often exploit this by showing us such a slide and then skipping over it: here is the detail you need, but I won’t take you through it because it is hard. PowerPoint manipulates us in ways that almost no other medium does. We know this, even when we do not know how or why.
- Is one reason that people resist new ideas and innovation simply that the ideas are unfamiliar? I’d assume that the greater the novelty of an idea, the more difficult it is for people to accept. Perhaps this also links to all the stories of innovators who ultimately succeed by persistence and repetition. Eventually, a novel idea becomes familiar.