Thinking in Bets
Annie Duke
This is a book about decision making based on what the author (a trained psychologist and professional gambler) learned playing poker. The book is not about poker or gambling. It is about committing to an approach to decision making grounded in rationality. Duke explains that many decisions are really bets: we are choosing to take an action that will deliver consequences in the future under conditions of uncertainty.
To become a better decision maker, one of the first things to do is to recognize that luck and skill both play a role. Our problem arises when we attribute all good outcomes to skill and all bad outcomes to luck. We tend to judge the quality of a decision by the quality of its outcome, but this is inappropriate. Duke calls this behavior resulting. Resulting blocks learning to make better decisions. Low probability events happen. When you choose a high probability option over a low probability one, you are hoping that the odds prevail.
In the 2015 Super Bowl, Seattle was trailing with less than 30 seconds to play and the ball at the 2-yard line. Seattle attempted a pass play, the ball was intercepted, and Seattle lost. Pundits called it the “worst call ever”. But the statistics say that the chance of an interception was only 4%, and the probability of success with the other play choices was about equal, so the choice was rationally justified. You would take this play and win more often than not. A skilled decision with a bad outcome. The pundits were resulting.
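To make "low probability events happen" concrete, here is a minimal back-of-the-envelope check in Python. This is my sketch, not from the book: the 4% interception rate is the only number taken from the text, and the 20-call "season" is an assumed figure for illustration. Even when each individual call is very likely to work, at least one bad outcome across many such calls becomes more likely than not.

```python
# Why "low probability events happen": repeat a good decision enough
# times and some bad outcomes are nearly guaranteed.
p_interception = 0.04            # figure quoted in the text
p_success = 1 - p_interception   # 0.96 chance a single call works out

n_calls = 20                     # assumed number of similar calls in a season
p_all_good = p_success ** n_calls
p_at_least_one_bad = 1 - p_all_good

print(f"single call succeeds:     {p_success:.2f}")          # 0.96
print(f"all {n_calls} calls succeed:     {p_all_good:.2f}")  # ~0.44
print(f"at least one bad outcome: {p_at_least_one_bad:.2f}") # ~0.56
```

Judging each of those calls by its outcome would condemn a decision that was right every time it was made.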
Making rational decisions is hard – for anyone. Our brains are not really built for it. Our brains are built to make fast, “easy” decisions, and for the majority of our decisions, this works well. Most of these decisions are made based on habit in familiar situations. That is because most of our brain is dedicated to automatic decision making, while our “deliberative” brain has limited capacity that is usually fully in use. Duke makes the point strongly that we are not suffering from a lack of “will” or “discipline” when we make reflex decisions, because we do not really choose how to think. We must train our brain to have some additional reflexes.
One of our existing reflexes is thinking that we already know what we need to know. Expressed differently, we are confident in what we know, and that confidence affects our decision making. Confidence understates our mental state – we convince ourselves that we are both right and certain that we are right. One of the first steps in becoming a better decision maker is to embrace the limits of what we can be certain about. We are discouraged from saying “I don’t know” or “I’m not sure.” We regard these expressions as vague, even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision maker. We have to make peace with not knowing.*
A good decision is based on our state of knowledge, and an accurate assessment of our knowledge includes knowing what we are not certain about. This is similar to the idea of knowing what we don’t know. Very few decisions occur in isolation from other circumstances, and we can’t know all of them. Even things that we know as facts may no longer be true, or may not be true at this moment. In the past, cigarettes were advocated by doctors as an aid to health; now they are not. The coelacanth, which was thought to have been extinct for millions of years, is alive, like hundreds of other “extinct” species.
A barrier to being less certain is that it is hard to change our minds. We tend to develop beliefs very easily, and those beliefs are hard to change. Numerous studies have shown that people tend to reject information that contradicts what they believe. It seems that our actual process for forming a belief is (1) we hear/see something, (2) we believe it, and (3) maybe we will try to confirm it sometime. We like to think that step 2 is considering whether what we saw was correct, but vetting is the last thing that we do – if we do it at all. Duke describes an experiment in which people were presented a news story reporting that a fire started near a closet with old paint cans. A few minutes later, a correction was provided: the fire started elsewhere. When people then recounted the fire to the experimenters, they still tended to explain that paint fumes were a cause of the fire. Even when we read corrections, they are not very effective. Belief supports reflexive decision making.
You may be thinking that the inability to displace incorrect beliefs is a sign of poor education or lack of intelligence. The opposite appears to be true: smarter people have more trouble changing their minds because they are better at rationalizing away conflicts between their beliefs and reality. It is relatively easy to observe biased thinking in others, but it is hard to see it in ourselves. We have a blind spot for our own biases – and some evidence suggests that the blind spot is bigger for more intelligent people. Being smart is no defense against poor decision making.
The book describes some issues and strategies for improving the decision making process. One good strategy is to form a learning group. This group helps you review your thinking after a decision has been made to see what can be learned from it – a form of after-action review. It is not there to approve a decision a priori, but to provide accountability (in the sense of needing to explain to this group) and illusion-busting. This is the sort of group where everyone needs to participate as both decider and reviewer – a “flat” group. Not everyone is ready to hear this sort of feedback, so the group may not be large or stabilize quickly. This group takes advantage of our ability to see the blind spots in others to teach us about our own. The purpose of the group is not to hear you vent, either. The charter for such a learning group needs to be “exploratory” – not “affirmatory”.
“Complex and open-minded thought is most likely to be activated when decision makers learn prior to forming any opinions that they will be accountable to an audience (a) whose views are unknown, (b) who is interested in accuracy, (c) who is reasonably well-informed, and (d) who has a legitimate reason for inquiring into the reason behind participants’ judgment/choices” according to Lerner and Tetlock. To be useful, the group needs a diversity of perspectives to explore decisions in different ways. A focus on accuracy/truth, accountability and open-mindedness should guide such a group.
A drive to create an accurate understanding is the basis of increasing skill. To explore a decision means looking at it from different angles. Duke supplies a set of example questions to guide the exploration.
- Why might my belief not be true?
- What other evidence might be out there bearing on my belief?
- Are there similar areas I can look toward to gauge whether similar beliefs are true?
- What sources of information could I have missed or minimized on the way to reaching my belief?
- What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?
- What other perspectives are there as to why things turned out the way they did?
It is hard to imagine that you are someone else and use that perspective; it is easier to ask other people to provide it. They are not invested in your narrative about being right.
A second approach to learning is to foster dissent. This goes further than just being open-minded to involve opposition. But the conflict must follow a set of rules that gives everybody the same information. It is not useful when some people have more information than others. Robert K. Merton developed a set of norms that goes by the acronym CUDOS. These are:
- Communism, which means that all of the data and information belongs to the group – including conflicting information.
- Universalism, which means applying the same standards to information no matter where it comes from
- Disinterestedness, which strives to minimize conflicts of interest or preferences
- Organized Skepticism, which means structured dissent
We are familiar with the phrase “Don’t shoot the messenger”, but it is as important not to shoot the message either. We tend to attach credibility to information based on who we hear it from, and this distorts our evaluation of the information. Frequently, we can’t really evaluate the source’s credibility because we don’t have enough information or perspective about the source or their process. This is one reason that we need many sources of information and a degree of disinterest in evaluation. Skepticism plays into this because it attempts to place all information in the same condition – questionable. Skepticism gets a bum rap because it tends to be associated with negative character traits, but a skeptic asks useful questions like “Why might that be untrue?” or “What would make this conclusion incorrect?” Productive skepticism asks questions to uncover sources of uncertainty, weak logic, or poor assumptions. A group that structures this questioning through a “devil’s advocate” or a “red team” may find it easier to uncover dissent or disconfirming information.
Initiating dissent requires some communication skill. Some people will be uncomfortable hearing dissent, so these suggestions might make it easier for them.
- Start by expressing your uncertainty. You do not know that something is wrong – you wonder if it is wrong for a reason.
- Second, identify what seems correct and useful. There is almost always something that is correct that forms the basis for future action.
- Third, link your different perspective through an AND. For example, “I agree with that projection, and I think it needs to account for the uncertainty about X.”
- Fourth, ask the group to find a way to test the validity of the proposition. You are not asking to substitute your judgment for the other person’s, but you are asking to explore the issue.
A different approach (based in psychology) attempts to overcome our inability to predict the future. There is plenty of evidence that humans are poor at predicting either the outcomes or the impacts of their decisions. However, humans are quite good at explaining the causes of events once they happen. One version of this approach is to project ourselves into the future while imagining that things did not work out as wished – but maybe as expected. For example, you have something important at work tomorrow, but you want to watch the end of the game tonight. Right now you feel just fine, but if you project yourself to tomorrow morning, you will realize that you will regret staying up. That regret moves from after the decision to before it, where it might make a difference. Another version asks you to look back at a decision from 10 minutes, 10 months, or 10 years. A decision that you regret in 10 minutes might not even be important on the scale of 10 years. We exaggerate the importance of now, and imagine that we regret the immediate, temporary setback more than we will regret the lost opportunity in 10 years.
An approach based on expected value helps by changing your view of the decision and the information surrounding it. For simplicity, the expected value of a decision is the probability of the event occurring times the benefit associated with the event. Suppose you are at a fair and a booth offers a game: flip a coin three times, and if you get three heads in a row, they pay you $100. Each flip has a 50% chance of heads, so the chance of three heads is (1/2) × (1/2) × (1/2) = 0.125. How much would you pay for a ticket? The expected value of this game is 0.125 × $100 = $12.50.
Any ticket priced less than $12.50 will probably earn a gain, if you played many times. So buying a ticket for $5 is a good decision, but you will lose 87.5% of the time. There are other versions of this sort of approach, and they serve to introduce a purely rational perspective to the process. As this example shows, you are more likely to lose than gain. That does not make the $5 ticket a bad decision, but it does make a $13 ticket one.
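A minimal simulation in Python (my sketch, not from the book; the $5 ticket price, $100 payout, and fair coin are the figures from the example) makes the gap between the average result and the typical result visible: the $5 ticket earns about $7.50 per play on average even though roughly 87.5% of plays lose.

```python
import random

TICKET_PRICE = 5.00   # ticket price from the example
PAYOUT = 100.00       # payout for three heads in a row
TRIALS = 100_000      # assumed number of simulated plays

def play_once() -> float:
    """Buy one ticket, flip a fair coin three times, return the profit."""
    three_heads = all(random.random() < 0.5 for _ in range(3))
    return (PAYOUT if three_heads else 0.0) - TICKET_PRICE

profits = [play_once() for _ in range(TRIALS)]
losing = sum(1 for p in profits if p < 0)

print(f"average profit per ticket: ${sum(profits) / TRIALS:+.2f}")  # ~ +$7.50
print(f"fraction of losing plays:  {losing / TRIALS:.3f}")          # ~ 0.875
```

Raising the ticket price above $12.50 flips the average profit negative, which is exactly what makes the $13 ticket a bad bet.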
When a decision is going to have tangible consequences, there is value in slowing down and thinking about the situation. This is not always possible; emergency responses require a different sort of decision making. But it pays in the long run to think of decisions as bets. Some will work out and some won’t. Because our skill levels can rise with time and practice, we want to increase our skill so that our outcomes are less dependent on luck and more dependent on skill.
Comment and interpretation:
- Duke makes the point that ignorance is the great driver of science. Most advances are driven by people noticing something that is inconsistent with, or unexplained by, the current state of knowledge. Experiments are developed to fill the knowledge gap. According to the physicist James Clerk Maxwell, “Thoroughly conscious ignorance is the prelude to every real advance in science.”
- Our belief creation system is the secret of successful lying. If a lie is the first thing you hear, you believe it. Worse, we will believe things to be true even when they are presented as lies. Right now you may be developing a belief that we can’t tell truth from lies, or you may be rejecting this idea because this is not how you see yourself. The consequences in business decision making may be profound. Companies develop common perspectives on the world, strategy, or markets. These get entrenched as beliefs that prevent people in the company from seeing the world changing. If this is true, almost the only way that a company can change is by changing all of the people – or at least changing all of the people’s jobs, so that people come to a new area of the business with different beliefs than the incumbents.
- The book has a delightful section in which a reality show actor is explaining all of the drama going on around her and attributing the drama to everyone else. There are many people involved and it is complicated – good for reality TV, maybe not in real life. The interviewer then suggests, “…maybe you are the problem.” He explains that he used to think everyone else was the problem, and then realized that, to everyone else, he was the problem. The point is that we are part of the story, part of the problem. The question should not be “how do they get better?”, but “how do I get better?” Becoming a better decision maker is self-improvement, and your decisions are the only ones that you really control.
- I often find myself communicating with people who want to make a decision based on a summary of information. I find this frustrating because I know that the information that most influences me distorts the summary – this inevitably happens. I emphasize some parts and ignore others, which might be different from what the original author would emphasize. But more importantly, I tend to think that details matter. There needs to be a coherent understanding of the situation at low, medium, and high levels of detail (or abstraction). A summary loses nuance and ambiguity that is visible in the details. A summary may look more certain than it is, and it certainly does not explain the options for dealing with future problems. Duke describes numerous occasions when a poker hand is being discussed – it is all detail. Good decisions are made based on an understanding of the details, and a good after-action review explores the details. Why are people who are unwilling to explore the details permitted to make important decisions?
- Related to this comment on detail is a story about the “power sweep” (an American football play). John Madden went to a seminar on this play held by Vince Lombardi (who “invented” the play). On the surface, the play looks simple. The seminar lasted 8 hours, and it was not boring (to a football coach, anyway). Success requires a focus on details. In this context, I think of Toyota’s Production System, which is devoted to innovation by the people with the best understanding of production – the production employees. I think about the practice of putting young engineers in positions to learn the details of production in practice. I think about the philosophy of delegating authority to those best positioned to decide because they really know the nitty-gritty of how things work. Here is one of my biases on view.
- Check out Stuart Firestein’s TED talk at: Firestein on ignorance. He also has a book called “Ignorance: How It Drives Science”. Both make the point that an open mind means that your mind is not already full of beliefs and facts. It is hard to learn when you already know everything – I don’t know who said this, but it feels true.
*text in italics is directly quoted from the book