Last month, during an executive education session, we were discussing how very smart people can make decisions that, with the benefit of 20/20 hindsight, seem boneheaded. In fact, when we analyzed a situation, the most common response was “What were they thinking?” We humans pride ourselves on being good thinkers and decision makers. We talk about going through a rational decision-making process, and indeed, the heart of any executive’s role is making good decisions.
In many cases, we think we follow a very rational decision-making process: we look at the problem, gather facts about it, identify alternatives and options, weigh the advantages and disadvantages of each option, and make a decision based upon the “best” option available. In fact, as we become more experienced, we think through a mental chess match of the impact of that decision on multiple layers of the organization, our partners, and our competitors, and then create a second level of rational decisions based upon what we anticipate are the most likely decisions of the other party. Rodin’s sculpture “The Thinker” and Descartes’ “I think, therefore I am” are shining examples of this rational decision-making model, which is also the basis of the scientific method. Herbert Simon won a Nobel Prize for his work on decision making and the concept of “satisficing” decisions. The most common decision involves receiving a recommendation or series of recommendations from another member of the team. Usually, the decision is whether to accept, reject, or modify the recommendation, and then either make the decision or refer it to a more senior level.
Yet recent research in behavioral economics, led by Nobel Prize winner Daniel Kahneman, outlines a series of biases that can distort the thinking and reasoning of even the most senior executives. In an article in HBR, he and his colleagues Lovallo and Sibony outline a series of tactics that you and I can use to reduce our biases and make better decisions.
Kahneman et al. note that we have two types of thinking: intuitive and reflective. Intuitive thinking is almost like autopilot: we walk, drive, brush our teeth, prepare coffee, and engage in everyday conversation without deliberate effort. Intuitive thinking is strongly linked with our senses; it creates context for words and phrases and helps us make quick links between seemingly unrelated ideas. For those who have “seen this before,” intuitive thinking can help them rapidly make sense of new situations.
Yet in the background is reflective thinking, ready to engage when we do something new, something important, or something that takes a great deal of concentration. Unfortunately, unless we intentionally engage reflective thinking, we can be led astray. If we don’t have experience with a new situation and rely on a gut feeling or hunch, we may arrive at entirely different (and boneheaded) solutions. For example, take the word “shot.” To a basketball fan, especially here in ACC country, “shot” means something completely different than it does to an Olympic track and field fan.
Kahneman and his colleagues note that simply knowing that you and I have these two types of thinking, and their accompanying biases, is necessary but insufficient. In future posts, I’ll outline some of their major points and some ways to dramatically improve your team’s decision-making processes and results.