Did You Know
Under crisis conditions, you are more prone to poor decision-making, not only because your brain produces hasty (and often biased) decision options, but also because it uses several tricks to deceive you into thinking they are well-reasoned decisions.
Why It Matters
During a crisis, and especially at the start, when stress, emotions, and ambiguity are particularly high, you will often observe crisis managers making poor decisions and following through on them until it's too late and the damage is done. During after-action reviews, they nearly always wonder why they took the path they did when better options were clearly on the table. This is a common and serious problem in crisis management, especially among crisis teams that don't employ decision-making safeguards. Research offers insights that help break this problem down into more manageable parts.
The significant impact of biases and heuristics (mental shortcuts) on routine decision-making is well established and frequently discussed in management courses and articles, so I won't focus on it very much here. One of the best and most popular books on this topic is Thinking, Fast and Slow by psychologist and Nobel laureate Daniel Kahneman.
But what do we know about decision-making biases under crisis conditions? How do crisis conditions such as time pressure, stress, and ambiguous, fluid situations affect our biases, and to what extent? Which of the 100+ documented biases get amplified during a crisis, and which ones tend to kill individual and team performance? These are central questions in crisis management but still a relatively new topic of research. That said, Kahneman provides a few useful insights, scattered through various parts of his book, that I attempt to summarize here.
First, in high-pressure and/or ambiguous situations, our brain tends to use mental shortcuts and jump to conclusions, even more so than in everyday decision-making. In what psychologists call "strong situations", our brain's default setting is speed, not analysis: it can generate scenarios and decision options faster by relying on memories of previous experiences than by processing the facts on the ground. It uses those facts to construct a plausible story in our mind but disregards many key facts in the process. In practice, this seems to lead to a number of common and serious mistakes among crisis teams, such as: acting promptly on the first "easy to imagine" option put on the table; acting in a way that is inconsistent with the situation or facts on the ground (but consistent with the story or outcome they want); and being overly optimistic that the approach or solutions used in the last crisis will work again.
Second, and to make matters worse, our brain also employs an array of powerful, unconscious tricks to convince us that these automatic and biased decisions are good choices. Kahneman explains that our brain not only "invents causes and intentions" but also "tricks you into feeling that you are making well-reasoned decisions" that are "reinforced by pleasant feelings, illusions of truth and reduced vigilance". In these conditions, our brain releases hormones and neurotransmitters, like dopamine, that make us feel better and more confident about our decisions, whatever they are. It's no surprise, then, that in a crisis we have a hard time changing course. Our brains may have evolved this powerful trick to encourage (or force) us to act decisively and get out of a threatening situation. Whatever the reason, it does not serve us well in most (corporate) crisis management situations*. In practice, this may partly explain why we often see crisis managers and teams overestimating their chances of success, sticking to a course of action when it's clearly not working, and dismissing alternatives. One thing is very clear: our crisis teams stand little chance unless they systematically employ decision-making safeguards.
So the next time you find yourself in a crisis, feeling good and comfortable about an important decision, it's worth pausing to ask: did I run it through a consultative decision-making process? Is it truly based on facts on the ground, critical thinking, and sound reasoning? Or is my brain playing tricks on me?
Crisis teams should employ at least one formal, consultative decision-making process that engages critical thinking. It is also good practice to add further safeguards, such as a reflection team (a devil's-advocate team) or additional processes that prompt individual and team critical thinking.
I hope you find this post interesting. Please comment below if you have observed this dynamic during a simulation or a real crisis.
by Sebastien Hogan
*This effect may be less pronounced in well-trained and experienced emergency teams that deal with recurring, similar scenarios.