Why Good People Do Bad Things in a Crisis

 

Did You Know

During a corporate crisis (and in other strong situations), your "ethical switch" can get temporarily and unconsciously turned off, making you vulnerable to (very) bad decision-making. This phenomenon, called ethical blindness, is a temporary, involuntary and unconscious state that is so potent that people who experience it (and make poor decisions) are often shocked and surprised by their own behaviour afterwards. Could this help explain why we see good leaders and employees doing uncharacteristically bad things in a crisis?

Why it Matters

We are seeing a wave of corporate crises caused and amplified by unethical behaviour. We normally attribute bad behaviour to bad people, and while this may be true in many cases, it does not explain why we often see good, honest and intelligent people crossing ethical boundaries during a crisis. Research indicates this may be the result of a serious, temporary and unconscious decision-making problem that can affect us all. While most corporate crisis management teams employ safeguards to mitigate common biases, ethical safeguards are rarely used. This must change, and research is yielding useful insights and practical solutions that crisis teams can adopt.

Research Insights

The following insights are drawn from the course Unethical Decision Making in Organisations and the work of Drs. Palazzo, Hoffrage and Krings at the University of Lausanne.

Ethical blindness can be defined as the temporary inability of a decision-maker to see the ethical dimension of a decision at stake. It is an unconscious, temporary and involuntary state. In strong contexts and under certain conditions, our brain's perception and reasoning become altered. Our brain focuses on certain aspects of the situation at the expense of others, and our ethical reasoning can be slowly or swiftly pushed out of our consciousness. We only get our normal ethical reasoning back online once the context weakens, or by wrestling it back through conscious effort.

And if you think it cannot happen to you, think again. Research suggests that ethical blindness is caused more by contextual conditions than by the character or intelligence of the person affected. People of high intelligence and good moral character seem just as vulnerable. Ethical blindness safeguards must therefore be designed to mitigate the strong conditions that warp our thinking, rather than relying on our good moral character to jump to our defence when we need it most.

While ethical blindness can result from a number of different organisational, cultural and environmental factors, research indicates that time pressure, authority pressure, role pressure and peer pressure can be significant drivers. It's therefore easy to see why crises are Petri dishes for ethical blindness: they are often characterised by high time pressure, more directive or authoritative management than usual, less familiar and riskier roles and responsibilities, and intense scrutiny and pressure from peers and other stakeholders.

Practical Advice

Here are some practical suggestions based on research and practical experience that, taken together, should reduce the risk of ethical blindness in a crisis management team. 

  1. Discuss ethical blindness with your team to raise awareness. This is a key first step.
  2. Add a short reflection pause into your routine decision-making process, without fail. Research shows that taking just 3 minutes of reflection time during a decision increases the probability of making an ethical decision.  
  3. Add an ethics checkpoint at the end of each meeting. For example, you could ask the team a "pre-mortem" question such as: "Assume for a moment we acted on these decisions; how would our stakeholders, the public, management and our families perceive it?" Encourage each member of the team to speak up, since crises can make people less willing to raise concerns.
  4. Don't ever let the team deviate from the code of conduct or cross ethical boundaries, even on seemingly small issues. Research suggests that ethical blindness is a slippery slope that often starts with minor ethical deviations.
  5. Task someone on the team to act as the "white-hat" or ethical watchdog to monitor and enforce points 2, 3 and 4 above. Ethical blindness can creep in if no one is specifically tasked. To create a shared responsibility, rotate this task among the team members on a daily basis.
  6. Task the Reflection Team leader (or an outsider) to provide ethical oversight. Even with the best of intentions, leaders and teams can slip into periods of ethical blindness, and an outside perspective is an important backup.
  7. Always use a formal and consultative crisis management process to make decisions and assign tasks. Gathering views from team members helps avoid narrow framing and can help ease authority pressure. If you are lucky enough to be choosing the crisis manager, pick a decisive person with a consultative decision-making style and avoid authoritarians at all costs.
  8. Assign tasks in small, doable chunks with a clear purpose, clear instructions and reasonable time frames, and prepare people for their crisis roles and responsibilities. People are less prone to ethical blindness when they feel more in control. The last thing you want is someone on the team going ethically blind and inadvertently working against the long-term interests of the team and organisation.

The free online course "Unethical Decision Making in Organisations" offers a wide range of other useful and practical safeguards that can be applied at the individual, team and organisational levels. Nearly every module of the course offers valuable insights for crisis managers. I promise you will find it time well spent (and I'm not paid to say this).

I hope you found this post interesting. Please comment below if you have observed ethical blindness during a simulation or a real crisis.

By Sebastien Hogan