Crisis Decision Making Biases

Spare a thought for those who carry crisis decision-making responsibilities. Hindsight is a wonderful thing, and we can all be critical of such decisions in the comfort of the office or coffee room. Crisis decision making is a pressured environment in which decisions are necessary but are not always accompanied by the luxury of time or information. Nevertheless, these decisions are often significant, with real implications for resources such as time, money and people, and they may disrupt the routine of others or place the decision makers out on a limb. The difficulty is that, like anyone else, people faced with crisis decision making are prone to what are known as perceptual and cognitive biases. Such biases have been defined as “the decision-making traps that afflict all of us as we try to make choices. We fall into these traps because of cognitive limitations that are characteristic of all human beings” [1]. These biases can go a long way towards explaining poor crisis decision making, so it is useful to consider them [2].

Crisis Decision Making

  • Vividness. When a previous event is vivid, whether for positive or negative reasons, it can heighten recall and inflate estimates of the possibility and impact — in other words, the risk — of similar future events. People who vividly recall preparations for the Millennium Bug might downplay similar threats; those who witnessed a fire first-hand may do the opposite.
  • Primacy/Recency Effect. People tend to concentrate on the first or last items in a list. They may base decisions more heavily on the very first or last pieces of information they received, ignoring more salient points in the middle.
  • Overweighting of Small Probabilities. Outcomes with relatively small probabilities, when combined with vivid events, can influence the decision-making process disproportionately, while more probable events are downgraded in influence.
  • Availability. If plenty of similar examples can be easily recalled, the perceived probability of a similar outcome rises. If there are few or none, it drops, meaning novel events that pose high risks may be underestimated.
  • Attribution. Individuals place greater significance on their own decision-making and input than on the input of others. This can result in the contributions of others being wrongly judged superfluous and so ignored.
  • Sunk Cost. Decision makers stay loyal to the first course of action they chose, because of the effort already invested in it, even as it becomes increasingly unlikely that the long-term results will be positive.
  • Wishful Thinking. Individuals are attracted to positive projections of the future, which can harden into fact in the minds of decision makers. They may therefore downplay the risk, or the need to respond in a certain way, because they believe all will work out well.
  • Overconfidence. Some individuals overrate their own abilities and take decisions they feel are unchallengeable or require no further scrutiny or support. Great if they are right, but potentially disastrous if they are wrong.
  • Confirmation Bias. Perhaps linked to wishful thinking, people search for evidence that supports their preferred hypothesis about the likely outcome, or interpret the available information in that light. This may be contrary to the actual situation being faced, but the bias leaves the decision maker blind to the fact.
  • Diagnosis Bias. Once a decision has been reached, any subsequent information that challenges the diagnosis is given less prominence or even ignored. In this way flawed decisions persist even when the evidence suggests a timely change is required.
  • Cognitive Narrowing. People focus on certain elements of what they know rather than the whole picture, perhaps as a mechanism for coping with stress. For example, fixating on the technical nature of an incident at the expense of its wider secondary impacts could lead to the belief that other teams are not needed to support the response.

[1] Roberto, M. (2009) The Art of Critical Decision-Making. Chantilly, VA: The Teaching Company.

[2] MacFarlane, R. and Leigh, M. (2014) Information Management and Shared Situational Awareness. Emergency Planning College Occasional Paper Number 12. Easingwold: EPC.