Wednesday, October 20, 2010

Why do we make poor risk-based decisions?

By Donald Holden, CISSP-ISSMP 

 

Everybody makes decisions about accepting risk, often without knowing the actual probability of an adverse outcome. In business, we say that security is concerned with the management of risk. When we make critical decisions that affect the privacy, safety, or security of individuals or corporate assets, do we base those decisions on objectively determined probabilities, or on our perception of risk and reward, in other words our gut feel? Our perception of risk can change when the adverse impact does not occur even though the actual probability has not changed. People who have been warned to evacuate for a hurricane that then misses them tend to minimize the risk of future hurricane warnings. Or if you own a house, have you installed a burglar and fire alarm? If so, did you do so after a break-in or fire? The probability of a future break-in is the same before and after the event that may have prompted you to install an alarm. Two academic studies provide valuable insight into how we react to perceived risk rather than calculated or statistical risk.

The first study analyzed how we lower our perceived risk after we have successfully avoided a near-miss or had a close call. A near-miss is defined as an “event that could have been a failure but for luck or chance.” The second study looked at how we postpone mitigating known critical risks in favor of short-term optimization, or, in the words of the authors, how we have a “psychological bias toward short-term maximization instead of long-term planning—a psychological bias all humans share.” Understanding and then overcoming these human biases in decision making is essential to improving the safety and security of our corporate and personal environments.

The first study, concerning near-misses, was published in Management Science[1] and discussed in the McDonough School of Business (Georgetown) Magazine[2]. Two professors from Georgetown’s McDonough School of Business, Robin Dillon-Merrill and Catherine Tinsley, examined how near-misses lead us to make decisions based on perceived rather than calculated or statistical risk, using the American Space Shuttle Program as a case study. Foam insulation had fallen during lift-off on previous shuttle flights; some had struck the heat tiles but caused no major damage. The risk that falling foam could damage the tiles had been calculated, but the experience of those near-misses shaped the decisions that led to the Space Shuttle Columbia catastrophe in 2003, in which the foam did cause major damage to the tiles. The professors’ research, and subsequent experiments in which participants played a simulation containing near-misses, showed that when people experience a near-miss, they see it as a successful avoidance of the adverse impact rather than as a near failure that could recur with an adverse outcome. Surprisingly, the researchers found that participants did not actually believe the calculated or statistical probability of the adverse impact had been reduced. It was their perception of risk that was lowered; this was an emotional, not a rational, reaction to the near-misses.

In the article “Masters of Disasters”[3], two professors at Wharton’s Risk Management and Decision Processes Center ran a computer simulation in which each participant was told they owned a house and a bank account holding $20,000 that paid 10% interest, and was warned that an earthquake could occur at any moment and that 3 to 5 mild-to-severe quakes would happen during the game. Players could spend money on structural improvements to the house or continue to earn interest on the money left in the bank. Although players initially spent some money on improvements, they postponed major ones, reasoning that a severe quake would not happen in the next few minutes of the game. By taking that risk, all players lost everything when a quake did happen. Their initial perception of risk seems to have changed based on the non-occurrence of the quake and the opportunity to earn interest. The Quake players, who included both students and corporate executives, found “a sense of security from observing the flimsiness of one another’s houses. If everyone around you has a house of straw, having a straw house yourself seems somehow safer.” Does this sound similar to how businesses look at security risks?
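The incentive structure of the Quake game can be sketched in a few lines of code. This is a toy model, not the Wharton researchers’ actual simulation: the article states only the $20,000 balance, the 10% interest, and the 3-to-5 quakes; the round count, the reinforcement cost, and the all-or-nothing damage rule are assumptions made here for illustration.

```python
import random

def play_quake(rounds=20, reinforce_round=None, seed=1):
    """Toy model of the 'Quake' game described above.

    From the article: $20,000 in the bank, 10% interest, 3-5 quakes
    during the game. The round count, the $5,000 reinforcement cost,
    and total loss of an unreinforced house are assumptions.
    """
    rng = random.Random(seed)
    bank = 20_000.0
    reinforced = False
    # Pick 3-5 random rounds in which a quake strikes.
    quake_rounds = set(rng.sample(range(rounds), rng.randint(3, 5)))
    for r in range(rounds):
        bank *= 1.10  # interest accrues on money left in the bank
        if reinforce_round is not None and r == reinforce_round and bank >= 5_000:
            bank -= 5_000  # pay for structural improvements
            reinforced = True
        if r in quake_rounds and not reinforced:
            return 0.0  # unreinforced house: the player loses everything
    return round(bank, 2)

# Reinforcing early forgoes some interest but survives every quake;
# postponing forever to chase interest guarantees ruin.
early = play_quake(reinforce_round=0)
never = play_quake(reinforce_round=None)
```

The point the toy model makes is the same as the study’s: each safe round makes waiting look smarter, because the interest keeps compounding, right up until the quake that ends the game.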

The authors of the near-miss study recommend treating near-miss events not as successes but as failures, using counterfactual thinking: imagining how an outcome could have turned out differently, and how the antecedents that led to the event might have been different. Just as we try to learn from our mistakes and failures, we need to learn from near-misses before they become failures. We should recognize that our experience with near-misses usually lowers our perception of risk. When we combine this tendency with our preference for short-term gain at the expense of longer-term impacts, we can see how a range of risk-based decisions are more emotional than rational. Recognizing this tendency in ourselves and others can help us make better, more rational risk-based decisions affecting safety and security.

Last week’s quiz question
What fact about South Hall makes it unique among Norwich University buildings?

Answer: It is LEED certified as a “green” building.

You can learn what this means at:

Winner: William Westwater

This week’s quiz question
In what year did the current Norwich University library open?

Past winners
Andrey N. Chernyaev:  5 wins
Matt Bambrick: 3 wins
Dianne Tarpy: 2 wins
Bill Lampe: 2 wins
Scott Madden: 2 wins
Sam Moore
Autumn Crossett
Gil Varney, Jr.
Glen Calvo
Thomas Reardon
Sherryl Fraser
Srinivas Chandrasekar
Marc Ariano
Linda Rosa
Joanna D'Aquanni
Srinivas Bedre
Christian Sandy
Joseph Puchalski
Ken Desforges
William Westwater




[1] Dillon R, Tinsley C. “How near-misses influence decision making under risk.” Management Science, Vol. 54, Aug 2008: 1425-40.
[2] Blose C. “Researching risky business.” McDonough School of Business Magazine, June 2009.
[3] Fagone J. “Masters of disasters.” Wharton Magazine, Summer 2010.
