Reckoning with the Risk of Catastrophes: The Challenge of Interpreting Prior Near-Miss Events

Transcription

Robin L. Dillon-Merrill
Associate Professor, McDonough School of Business,
Georgetown University
With collaborators Cathy Tinsley (Georgetown University)
and Peter Madsen (Brigham Young University)
Risk Analysis Provided in 1990 (From: Paté-Cornell & Fischbeck, 1999)
Columbia Data (CAIB, 2003)

[Chart: In-Flight Anomalies Reported for each NASA Space Shuttle Mission, 1981-2010; vertical axis 0-80 anomalies per mission, horizontal axis 1980-2010.]
Latent errors interacting with enabling conditions
• Slices = safety systems
• Holes = latent errors & hazards
• System failures occur only when the holes in several layers line up, indicating the interaction of several errors with some enabling conditions (and generally bad luck); a minimal simulation sketch follows below
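
This alignment logic lends itself to a quick numerical illustration. What follows is a minimal Monte Carlo sketch, not material from the talk: it assumes four independent safety layers with made-up hole probabilities and counts how often a hazard penetrates every layer (a failure) versus only some layers (a near-miss).

```python
import random

# Hypothetical per-layer probabilities that a hazard slips through a
# given safety layer (the "holes" in that slice). Illustrative
# assumptions only, not figures from the talk.
LAYER_HOLE_PROBS = [0.05, 0.10, 0.08, 0.10]
TRIALS = 1_000_000

random.seed(0)   # reproducible runs
failures = 0     # hazard passed every layer: the holes lined up
near_misses = 0  # hazard passed at least one layer but was caught

for _ in range(TRIALS):
    layers_passed = sum(random.random() < p for p in LAYER_HOLE_PROBS)
    if layers_passed == len(LAYER_HOLE_PROBS):
        failures += 1
    elif layers_passed > 0:
        near_misses += 1

print(f"failure rate:   {failures / TRIALS:.6f}")    # analytically 4e-5
print(f"near-miss rate: {near_misses / TRIALS:.6f}")  # roughly 0.29
```

With these assumed numbers, the analytic failure probability is the product of the per-layer hole probabilities (0.05 × 0.10 × 0.08 × 0.10 = 4 × 10⁻⁵), while partial penetrations occur on roughly 29% of trials, so near-misses outnumber catastrophes by several thousand to one. That imbalance is why near-misses carry most of the available safety signal.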
What is a Near-Miss?
• Near-miss
  – An event that has some non-trivial probability of a negative (even fatal) outcome, but whose actual outcome is non-hazardous
  – A success that could have been a failure except for good luck
• "Almost" vs. "Could Have" events are processed differently

[Diagram: "Could Have" events contrasted with "Almost" events]
Recommendations for Recognizing and Preventing Near-Misses
1. Pay attention to high-pressure situations. Ask "If I had more time and more resources, would I make the same decision?"
2. Watch for normalization of deviance. Ask "Have we always been comfortable with this level of risk? Has our policy toward this risk changed over time?"
3. Search for and uncover root causes. Ask "Why did this effect happen? What was required to produce this effect? What do we need to do to address the root cause?"
4. Demand accountability. Ask "Does the corporate culture make us feel accountable for our decisions?"
Recommendations for Recognizing and Preventing Near-Misses (continued)
5. Consider worst-case scenarios. Ask "Could we have seen other outcomes? How bad could the outcome have been?"
6. Evaluate projects at every stage. Ask "Can we 'pause and learn' something at this project milestone?"
7. Reward owning up. Ask "How can we create an organizational culture that recognizes and rewards uncovering near-misses?"
Conclusions
• Cognitive biases make near-misses hard to see
• Even when leaders recognize them, it is often hard to grasp their significance
• Thus, we miss opportunities for organizational improvement while the cost is small and before disaster strikes
Articles on Near-Misses
• Catherine H. Tinsley, Robin L. Dillon, and Matthew A. Cronin, "How Near-Miss Events Amplify or Attenuate Risky Decision Making," Management Science, September 2012, pp. 1596-1613.
• Catherine H. Tinsley, Robin L. Dillon, and Peter M. Madsen, "How to Avoid Catastrophe," Harvard Business Review, April 2011, pp. 90-97.
• Robin L. Dillon, Catherine H. Tinsley, and Matthew A. Cronin, "Why Near-Miss Events Can Decrease an Individual's Protective Response to Hurricanes," Risk Analysis, Vol. 31, No. 3, March 2011, pp. 440-449 (selected as one of six Best Papers of 2011 by the editorial staff).
• Robin L. Dillon and Catherine H. Tinsley, "How Near-Misses Influence Decision Making Under Risk: A Missed Opportunity for Learning," Management Science, Vol. 54, No. 8, August 2008, pp. 1425-1440.
• Robin L. Dillon and Catherine H. Tinsley, "Interpreting Near-Miss Events," Engineering Management Journal, Vol. 17, No. 4, December 2005, pp. 25-29.