Episode Summary

In today’s episode, we discuss another paper in our series of foundational papers: “Applying Systems Thinking to Analyze and Learn from Events,” published by Nancy Leveson in a 2011 volume of Safety Science. Leveson is a renowned Professor of Aeronautics and Astronautics and Professor of Engineering Systems at MIT, and an elected member of the National Academy of Engineering (NAE). Her research covers system safety, software safety, software and system engineering, and human-computer interaction.

Episode Notes

We will review each section of Leveson’s paper and discuss how she sets up each section by stating a general assumption and then proceeding to break that assumption down. We will discuss her analysis of:

  1. Safety vs. Reliability
  2. Retrospective vs. Prospective Analysis
  3. Three Levels of Accident Causes:
     • Proximal event chain
     • Conditions that allowed the event
     • Systemic factors that contributed to both the conditions and the event

Discussion Points:

  • Unlike some others, Leveson makes her work openly available on her website
  • Leveson’s books, SafeWare: System Safety and Computers (1995) and Engineering a Safer World: Systems Thinking Applied to Safety (2011)
  • Drew describes Leveson as a “prickly character”; he once worked for her and was eventually fired by her
  • Leveson came to engineering with a psychology background
  • Many safety professionals express concern that major accidents keep happening, asking: ‘Why can’t we learn enough to prevent them?’
  • The first section of Leveson’s paper: Safety vs. Reliability – sometimes these concepts are at odds, sometimes they are the same thing
  • How cybernetics used to be ‘the thing’ but the theory of simple feedback loops fell apart
  • Summing up this section: safety is not the sum of the reliability of individual components
  • The second section of the paper: Retrospective vs. Prospective Accident Analysis
  • Most safety experts rely on retrospective accident analysis and agree that it is still the best way to learn
  • Example – where technology changes slowly (e.g., airplanes), it’s acceptable to run a two-year investigation into accident causes
  • Example – where technology changes quickly (e.g., the 1999 Mars Climate Orbiter crash vs. the Mars Polar Lander crash), there is no way to use retrospective analysis to change the next iteration in time
  • The third section of the paper: Three Levels of Analysis
  • It’s easiest to find the causes in the proximal event chain and the conditions that allowed the event; identifying the systemic factors is more difficult because the causal link is indirect and harder to draw
  • The “5 Whys” method for analyzing an event or failure
  • Practical takeaways from Leveson’s paper:
  • STAMP (System-Theoretic Accident Model and Processes) using the accident causality model based on systems theory
  • Investigations should focus on fixing the part of the system that changes slowest
  • The exact front line events of the accident often don’t matter that much in improving safety
  • Closing question: “What exactly is systems thinking?” It is the adoption of Rasmussen’s causation model – that accidents arise from a change in risk over time – and analyzing what causes that change in risk


Quotes:

“Leveson says, ‘If we can get it right some of the time, why can’t we get it right all of the time?’” – Dr. David Provan

“Leveson says, ‘the more complex your system gets, that sort of local autonomy becomes dangerous because the accidents don’t happen at that local level.’” – Dr. Drew Rae

“In linear systems, if you try to model things as chains of events, you just end up in circles.” – Dr. Drew Rae

“Never buy the first model of a new series [of new cars]; wait for the subsequent models where the engineers have had a chance to iron out all the bugs of that first model!” – Dr. David Provan

“Leveson says the reason systemic factors don’t show up in accident reports is just because it’s so hard to draw a causal link.” – Dr. Drew Rae

“A lot of what Leveson is doing is drawing on a deep well of cybernetics theory.” – Dr. Drew Rae


Resources:

Applying Systems Thinking Paper by Leveson

Nancy Leveson– Full List of Publications

Nancy Leveson of MIT

The Safety of Work Podcast

The Safety of Work on LinkedIn