In our very special 100th episode, we attempt to answer our title question with a discussion around the book, “Normal Accidents: Living with High-Risk Technologies” by Charles Perrow. This book was first published in 1984, but later editions were released in 1999 and beyond.
The book explains Perrow’s theory that catastrophic accidents are inevitable in tightly coupled, complex systems. His theory holds that failures will interact in multiple, unforeseen ways that are virtually impossible to anticipate.
Charles B. Perrow (1925 – 2019) was an emeritus professor of sociology at Yale University and visiting professor at Stanford University. He authored several books and many articles on organizations and their impact on society. One of his most cited works is Complex Organizations: A Critical Essay, first published in 1972.
- David and Drew reminisce about the podcast and achieving 100 episodes
- Outsiders from sociology, management, and engineering entered the field in the 70s and 80s
- Perrow was not a safety scientist, as he positioned himself against the academic establishment
- Perrow’s strong bias against nuclear power weakens his writing
- The 1979 near-disaster at Three Mile Island – Perrow was asked to write a report, which became the book, “Normal Accidents…”
- The core tenets of Perrow’s argument:
- Start with a ‘complex, high-risk technology’ – aircraft, nuclear power, etc.
- Two or more failures interact to start the accident
- “Interactive Complexity”
- Boeing 787 failures – a failed system plus an unexpected operator response led to disaster
- There will always be separate individual failures, but can we predict or prevent the ‘perfect storm’ of multiple failures at once?
- Better technology is not the answer
- Perrow predicted complex high-risk technology to be a major part of future accidents
- Perrow believed nuclear power/nuclear weapons should be abandoned – risks outweigh benefits
- Reasons people may see his theories as wrong:
- If you believe the risk assessments of nuclear power are correct, then his theories are wrong
- If they are contrary to public opinion and values
- If organizations can be made more safe and error-free
- If there is a safer way to run the systems outside all of the above
- The modern takeaway is a tradeoff between adding more controls and increasing complexity
- The hierarchy of designers vs operators
- We don’t think nearly enough about the role of power – who decides versus who actually takes the risks?
- There should be incentives to reduce the complexity of systems and the uncertainty it creates
- To answer this show’s question – not entirely, and we are constantly asking why
“Perrow definitely wouldn’t consider himself a safety scientist, because he deliberately positioned himself against the academic establishment in safety.” – Drew
“For an author whom I agree with an awful lot about, I absolutely HATE the way all of his writing is colored by…a bias against nuclear power.” – Drew
“[Perrow] has got a real skepticism of technological power.” – Drew
“Small failures abound in big systems.” – David
“So technology is both potentially a risk control, and a hazard itself, in [Perrow’s] simple language.” – David