
Normal Accidents: Living with High-Risk Technologies


Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them. The first edition fulfilled one reviewer's prediction that it may mark the beginning of accident research. In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the quintessential 'Normal Accident' of our time: the Y2K computer problem.
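
The two dimensions named in the description above lend themselves to a small illustration. The sketch below is a hypothetical Python fragment, not anything from the book: it places a handful of Perrow's example systems on the interaction and coupling axes, using quadrant assignments that roughly follow his matrix, and flags the complex/tight quadrant as the one he argues is prone to system ("normal") accidents.

    # Illustrative sketch of Perrow's two risk dimensions (assumed placements,
    # not taken verbatim from the book): interactions are "linear" or "complex",
    # coupling is "loose" or "tight". Complex + tight is the quadrant Perrow
    # argues cannot be made safe by adding more safeguards.
    from dataclasses import dataclass

    @dataclass
    class System:
        name: str
        interactions: str  # "linear" or "complex"
        coupling: str      # "loose" or "tight"

        @property
        def normal_accident_prone(self) -> bool:
            return self.interactions == "complex" and self.coupling == "tight"

    # Rough placements in the spirit of Perrow's matrix (illustrative only).
    examples = [
        System("nuclear power plant", "complex", "tight"),
        System("chemical plant", "complex", "tight"),
        System("dam", "linear", "tight"),
        System("assembly line", "linear", "tight"),
        System("university", "complex", "loose"),
        System("post office", "linear", "loose"),
    ]

    for s in examples:
        tag = "normal-accident prone" if s.normal_accident_prone else "component failures dominate"
        print(f"{s.name:20} {s.interactions:8} {s.coupling:6} -> {tag}")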



30 reviews for Normal Accidents: Living with High-Risk Technologies

  1. 5 out of 5

    Michael Burnam-Fink

    Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the idea of the 'Normal Accident', the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides numerous examples from areas of technology like nuclear power, maritime transport, chemical processing, spaceflight and mining. What he does not adequately explain is why some systems are to be regarded as inherently unsafe (nuclear power) while others have achieved such dramatic increases in safety (air travel). Perrow defines complexity as the ability of a single component in a system to affect many other components, and tight coupling as a characteristic of having close and rapid associations between changes in one part of the system and changes in another part. The basic idea is that errors in a single component cascade to other parts of the system faster than operators can detect and correct them, leading to disaster. In some cases, this is incontrovertible: a nuclear reactor has millisecond relationships between pressure, temperature, and activity in the core, all controlled by a plumber's nightmare of coolant pipes, and there's little operators can do in an emergency that doesn't potentially vent radioactive material to the environment. However, it seems to me that complexity and tight coupling are a matter of analytic frames rather than facts: complexity and coupling can be increased or reduced by zooming in or out. The choice of where the boundaries of a system lie can always be debated. My STS reading group is looking at alternative axes for analyzing systems, but I'd note that the systems that seem particularly accident prone are distinctly high energy (usually thermal or kinetic, or the potential forms of either). When something that's heated to several hundred degrees, could catch fire and explode, is moving at hundreds of miles per hour, or is the size of a city block does something unexpected, it's no surprise that the results are disastrous. And whatever Perrow might recommend, there is no industrial civilization without high-energy systems. One major problem is that Perrow's predictions of many nuclear disasters simply haven't come true. Three Mile Island aside, there hasn't been another major American nuclear disaster. Chernobyl could be fairly described as management error: while an inherently unsafe design, the reactor was pushed beyond its limits by an untrained crew as part of an ill-planned experiment before the disaster. Fukushima was hardly 'normal', in that it took a major earthquake, tsunami, and a series of hydrogen explosions to destroy the plant. The 1999 afterword on Y2K is mostly hilarious in retrospect.

    Perrow rightly rails against 'operator error' as the most frequently cited cause of accidents. Blaming operators shields the more powerful and wealthy owners and designers of technological systems from responsibility, while operators are often conveniently dead and unable to defend themselves in court. The problem is that his alternative is the normal accident - a paralyzing realization that we must simply live with an incredible amount of danger and risk all around us. Normal Accidents offers some useful, if frequently impractical, advice for creating systems that are not dangerous, but more often it tends to encourage apathy and complacency. After all, if accidents are "normal", we should get used to glowing in the dark.

  2. 4 out of 5

    Andy

    This is a classic in its field and an exemplar of an entire genre that seems to be endangered now: serious non-fiction (as opposed to the oxymoronic creative non-fiction). Perrow demonstrates why the usual suspect of "operator error" is not a good explanation of what causes major accidents. His exposition of Normal Accident Theory is too detailed for most readers to dive into fully, but the general points make sense (think about systems) and some of the specifics about how safety features can increase danger are fascinating. He also gets a bit into exploring how and why the powers that be maintain the dangers they expose us to, as well as the bogus operator-error story. Subsequent books on the same topic are better written and more up-to-date, but this is the real McCoy, and so it's worthwhile if you want to get a feel for the thought processes of someone coming up with an original idea.

  3. 4 out of 5

    Eric_W

    8/14/2011: I keep recommending this book, and with the BP disaster it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of A Sea in Flames: The Deepwater Horizon Oil Blowout by Gregg Easterbrook in the NY Times, April 23, 2011.) Note added 3/2/09: Perrow's discussion of the problems inherent in tightly coupled systems is certainly timely given the intricacies of the recent financial disaster. It is certainly a tightly coupled system, one in which the failure of a single component can cause the entire system to collapse. ** This is a totally mesmerizing book. Perrow explains how human reliance on technology and over-design will inevitably lead to failure precisely because of inherent safety design. Good companion book for those who enjoy Henry Petroski. Some quotes: "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few (p. 306)," and further on, "Risks from risky technologies are not borne equally by the different social classes [and I would add, countries]; risk assessments ignore the social class distribution of risk (p. 310)," and "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage (p. 311)."

  4. 5 out of 5

    Patricia

    Sea Story: I worked as a shipboard Radio Officer for Exxon Shipping Company on their tanker fleet, and I spent 30 days aboard the Exxon Valdez shortly after it came out of the shipyard as a brand-new tanker. When I was home on vacation, a neighbor called and told me about the ship grounding in Prince William Sound. I happened to be with a former captain of the Valdez. The first thing out of his mouth was, "I hope it was _______" (the last name of one of the two captains who rotated tours on that ship). When he heard it was Hazelwood, he was shocked and heartbroken, as they had been good friends. That accident was a recipe for disaster. Everything that could go wrong did go wrong, and at precisely the wrong ("right" for the accident) time. The conclusion I drew after the accident was this: no matter how thoroughly and carefully you set up a system to be foolproof, you can't anticipate all the weaknesses left in your system until an accident actually occurs. This book is not exactly about that, but it is somewhat relevant because it addresses "loosely coupled" vs. "tightly coupled" systems and how the more tightly coupled a system is (the more every aspect is controlled), the more accident-prone it tends to be. The Exxon Valdez was definitely a tightly coupled system. One of the fourteen reasons for the Valdez accident, according to the final 50-page NTSB report, was that "Exxon's work rules and overtime policies contributed to employee fatigue and poor morale." Enough said.

  5. 4 out of 5

    Jan

    The book discusses various systems and their tendency to fail. It is full of evidence of various failures in the past, but it's very dry and boring to crunch through all of the stories. I was also quite frustrated by the author's attempt to draw conclusions and generalizations from what was, at most, anecdotal evidence.

  6. 4 out of 5

    Sarah

    Man, was this book ever a slog. I had high hopes for it - I have a morbid habit of reading accounts of failure analysis. Plus, I thought I might learn something useful, since I work on a complex system where (at least we'd like to think) we've done a pretty good job of planning for single-component failures, but we do still have potential for unanticipated system interactions. I was aware this book was written by a sociologist, but I thought it might be even better for that reason - it could be good to get a perspective from outside my field. Sadly, I just found myself wishing the author had consulted a few more engineers. In the early chapters on Three Mile Island (which, by the way, he led off by saying "It is not necessary to understand the technology in any depth"), he'd give a confusing halfway explanation of some aspect of the situation and then just throw up his hands and basically say "it was all just too complicated for anyone to understand." I mean, just because YOU don't understand it, or even if most of the people who talked about it didn't understand it, that doesn't mean that NOBODY could understand it. And don't even get me started on the space section; Tom Wolfe's The Right Stuff is immensely entertaining, but not exactly the best resource to draw on as a technical reference. Plus, Perrow refers to the Houston flight controllers as "middle managers", which misses the point sort of hilariously - there is plenty of middle management at NASA, but that's a poor descriptor for most of the people actually on console (especially back in the days of the space race). I think part of the issue is that this book just hasn't aged that well. Maybe it was revolutionary in 1984 to suggest that operator error was a convenient scapegoat but not necessarily the whole story, but by now that feels like an obvious baseline assumption. Plus, I'm looking back from the other side of over 30 years of exponential growth in computing power. (Speaking of computers, there's a blast from the past in the 1999 reprinted edition I read; there's a whole postscript about the Y2K problem. Yep.) I have to admit I don't actually disagree with some of the conclusions Perrow draws: if a technology has a high potential for catastrophic disasters, we should consider whether it's worth pursuing because its benefits are so great and there's nothing else comparable (or because we could make it significantly safer with reasonable effort), or whether we should give it up because it's not that much better than safer alternatives. I even more or less agree with his classifications of various technologies based on those criteria. But I can't help feeling like he got to those conclusions almost in spite of himself. In most of the accidents he described in the original book, as well as the notable ones he touches on in the afterword of the updated edition (Bhopal, Chernobyl, Challenger), the root cause didn't seem to be that the system was just too complicated for anybody to understand.
Instead, there were plenty of more prosaic reasons: poor maintenance, insufficient training, management pressure, schedule pressure, etc. It wasn't that we just couldn't anticipate all the dangers; it's that somewhere along the line, people made choices that made things less safe, usually because of external pressures which seemed to have more certain and immediate negative consequences, and maybe sometimes because the people making the decisions weren't directly in the line of fire of the catastrophic failure. Perrow gets there himself! He walks right up to the idea of "externalities," and then he unfortunately drifts right back into waving his hands about how complicated all this technology is. Sigh. (Also, this book could have stood a better copy editor. There was a significant number of typos, and one lengthy section where Perrow kept referring back to the position of "flying" on a certain chart, when the word didn't actually appear anywhere on the chart at all. Came to find out he'd used "aircraft" on the chart in question, and "flying" on a table showing a different arrangement of the same set of activities, SIX CHAPTERS LATER. Consistency isn't that hard, people.) I sat down to write this review expecting I'd rate the book two stars, and instead I managed to talk myself into giving it just one instead. Maybe I should write about the books I actually liked more often, just for a change of pace.

  7. 5 out of 5

    Ryder Author Resources

    I first learned about this book from the bibliography of a Michael Crichton novel (I think -- Airframe, maybe?) more than 20 years ago, and I've read it three times since. I'm sure not everyone would find it as riveting as I did, and it can get a little dry and/or repetitive at times, but it's a fascinating exploration of complex systems. Many negative reviews focus on the book's failings, such as the fact that Perrow's "doomsday" predictions haven't come to pass or that he doesn't offer workable solutions, but to me those are unimportant details. Taken on balance, the book is a solid introduction to concepts most of us don't give any thought to at all: what's involved in the dangerous technologies that allow us to go about our daily lives, and whether or not we can ever properly safeguard the systems we create to run those technologies. The real takeaway for me is a better understanding of how the industrialized world actually *works*, and the role human psychology plays in it all (spoiler: sometimes we strengthen the systems and sometimes we weaken them). It's a reminder that we're part of the complex system of life on earth, not separate from it. The "problems" Perrow identifies are often insoluble, unless humanity undergoes a global neurological evolution or the laws of physics change, but awareness can help us at least improve our ability to manage them.

  8. 5 out of 5

    AJ Armstrong

    I had high hopes for this oft-cited work, but it unfortunately continues the perfect record sociologists have for disappointing me. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably Luddite agenda. It is obvious, almost from the outset of the work, that rather than analyzing complex systems with an interest in identifying systematic risk factors and their proper mitigation, he simply advocates abandoning anything that can't be made completely risk-free (an obviously absurd requirement). In the end, the whole exercise converts a valuable systems analysis into a pseudoscientific justification for ivory-tower liberal mores. The snide asides he casually levels at corporations or competing theories just reinforce the desire to put it aside and read one of the many much better and more balanced treatments of the topic by Perrow's abler heirs.

  9. 5 out of 5

    Kevin J. Rogers

    Dr. Perrow makes a striking point in this excellent analysis of the risks of complex technology: the very engineering safeguards that are intended to make high-risk technologies safer may in fact make them riskier by adding an additional level of complexity, one that has more to do with the perceptions and interactions of the people intending to manage the system than with the system itself. The solution, according to Dr. Perrow, is a two-dimensional analytical framework combining complex vs. linear interactions with tight vs. loose coupling. It sounds more complex than it is; Dr. Perrow makes the point that complex systems react and interact in often unanticipated ways, and the solution to managing those systems depends on the recognition of that fact. Clearly written, this book takes a potentially difficult engineering subject and brings it forward as a human question. Well done, and written for the lay reader.

  10. 4 out of 5

    Geoffry

    I had heard quite a bit about this book, and it mainly delivered. I enjoyed the attempts to quantify system accidents. I appreciated the invention of a framework to discuss accidents, work systems, and victims. I also enjoyed the numerous case studies presented in the book, because it is in the specific cases that we can glean ideas about prevention. Unfortunately, I found myself disagreeing with some of the main conclusions of his analysis, particularly those relating to abandoning certain kinds of technology, and his assertion that cost-benefit analysis should include how people feel about a technology or a particular industrial activity. In particular, I don't think we should abandon a technology because people dread accidents or overestimate its danger. That is a PR and education issue, not an accident-prevention issue.

  11. 5 out of 5

    Evelyn

    (Read for ES module) Although this looks like a dense academic textbook on high-tech technology, it's actually very accessible and absolutely fascinating to read. Has some excellent points about how over-complicating technology often leads to accidents which could easily have been prevented had a simpler method been implemented instead. Covers nuclear accidents such as Chernobyl and Three Mile Island, etc.

  12. 4 out of 5

    Steve

    This is a re-read. Highly recommended book. May serve as the core of my thesis, so I am definitely a fan.

  13. 5 out of 5

    Greg Stoll

    As I've mentioned before, I have a bit of a fascination with airplane crashes, and several books I've read mentioned this one as a seminal work in describing how accidents in complex systems happen. The main part of the book is setting up a system for categorizing systems. One dimension is "loosely coupled" versus "tightly coupled" - this roughly corresponds to how much slack there is in the system. A good example of a tightly coupled system is an assembly line where parts are going down a conveyor belt or something - if something goes wrong and messes up a widget at one station, that widget will quickly be at the next station, where it can cause other problems. The other dimension is "linear" versus "complex", which roughly describes the interactions between parts of the system. An assembly line with a conveyor belt is a good example of a "linear" system because the interactions between the different stations are pretty predictable. Usually the more compact in space a system is, the more "complex" it is, because lots of different parts of it are close together. Tightly coupled complex systems are prone to what the author calls "normal" accidents, which aren't really preventable. Basically, when a system is tightly coupled you need to have a pretty strict plan for how to deal with things when something goes wrong, because you don't have a lot of time for analysis or debate. (A military-like structure can help, although obviously this can have bad consequences for organizations that are not the military.) But complex systems require more deliberation to figure out what's actually going on, and possibly more ingenuity to find a solution. It's interesting because in retrospect, for each particular accident, it's usually easy to see what went wrong and what the people involved did wrong (or what the organization did wrong before that point). The author's point is that most of the time blaming the people involved is missing the point - these sorts of accidents are inevitable. Most of the book is looking at specific systems (nuclear power plants, chemical plants, airplanes, marine shipping, dams, spacecraft, etc.), trying to categorize them, and looking at examples of accidents. (I should point out that I'm grossly oversimplifying here...) I think I mostly agree with his points, but I really don't have the depth of experience to know how reasonable his approach is. The book was written just before Chernobyl (so the part about nuclear power plants seems prescient), but there's also an afterword written in the late 90s about the Y2K problem and how maybe everything will be fine but there will likely be unpredictable serious problems, which didn't pan out. So I dunno. The book itself is pretty academic and was kind of a slog to get through even though I am interested in the topic.
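
The reviewer's assembly-line example can be made concrete with a toy simulation. This is my own hypothetical sketch (the function and parameter names are invented, not from the book or the review): two stations joined by a buffer whose size stands in for slack. With a buffer of zero, a fault at the upstream station starves the downstream one on the very next tick (tight coupling); with a little slack, the fault is absorbed.

    # Hypothetical sketch of "slack" as buffering between two stations on a line.
    # A buffer of size 0 models tight coupling: an upstream fault stalls the
    # downstream station on the very next tick. A larger buffer models loose coupling.

    def run_line(buffer_size: int, fault_ticks: range, total_ticks: int = 10) -> int:
        buffer = buffer_size          # start with the buffer full of work-in-progress
        downstream_output = 0
        for tick in range(total_ticks):
            upstream_ok = tick not in fault_ticks
            if upstream_ok:
                buffer += 1           # station A adds one widget to the buffer
            if buffer > 0:
                buffer -= 1           # station B consumes one widget
                downstream_output += 1
            # else: station B is starved -- the upstream fault has propagated
        return downstream_output

    fault = range(3, 6)               # station A is down for ticks 3-5
    for size in (0, 3):
        made = run_line(buffer_size=size, fault_ticks=fault)
        print(f"buffer={size}: downstream produced {made} widgets in 10 ticks")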

  14. 4 out of 5

    AJ

    Normal Accidents is a pretty fascinating look at the complex systems in our lives and how failures of these systems are inevitable given their complicated inner workings. The fact that it took me over a month to finish this book is not indicative of it being boring; on the contrary, it was a very good read. While I was reading this book, I was easily able to come up with countless examples of system accidents that occurred since the book's manuscript was completed (1983): Chernobyl, Fukushima, Exxon Valdez, Bhopal, Boeing 737 MAX-8, Challenger, Columbia, etc. The book argues that for complex systems that cannot be made more linear or less complex, and for which the possible disasters outweigh the benefits (nuclear power and nuclear weapons, for example), we should abandon the technologies. Unfortunately, this is not a stance that the global elite agrees with, and thus we are stuck with disasters such as Chernobyl and Fukushima. The afterword, written in 1999, contains more analysis of disasters that happened in the late 20th century, and anticipates those that may be in store in the future (which is now our past). I found interesting a few sentences speaking of the possibility of issues coming from the complexity of our financial system. Unfortunately, the author spent an entire extra chapter talking about Y2K, something that caused approximately zero problems, rather than the financial sector, which caused a global recession unmatched by anything since the Great Depression. (Of course, as the author mentions, hindsight is 20/20, and causes us to find problems that may have remained hidden if not for the problem(s) that caused them to come to light.) I believe that even though this book is rather outdated, it is still a very useful read, and I wish that the system of corporate capitalism and short-term profits didn't reign supreme even more so today than it did in 1983. I feel even less hopeful now that anybody will take these sorts of reasoned arguments against pointlessly risky technology seriously, and that we will only be exposing ourselves to more potential disasters.

  15. 5 out of 5

    Bill Conrad

    We think of "accidents" as tragedies that plague our lives. A car crash where a beloved family member dies. A plane crash in bad weather kills hundreds. Normal Accidents takes a high-level view and shows us that incidents should be expected, and that they can be predicted. First off, this book is not a statistical analysis (i.e., car crashes are X% likely). Rather, what Charles attempts to point out is that the more complex a system gets, the more likely an accident will occur. In addition, humans have many flaws that play a large part in causing and preventing accidents. Specifically, they attain a mindset that lulls them into a false sense of security. Normal Accidents provides a framework to recognize complex systems, and it raises awareness of the prospect of preventing problems. It also makes us consider how systems are designed, how they internally interact, how they connect with other systems, and how human operators use them. There is a lot going on in this book. It begins by taking a deep dive into the Three Mile Island nuclear incident. The initial conclusion listed the primary issue as operator error. Charles argues that the complex system had many inherent flaws and complex interactions. These factors made an accident of this type inevitable. Why? He asserts that nuclear technology is relatively new and there are only a few plants around the world of that size. Therefore, the flaws inherent in its design had yet to be discovered. This is a great book; a powerful book. It is important for us as a species to understand what we have built, who we are, and where potential problems could be. I recommended it to an engineering friend of mine.

  16. 4 out of 5

    Boris

    This book describes a history of industrial accidents in a variety of industries. While the descriptions and analyses of specific accidents are very interesting, the greatest value of this book is the development of a novel theory of accidents. Indeed, it appears that in the years and decades that followed the publication of this book, the author's theory has become a standard against which other theories are gauged. In the subsequent literature about safety and risk, Normal Accidents Theory (NAT) is often discussed along with its competitor, High Reliability Theory (HRT). Perrow submits convincing arguments that complex systems are bound to fail catastrophically sooner or later - and this would include not just technological systems, but other complex systems as well (e.g. our world economy). Complex systems are analyzed with respect to the number of interconnected parts, redundant components, additional fail-safe components, how quickly anomalies will propagate within the system, system fault tolerance, the potential for damage, etc. The book concludes with a discussion about risk tolerance and what constitutes acceptable risk. The author recommends abandoning nuclear power, figuring the risks are not worth the benefits - this is amazing, as he practically predicts the Chernobyl disaster, which came to pass only a few years after the publication of the book. If you read this book in combination with Ellsberg's Doomsday Machine, you'll be seriously concerned about the survival of our species beyond the present atomic power/weapons era.

  17. 5 out of 5

    Kevin Mccormick

    Overall, I found this book interesting, but not particularly compelling. Normal Accidents starts with a thought-provoking first chapter, suggesting the central thesis of the book. However, as the chapters wear on, it becomes more apparent that his theory best applies to nuclear energy. For subsequent chapters, it starts feeling more like a square peg in a round hole. There are still plenty of individually interesting stories about catastrophic accidents in a variety of industries, which I did find educational enough to finish the book. The central idea that Perrow suggested is still incredibly applicable today - I work in distributed software systems, and the concepts presented - system accidents, component failure accidents, tight coupling, complex interactivity, and error-inducing systems - seemed aptly descriptive of the challenges present in building such systems today. However, I probably wouldn't recommend this book for anyone who is looking to gain deep insight beyond the first chapter - it simply isn't worth the time to read the rather lengthy middle part and anti-climactic conclusion. If you're simply looking for a loosely knit collection of well-researched stories about industrial accidents, this is worth reading.
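
Since the reviewer draws a parallel to distributed software, here is a hypothetical analogy of my own (not from Perrow or the review): a caller that waits indefinitely on a slow dependency is tightly coupled to it in time, while a deadline plus a fallback loosens that coupling at the cost of a degraded answer.

    # Hypothetical illustration of tight vs. loose temporal coupling between two
    # services: without a deadline, the caller's fate is tied to the dependency's;
    # with a deadline and a fallback it degrades instead of hanging.
    import concurrent.futures
    import time

    def slow_dependency() -> str:
        time.sleep(5)          # simulate an unresponsive downstream service
        return "fresh answer"

    def call_with_deadline(timeout_s: float) -> str:
        pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
        future = pool.submit(slow_dependency)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            return "cached fallback answer"   # degrade instead of hanging
        finally:
            # don't block on the stuck call; the worker thread finishes on its own
            pool.shutdown(wait=False)

    print(call_with_deadline(timeout_s=1.0))  # returns the fallback after ~1 second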

  18. 4 out of 5

    EG Gilbert

    Overall Impressions: It's a solid theoretical work on risk and organizational behavior, but a difficult read unless you enjoy lectures delivered with all the pompousness and self-assured arrogance of a late-20th-century white male academic. The use of first-person "I" statements grew annoying, as did the repeated use of the word "elites" to describe powerful people making decisions (usually to force risks onto the many for the profit and benefit of the few).

    Some Significant Points:
    • Accidents can result from multiple failures of system components, including Design, Equipment, Procedures, Operators, Supplies & materials, and Environment, abbreviated with the acronym DEPOSE. (p. 8)
    • Systems can be tightly coupled or loosely coupled, and linear or complex, then plotted on a matrix for analysis.
    • Normal Accidents Theory says that large accidents caused by the interactions of multiple small failures are inevitable. The consequences vary depending on linear vs. complex interactions and loose vs. tight coupling.
    • Common-mode failures occur when one component affects more than one process, e.g. a pump used both for circulating coolant in one system and for providing necessary heating to another. It's an efficient design, but if the pump fails, both systems are affected. (See the sketch after this review.)
    • Some systems are inherently error-inducing and others error-avoiding. An error-avoiding example is air travel: pilots, the public, and airlines are all punished by crashes and have an incentive to prevent them. For maritime shipping that is not the case: it's rare for an individual ship to sink, the consequences are not felt directly by consumers, insurance pays the owners, etc.
    • Safety devices are another system that can fail and interact in unexpected ways, often because they are added later and are not part of the original design.
    • Safety devices often just allow more risk, and the accident rate remains unchanged. Ships used to slow down in the fog. Radar allows higher speeds, which reduces the time available to correct course and avoid collision. Also, in good weather it allows higher speeds, which makes the consequences of collisions greater.
    • In crisis situations we make mental models to reduce ambiguity so we can take action. As information comes in, we try to fit it into the mental model. If it doesn't fit, we are more likely to question or reject the information than to alter the model. This is how two ships on courses to safely pass each other have collided due to a last-minute maneuver in which one ship crosses directly into the path of the other. (Which happens alarmingly more often than you might expect.) Example: a ship's captain thought he was overtaking another ship traveling in the same direction at night, and that he would pass it on his starboard side (the port side of the other ship). In reality the two ships were heading toward each other in opposite directions; staying on course would have meant no collision. As the gap narrowed, the captain could see the ships were too close, so he adjusted to port, thinking it would create more buffer as he overtook. The result put the ships closer than ever, so he adjusted again, hard to port. The mental model was flawed, so instead of his actions increasing the space, he crossed directly in front of the other ship and caused a collision.
    • A lot of accidents are caused by a bias toward continued production: keep the process running. When something starts going wrong, we search for a minimum-impact explanation first and don't consider the catastrophic. (p. 277)
    • "The main point of the book is to see these human constructions as systems..." "...the theme has been that it is the way the parts fit together, interact, that is important." (p. 351)
    • There is a discussion of expert risk assessment being different from public risk assessment. The differences are influenced by dread, consequences, and catastrophic potential.
    • Insurers used to insist on safety measures that lowered their exposure to payouts; this collaterally benefitted workers and communities. As insurers shifted to gaining profits from finance, they reduced inspections and insured more operations to gain more dollars from premiums to invest. This led to more accidents due to lower standards, but it was acceptable to insurers because it was affordable. (p. 361)
    • Risk and injury statistics can be outsourced by hiring subcontractors: the plant appears safer because the company only reports injuries to its own employees. (p. 362)

    Conclusions: The material is strong, the theory is solid, and the expertise is unquestionable (he participated in the investigation of the Three Mile Island nuclear plant accident), but the style made reading the book an exercise in perseverance in the later chapters. The tone was matter-of-fact and not condescending (mostly), but thick with unconscious biases that were common at the time but have grown increasingly unacceptable twenty years into the new century. For example (emphasis added): "We do not know the extent to which... [production schedule demand forces errors] ...or, on the other hand, the extent to which there is a 'macho' culture that provides psychic rewards for risk taking. I am sure that the first exceeds the second; a risk-taking macho culture has probably developed to make sense of..." (p. 246) "...rather like the ritual with a cannabis joint." (p. 359) "...as an organizational theorist, I am familiar with all the problems that can occur and the mistakes that can be made..." (p. 388) The lack of humility and the confidence in wild assumptions are grating, and this is not the first time I've been annoyed in that way but still found value in the material. [original review 7/2020; edit 1/2021 to correct spelling only]
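
As a footnote to the common-mode failure point above, here is a minimal sketch of my own (the component and process names are invented, purely illustrative): one shared pump serves two otherwise separate processes, so enumerating single-component failures shows immediately that losing it takes both processes down at once.

    # Hypothetical sketch of a common-mode failure: one shared component serves
    # two otherwise independent processes, so its failure takes both down at once.
    processes = {
        "reactor cooling": {"pump A", "heat exchanger"},
        "building heating": {"pump A", "radiator loop"},
        "lighting": {"diesel generator"},
    }

    def affected_by(failed_component: str) -> list[str]:
        """Processes that stop working when this single component fails."""
        return [name for name, parts in processes.items() if failed_component in parts]

    all_components = set().union(*processes.values())
    for component in sorted(all_components):
        hit = affected_by(component)
        label = "COMMON-MODE" if len(hit) > 1 else "single process"
        print(f"{component:16} -> {hit} ({label})")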

  19. 4 out of 5

    Nikolay

    A comprehensive study of famous technological and organizational accidents. A window into the world of hard engineering - nuclear, dams, chemical and petroleum plants, air and marine transport. Recommended to all SWEs, DEs, and everyone placing Engineering or Science in their title. What makes an incident an accident? How safe are nuclear power plants? If one were to break, how exactly would it break? How do we minimize risk? Have you ever heard of "loosely/tightly coupled"? How do we analyze large systems? If you like diving deep, you will enjoy following the author as he unpacks these different systems and dynamics with clarity. You will be surprised.

  20. 4 out of 5

    Karen Bilo

    It is a well-researched book and he clearly knows what he's talking about and is pulling from a litany of experts, but it is repetitive and a bit pedantic. One of his later books - Meltdown - has the essence of this book but puts it into an easier-to-digest format. This is definitely the more academic version, though; I'd consider Meltdown to be the pop version. The other problem is that it was clearly written a long time ago and doesn't include any recent disasters.

  21. 5 out of 5

    Isaac Perez Moncho

    An excellent in-depth view of why complex systems fail and how to build some anti-fragility into your systems and processes. Some parts of the book go into too much detail for my liking, as they distract from the principles taught in the book. There are many lessons in this book. The most important is that in interactively complex, tightly coupled systems, accidents become normal because they will happen; we should understand that fact in order to build safer systems.

  22. 5 out of 5

    Ferhat Culfaz

    Superb! Founder of NAT (Normal Accident Theory). Widely cited and applied by a number of organisations. One should read this book just because once you have read it, it will make you look at systems, complex ones in particular, in a fundamentally different way, especially interrelationships, safety design errors, human factors etc. Excellent.

  23. 4 out of 5

    Georg Lehner

    Enlightening reading on the assessment of systems with respect to their risk potential. Also enlightening is the reflection on social ethics and responsibilities with respect to technical risks. This book empowers the everyday person to get an informed view on technical and industrial developments.

  24. 5 out of 5

    Gary Boland

    Absolutely essential reading for anyone who works in technology. A brilliant insight into how tightly coupled complex systems have accidents built into them: all that has to happen is for enough time to pass, and they become inevitable (an accident waiting to happen). Highly recommend.

  25. 5 out of 5

    Dan Becker

    If you work with or rely upon complex systems, you need to understand this.

  26. 4 out of 5

    David

    Ok, so it reads like a textbook on risk management, but in the best possible way. It presents engaging examples and an unpopular viewpoint that grapples with very real problems.

  27. 5 out of 5

    Andrew Hatch

    Brilliant book, an essential read for anyone interested in safety and complex systems

  28. 5 out of 5

    Tony

    A summary suffices, at this point. Brilliant and useful idea, but the details are dated and unfocused.

  29. 4 out of 5

    Nils

    Complex, tightly coupled systems produce unanticipatable and potentially catastrophic accidents.

  30. 4 out of 5

    Dee Eisel

    This is a tome. It's dense, it's dated (he's talking about disasters of the late '70s and early '80s most of the time), and it's difficult to penetrate. For people who understand what he's talking about, I bet they get a lot out of it. For me, it was a struggle. I began this book in July and finally finished it today. But I did get a few things out of it. I understand more of the idea of coupling, of events that seemingly inevitably lead from one to another because of the way a system works. I am getting better at understanding systemic accidents - why in some places we would actually expect to have accidents more often than not. I have new insight into why the Bermuda Triangle really isn't a big deal at all - after reading the chapter on marine accidents, I'm more convinced than ever that you could draw a triangle over any patch of ocean and get similar results, and that in fact there are places like Chesapeake Bay where there are more sinkings, and the only reason more ships don't "vanish" there is that they're too well monitored. This is not a book I'd recommend for understanding why accidents happen. There are better books for that - I'd recommend Inviting Disaster over Normal Accidents for anyone not really interested in digging deep and getting into systems. (Did I mention it's dense? It's dense. I don't consider myself a lightweight, but I constantly found myself going back and re-reading sections to be sure I understood what Perrow was trying to get across.) One factor that will definitely affect some people's opinions is how dated the book is. The Three Mile Island investigation was recent and in some respects ongoing as he wrote. Piper Alpha, Bhopal, even Chernobyl were in the future. It is with a dark sense of appreciation that I read his prediction of a major nuclear accident within ten years - yep, he got that one right, no question. I think Perrow would much rather have been wrong. In the Scribd version I read, there is an update where he addresses some of the changes since his book was first published, but he doesn't have space to do more than address a few. I am glad to see that Bhopal was one, and sorry to see that there wasn't more made of Chernobyl. Because he is dealing with man-made systems and accidents, Perrow doesn't touch on big weather or earth-science-related disasters more than to note some ways to improve response, and that's OK. Because of its density and the datedness, I give this three of five stars. Disaster and disaster-planning buffs should read it. Everyone else can skip it.
