Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.
The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.
Publisher: Princeton University Press
Edition description: Updated edition with a new afterword and a new postscript by the author
Product dimensions: 6.00(w) x 9.25(h) x (d)
About the Author
Charles Perrow is Professor of Sociology at Yale University. His other books include The Radical Attack on Business, Organizational Analysis: A Sociological View, Complex Organizations: A Critical Essay, and The AIDS Disaster: The Failure of Organizations in New York and the Nation.
Table of Contents
Abnormal Blessings vii
1. Normal Accident at Three Mile Island 15
2. Nuclear Power as a High-Risk System: Why We Have Not Had More TMIs--But Will Soon 32
3. Complexity, Coupling, and Catastrophe 62
4. Petrochemical Plants 101
5. Aircraft and Airways 123
6. Marine Accidents 170
7. Earthbound Systems: Dams, Quakes, Mines, and Lakes 232
8. Exotics: Space, Weapons, and DNA 256
9. Living with High-Risk Systems 304
Postscript: The Y2K Problem 388
List of Acronyms 413
Most Helpful Customer Reviews
I enjoyed this book very much. As an engineer, I gained some insights from it that were new to me. I especially appreciated how Perrow showed that sociology has an important and understandable role to play in safety.
This book was recently reviewed positively in the Economist, which is usually a fairly good tip, and as I am interested in systems and complexity, I picked it up. Having now finished its 400-odd pages, I am not entirely convinced that the Economist reviewer actually ploughed through the whole book.

It starts well, with a convincing development of a theory of the inevitability of accidents in systems which are both complex and tightly coupled. These concepts are explored in some detail, with numerous interesting case studies (of varying quality). This first section, in which the core thesis is developed, is excellent, and a useful contribution to the study of complex systems, risk, and failure.

However, Perrow then goes on to draw conclusions such as "nuclear power should be abandoned because it can never be made safe". In attempting to shore up this rather shaky position, some quite dubious analyses are put forward, including the arguments that space missions can never cause catastrophes for third parties (there is a village in China which would disagree), that aircraft carriers are "self-correcting" systems, and that chemical accidents are inherently small-scale (by contrast to the nuclear industry). The events at Bhopal would tend to destroy this last argument: they are explicitly addressed in a postscript, where they are termed a "non-systemic" accident by drawing the boundaries of the event in a rather arbitrary manner (a passingly acknowledged weakness of "Normal Accident Theory").

The final postscript on Y2K issues is also somewhat spurious.

Overall, worth reading--but if you feel like stopping halfway through, you're not missing much.