Normal Accidents

Author Charles Perrow
Publisher Basic Books
Publication date 1984
ISBN 978-0-691-00412-9

Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow, which provides a detailed analysis of complex systems from a sociological perspective. It was the first to "propose a framework for characterizing complex technological systems such as air traffic, marine traffic, chemical plants, dams, and especially nuclear power plants according to their riskiness". Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems. Such accidents are unavoidable and cannot be designed around. [1]


Perrow's argument, based on systemic features and human error, is that big accidents tend to escalate, and that the problem lies not in the technology but in the organizations. Each of these principles remains relevant today. [1] [2]

System accidents

"Normal" accidents, or system accidents, are so-called by Perrow because such accidents are inevitable in extremely complex systems. Given the characteristic of the system involved, multiple failures that interact with each other will occur, despite efforts to avoid them. Perrow said that, while operator error is a very common problem, many failures relate to organizations rather than technology, and big accidents almost always have very small beginnings. [3] Such events appear trivial to begin with before unpredictably cascading through the system to create a large event with severe consequences. [1]

Normal Accidents contributed key concepts to a set of intellectual developments in the 1980s that revolutionized the conception of safety and risk. It made the case for examining technological failures as the product of highly interacting systems, and highlighted organizational and management factors as the main causes of failures. Technological disasters could no longer be ascribed to isolated equipment malfunction, operator error, or acts of God. [4]

Perrow identifies three conditions that make a system likely to be susceptible to normal accidents: the system is complex, the system is tightly coupled, and the system has catastrophic potential.

Three Mile Island

The inspiration for Perrow's book was the 1979 Three Mile Island accident, where a nuclear accident resulted from an unanticipated interaction of multiple failures in a complex system. [2] The event was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable". [5]

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failures however well they were managed. It was inevitable that they would eventually suffer what he termed a 'normal accident'. Therefore, he suggested, we might do better to contemplate a radical redesign, or if that was not possible, to abandon such technology entirely. [4]

New reactor designs

One disadvantage of any new nuclear reactor technology is that safety risks may be greater initially as reactor operators have little experience with the new design. Nuclear engineer David Lochbaum has explained that almost all serious nuclear accidents have occurred with what was at the time the most recent technology. He argues that "the problem with new reactors and accidents is twofold: scenarios arise that are impossible to plan for in simulations; and humans make mistakes". [6] As Dennis Berry, Director Emeritus of Sandia National Laboratory [7] put it, "fabrication, construction, operation, and maintenance of new reactors will face a steep learning curve: advanced technologies will have a heightened risk of accidents and mistakes. The technology may be proven, but people are not". [6]

Sometimes, engineering redundancies that are put in place to help ensure safety may backfire and produce less, not more, reliability. This may happen in three ways. First, redundant safety devices result in a more complex system that is more prone to errors and accidents. Second, redundancy may lead to shirking of responsibility among workers. Third, redundancy may lead to increased production pressures, resulting in a system that operates at higher speeds but less safely. [8]
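As a rough numerical illustration of this trade-off (a sketch with invented probabilities, not an analysis from the book): adding redundant devices sharply reduces the chance that all of them fail independently, but even a small common-cause failure probability introduced by the added complexity can dominate the overall risk.

```python
# Illustrative only: invented probabilities showing how a common-cause
# failure term can offset the gain from adding redundant safety devices.

def p_safety_function_fails(p_device: float, n_devices: int, p_common_cause: float) -> float:
    """Probability that the safety function fails.

    Assumes the function fails if all n devices fail independently, or if a
    single common-cause event (e.g. a shared design flaw or maintenance error
    arising from the extra complexity) defeats them all at once.
    """
    all_fail_independently = p_device ** n_devices
    return 1 - (1 - all_fail_independently) * (1 - p_common_cause)

# One device, with no common-cause term attributed to extra complexity.
print(p_safety_function_fails(0.01, 1, 0.0))    # ≈ 0.01
# Three redundant devices, but the added complexity contributes a small
# common-cause failure probability that now dominates the total risk.
print(p_safety_function_fails(0.01, 3, 0.005))  # ≈ 0.005
```

The point of the sketch is only that the benefit of redundancy is bounded by whatever failure modes the redundancy itself introduces.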

Readership

Normal Accidents is a very widely cited book, with more than 1,000 citations in the Social Sciences Citation Index and Science Citation Index to 2003. [8] A German translation of the book was published in 1987, with a second edition in 1992. [9]

Related Research Articles

Three Mile Island accident: 1979 nuclear accident in Pennsylvania, US

The Three Mile Island accident was a partial meltdown of the Three Mile Island, Unit 2 (TMI-2) reactor on the Susquehanna River in Londonderry Township, Pennsylvania, near the Pennsylvania capital of Harrisburg. It began at 4 a.m. on March 28, 1979. It is the most significant accident in U.S. commercial nuclear power plant history. On the seven-point International Nuclear Event Scale, it is rated Level 5 – Accident with Wider Consequences.

Nuclear meltdown: severe nuclear reactor accident that results in core damage from overheating

A nuclear meltdown is a severe nuclear reactor accident that results in core damage from overheating. The term nuclear meltdown is not officially defined by the International Atomic Energy Agency or by the United States Nuclear Regulatory Commission. It has, however, been defined to mean the accidental melting of the core of a nuclear reactor, and in common usage it refers to the complete or partial collapse of the core.

Nuclear power plant: thermal power station where the heat source is a nuclear reactor

A nuclear power plant (NPP) is a thermal power station in which the heat source is a nuclear reactor. As is typical of thermal power stations, heat is used to generate steam that drives a steam turbine connected to a generator that produces electricity. As of 2022, the International Atomic Energy Agency reported there were 422 nuclear power reactors in operation in 32 countries around the world, and 57 nuclear power reactors under construction.

Nuclear and radiation accidents and incidents: severe disruptive events involving fissile or fusile materials

A nuclear and radiation accident is defined by the International Atomic Energy Agency (IAEA) as "an event that has led to significant consequences to people, the environment or the facility. Examples include lethal effects to individuals, large radioactivity release to the environment, reactor core melt." The prime example of a "major nuclear accident" is one in which a reactor core is damaged and significant amounts of radioactive isotopes are released, such as in the Chernobyl disaster in 1986 and Fukushima nuclear disaster in 2011.

Redundancy (engineering): duplication of critical components to increase reliability of a system

In engineering, redundancy is the intentional duplication of critical components or functions of a system with the goal of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing.

A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity.

Nuclear safety and security: regulations for uses of radioactive materials

Nuclear safety is defined by the International Atomic Energy Agency (IAEA) as "The achievement of proper operating conditions, prevention of accidents or mitigation of accident consequences, resulting in protection of workers, the public and the environment from undue radiation hazards". The IAEA defines nuclear security as "The prevention and detection of and response to, theft, sabotage, unauthorized access, illegal transfer or other malicious acts involving nuclear materials, other radioactive substances or their associated facilities".

Charles B. Perrow was an emeritus professor of sociology at Yale University and visiting professor at Stanford University. He authored several books and many articles on organizations, and was primarily concerned with the impact of large organizations on society.

A system accident is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be of technology or of human organizations, and is frequently both. A system accident can be easy to see in hindsight, but extremely difficult in foresight because there are simply too many action pathways to seriously consider all of them. Charles Perrow first developed these ideas in the mid-1980s. William Langewiesche in the late 1990s wrote, "the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur."

Nuclear history of the United States describes the history of nuclear affairs in the United States, whether civilian or military.

Nuclear safety in the United States: US safety regulations for nuclear power and weapons

Nuclear safety in the United States is governed by federal regulations issued by the Nuclear Regulatory Commission (NRC). The NRC regulates all nuclear plants and materials in the United States except for nuclear plants and materials controlled by the U.S. government, as well as those powering naval vessels.

Anti-nuclear movement in the United States: movement opposing the use of nuclear power, weapons, and/or uranium mining

The anti-nuclear movement in the United States consists of more than 80 anti-nuclear groups that oppose nuclear power, nuclear weapons, and/or uranium mining. These have included the Abalone Alliance, Clamshell Alliance, Committee for Nuclear Responsibility, Nevada Desert Experience, Nuclear Information and Resource Service, Physicians for Social Responsibility, Plowshares Movement, Women Strike for Peace, and Women's International League for Peace and Freedom. Some fringe aspects of the anti-nuclear movement have delayed construction or halted commitments to build some new nuclear plants, and have pressured the Nuclear Regulatory Commission to enforce and strengthen the safety regulations for nuclear power plants. Most groups in the movement focus on nuclear weapons.

Nuclear power debate: controversy over the use of nuclear power

The nuclear power debate is a long-running controversy about the risks and benefits of using nuclear reactors to generate electricity for civilian purposes. The debate about nuclear power peaked during the 1970s and 1980s, as more and more reactors were built and came online, and "reached an intensity unprecedented in the history of technology controversies" in some countries. Thereafter, the nuclear industry created jobs and focused on safety, and public concerns mostly waned.

Nuclear reactor accidents in the United States: nuclear reactor accidents that occurred in the United States

The United States Government Accountability Office reported more than 150 incidents from 2001 to 2006 of nuclear plants not performing within acceptable safety guidelines. According to a 2010 survey of energy accidents, there have been at least 56 accidents at nuclear reactors in the United States. The most serious of these was the Three Mile Island accident in 1979. Davis-Besse Nuclear Power Plant has been the source of two of the top five most dangerous nuclear incidents in the United States since 1979. Relatively few accidents have involved fatalities.

The Investigation Committee on the Accident at the Fukushima Nuclear Power Stations of Tokyo Electric Power Company was formed on June 7, 2011 by the Japanese government as an independent body to investigate the March 2011 Fukushima Daiichi nuclear disaster. The Investigation Committee issued an interim report in December 2011 and its final report in July 2012.

The VVER-TOI or WWER-TOI is a generation III+ nuclear power reactor based on VVER technology developed by Rosatom. The VVER-TOI design is intended to improve the competitiveness of Russian VVER technology in international markets. It would use VVER-1300/510 water pressurized reactors constructed to meet modern nuclear and radiation safety requirements.

The vulnerability of nuclear plants to deliberate attack is of concern in the area of nuclear safety and security. Nuclear power plants, civilian research reactors, certain naval fuel facilities, uranium enrichment plants, fuel fabrication plants, and even potentially uranium mines are vulnerable to attacks which could lead to widespread radioactive contamination. The attack threat is of several general types: commando-like ground-based attacks on equipment which if disabled could lead to a reactor core meltdown or widespread dispersal of radioactivity; and external attacks such as an aircraft crash into a reactor complex, or cyber attacks.

U.S. non-military nuclear material is regulated by the U.S. Nuclear Regulatory Commission, which uses the concept of defense in depth when protecting the health and safety of the public from the hazards associated with nuclear materials. The NRC defines defense in depth as creating multiple independent and redundant layers of protection and response to failures, accidents, or fires in power plants. For example, defense in depth means that if one fire suppression system fails, there will be another to back it up. The idea is that no single layer, no matter how robust, is exclusively relied upon; access controls, physical barriers, redundant and diverse key safety functions, and emergency response measures are used. Defense in depth is designed to compensate for potential human and mechanical failures, which are assumed to be unavoidable.

Defense in depth uses multi-layered protections, similar to redundant protections, to create a reliable system despite any one layer's unreliability.
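As a back-of-the-envelope illustration of the layered-protection arithmetic (a sketch with invented numbers, not NRC figures): if each layer independently stops a fault with some probability, the chance that a fault penetrates every layer is the product of the individual miss probabilities, which is why no single layer needs to be perfect. A common-mode event that defeats all layers at once is what undermines this arithmetic.

```python
from math import prod

# Illustrative defense-in-depth arithmetic with invented numbers.
# p_miss[i] is the probability that layer i fails to stop a given fault.
p_miss = [0.1, 0.05, 0.2, 0.1]

# Under the independence assumption, a fault must slip past every layer.
p_penetrates_all = prod(p_miss)
print(p_penetrates_all)  # ≈ 1e-4: far smaller than any single layer's miss rate

# A common-mode event (fire, flood, shared design flaw) that defeats all
# layers at once bypasses the layered protection entirely.
p_common_mode = 0.001
print(1 - (1 - p_penetrates_all) * (1 - p_common_mode))  # ≈ 0.0011
```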

References

  1. Whitney, Daniel (2003). ""Normal Accidents" by Charles Perrow - Reviewed by Daniel Whitney". Massachusetts Institute of Technology. CiteSeerX 10.1.1.359.7385.
  2. Clearfield, Chris; Tilcsik, András (2018). Meltdown: Why Our Systems Fail and What We Can Do About It. New York: Penguin Press. ISBN 9780735222632.
  3. Perrow, Charles (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books. p. 5.
  4. Pidgeon, Nick (22 September 2011). "In retrospect: Normal accidents". Nature. 477 (7365): 404–405. Bibcode:2011Natur.477..404P. doi:10.1038/477404a. S2CID 4419144.
  5. Perrow, C. (1982). "The President's Commission and the Normal Accident". In Sils, D.; Wolf, C.; Shelanski, V. (eds.), Accident at Three Mile Island: The Human Dimensions. Boulder: Westview. pp. 173–184.
  6. Sovacool, Benjamin K. (August 2010). "A Critical Evaluation of Nuclear Power and Renewable Electricity in Asia". Journal of Contemporary Asia. 40 (3): 381.
  7. Sovacool, Benjamin K. Contesting the Future of Nuclear Power ...
  8. Sagan, Scott D. (March 2004). "Learning from Normal Accidents" (PDF). Organization & Environment. Archived from the original (PDF) on 2004-07-14.
  9. See data of the book in the German National Library.