The Citizen's Handbook
Normal Accidents

This is a summary of the work of sociologist Charles Perrow, published in "Normal Accidents" and several other books. Above: the explosion of the space shuttle Challenger.

1 Complexity
High-risk systems often involve numerous interacting components and processes, making them inherently complex. The more complex a system, the greater the likelihood of unforeseen interactions leading to accidents. Consider a setting with a great many parts, procedures, and operators. Complexity contributed to the Deepwater Horizon oil spill in 2010. The oil drilling operation involved multiple interconnected systems and components, making it a highly complex operation. The failure of a single component, the blowout preventer, led to a series of cascading failures that resulted in one of the largest environmental disasters in history.

2 Tight Coupling
Tight coupling refers to the lack of slack or room for error in a system. In tightly coupled systems, failures can quickly cascade and amplify, leading to catastrophic outcomes. The Fukushima Daiichi nuclear disaster in 2011 exemplifies tight coupling. The nuclear power plant's systems were tightly interdependent, with little margin for error. When a massive earthquake and tsunami damaged the plant, the failures in cooling systems and backup power quickly escalated into a full-scale nuclear meltdown.

3 Interactive Complexity 
The interactions between components in a system can be highly complex and unpredictable. Small changes in one part of the system can have far-reaching and unintended consequences elsewhere. Suppose two or more failures occur among components that interact in some unexpected way. No one dreamed that when X failed, Y would also be out of order and the two failures would interact so as to both start a fire and silence the fire alarm. The Three Mile Island nuclear accident in 1979 highlighted the interactive complexity of high-risk systems. A combination of equipment failures, human errors, and miscommunication led to a partial meltdown of the reactor core. The complex interactions between these factors made it challenging for operators to diagnose and mitigate the situation effectively.

4 Organizational Hierarchies
Organizational structures, hierarchies, and communication patterns can hinder the flow of critical information and impede effective decision-making during emergencies. In the case of the Columbia space shuttle disaster in 2003, organizational hierarchies played a role. NASA's decision-making processes and communication channels were shaped by hierarchical structures that inhibited lower-level engineers from effectively communicating concerns about the shuttle's heat shield. This failure to communicate and act on critical information contributed to the loss of the shuttle during re-entry.

5 Normalization of Deviance
Over time, organizations may come to accept deviations from safety protocols as normal behavior. This normalization of deviance can increase the risk of accidents by eroding safety margins. The Challenger space shuttle disaster in 1986 exemplifies the normalization of deviance. O-rings in the shuttle's solid rocket boosters had shown signs of failure in cold temperatures, but this warning sign became normalized over time. Engineers and managers gradually accepted the deviation from safety protocols, leading to the tragic failure of the O-rings during launch and the subsequent explosion of the shuttle.

6 Stunned Bureaucracy
When the tasks people perform are well understood, predictable, routine, and repetitive, a bureaucratic structure is the most efficient. Things can be "programmed," to use March and Simon's term. When tasks are not well understood and non-routine, bureaucratic structure fails. This is worrying because American society today is shaped not so much by vast open spaces as by vast bureaucratic organizations. Over half the working population toils away at enterprises with 500 or more employees, up from zero percent in 1800.

7 Attached Mud
People do not exist just for organizations. They track all kinds of mud from the rest of their lives into the organization, and they have all kinds of interests that are independent of the organization. Human shortcomings mean that warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced. This occurs in all organizations.

The Citizen's Handbook / Charles Dobson /

The Troublemaker's Teaparty is a print version of The Citizen's Handbook published in 2003. It contains all of The Handbook plus additional material on preventing grassroots rot, strategic action, direct action and media advocacy. You can get a copy of The Teaparty from bookstores, Amazon or New Society Publishers.