When Systems Fracture

On the tendency of advanced technology to promote self-deception

As the enormity of September 11 sank in, a fertilizer factory near Toulouse, France, exploded, killing 29 people and hospitalizing at least 780. The first event was terrorism; French authorities say the other was almost certainly an accident. But think of the technological vulnerabilities the assailants exploited: hundred-story towers with single exits, jumbo jets loaded with fuel. Like potentially explosive chemical plants, these engineering landmarks have become part of the fabric of advanced industrial society.

The independent writer James R. Chiles '77 completed Inviting Disaster: Lessons from the Edge of Technology after the happy outcome of the Year 2000 Crisis had calmed technological anxiety. Security against attack is not his major concern. But his book reminds us that even without lethal fanaticism, the human-made world is more dangerous than ever. Technological risk has not vanished, and indeed the number of disasters and fatalities has multiplied. There are more potentially hostile states with intercontinental ballistic missiles--meaning more risk of rapid launches and false warnings like those that indicated Soviet attacks in 1979 and 1980. Chemical plants and electric-generating systems operate with unprecedented pressures and temperatures. Superjumbo aircraft and 10,000-container ships are on the way. Freak accidents can cripple the most sophisticated technology. Last year, an 18-inch strip of titanium on a Paris runway triggered a chain reaction of failures in an Air France Concorde, leading to flames, loss of control, and 113 deaths.

Chiles sees these catastrophes as "system fractures," comparing technical defects to the tiny cracks that appear in newly milled aluminum. Harmless initially, they can grow and propagate, especially if the surface is allowed to corrode or is cut the wrong way, as was demonstrated when square windows, originally a stunning design innovation, inadvertently promoted a series of tragic breakups that doomed the de Havilland Comet, and with it the British lead in commercial jet aviation. The safety of complex systems demands a series of technical and human measures to keep the cracks from spreading. Inviting Disaster is not just an anatomy of failure but a study of successful "crack-stopping." Thus, while acknowledging the contributions of the sociologist Charles Perrow, who has argued that some technologies make disasters almost inevitable, Chiles turns more to the classic study of an aircraft carrier as a "self-designing high-reliability organization" by the political scientists Gene Rochlin and Todd LaPorte and the psychologist K.H. Roberts.

Inviting Disaster takes a fresh approach to familiar tragedy. We think we know all about the Challenger disaster, but Chiles shows how many of the same pressures, especially the cost and deadline demands that prevented technical problems from being solved, helped doom a British airship, the R.101, more than 50 years earlier: pathologies of national prestige technology.

Even without political demands for results, failure awaits organizations that neglect testing. Chiles tells the stunning story of the Newport Torpedo Station in Rhode Island and its miracle weapon, a torpedo whose proximity fuse was triggered for maximum damage by an enemy ship's interaction with the earth's magnetic field. Never tested under battle conditions during the low-budget interwar years, the Mark 14 torpedo failed to explode when finally used in World War II. Its contact detonator, too, failed on impact. As the Hubble Space Telescope later showed, only rigorous testing can check the tendency of advanced technology to promote self-deception. Nor can we count on even highly trained men and women to react correctly under stress; hypervigilance and fixation on assumed solutions can easily defeat common sense. And both machine performance and human judgment can degrade disastrously and suddenly when certain thresholds--in the machines' case, marked by red lines on gauges--are exceeded.

Some organizations have been better able than others to deal with perils like these. Chiles calls them "crack-stopper companies." They encourage employees to admit errors and report their mistakes without fear of reprisals. They assign teams of specialists to comb assembled aircraft for loose parts, debris, and tools that could later prove fatal. When in doubt, they assume the worst, tearing down and reconstructing systems as Admiral Hyman Rickover, one of the author's heroes, did when nuclear submarines were under construction and the compliance of tubing with specifications could not be verified.

We are living on a machine frontier, this book concludes, but we need not be fearful, because resourceful men and women can implement proven techniques and concepts, such as uniform aircraft control systems that reduce the likelihood of operator confusion. Technicians and managers can attain a state "like the satori of Zen Buddhism," following "the Way of the Machine."

Inviting Disaster is absorbing reading, going beyond official reports and bringing readers vividly into some of the scariest places on earth. It presents striking but neglected data: for example, that the Chernobyl meltdown released 200 times as much radioactivity into the atmosphere as the U.S. atomic bombs dropped on Hiroshima and Nagasaki. It also introduces us to concepts of reliability engineering that may not solve our everyday technological woes but at least make them more comprehensible: the refusal of my Honda Civic to start in hot weather (which it never repeats when left with a mechanic) is an example of what NASA engineers call a sneak.


Chiles makes his intended case solidly. Inviting Disaster is essential reading for anyone grappling with technological risk. Still, his evidence at times seems to question his own prescriptions. In the nineteenth century, the explosives manufacturer Lammot du Pont made his employees slosh through water to foil clandestine smoking, yet he himself died in an explosion. During the Cuban missile crisis in 1962, the staff of Malmstrom Air Force Base secretly circumvented safeguards to make it possible to launch their Minuteman ICBMs without presidential authorization. The problem is that they believed they were being flexible and improving the reliability of the system, as did the Chernobyl engineers determined to proceed with tests for upgrading their safety technology. When, if ever, should an automated safety system let operators override it? Never, if they're panicked and in a cognitive lock. Always, if the software or hardware has hidden flaws and the human beings are courageous, resourceful paragons. But how can the machine tell when it's a better or worse risk than the people?

Why can't everybody be like Admiral Rickover, or like Captain Bryce McCormick, who saved American Airlines Flight 96 in 1972 partly because he had requested extra simulator time that gave him vital skills? Why can't all high-risk organizations have the élan and morale of aircraft-carrier crews? Yet Chiles also points out that some of the same can-do enthusiasm may be fatal when other outstanding personalities cut corners in forcing prestigious projects to completion. He does not claim that disasters can be eliminated, only that the principles of high-reliability organizations can reduce their incidence to a level that is acceptable to you and me. But what is an "acceptable" level of risk for thermonuclear war? And is it possible to design effective safety systems for entirely new technologies without having to learn through a cycle of catastrophes like those of the explosives industry and steamship lines?

Inviting Disaster focuses on organizational style, operator behavior, and machine design. It has less to say about danger and the law, beyond recommending full disclosure of incidents. If design and training for high reliability are imperative, do current international standards and laws suffice? Do existing regulations need better enforcement? Or do we need new standards and laws to make all organizations meet the records of the top performers? Who--government or private certification authorities--should police these? Should they act like Admiral Rickover? Safety-conscious firms keep meticulous internal logs of design failures, but how would executives in the fierce global marketplace feel about putting development blunders on the Web?

What about costs? Tragically, a young boy undergoing an MRI scan in a New York hospital was recently killed when he was struck by a steel oxygen tank magnetized and attracted by the device. Is the answer better staff training alone, or use of aluminum oxygen containers? Could not the safer but costly nonferrous tanks indirectly pose other risks to health by raising the price of medical services?

The most serious question of all is how long a culture of ultrareliability can be sustained. Organizations and states decay, even if not always as spectacularly as the former Soviet Union. Market economies, too, are sometimes stressed severely. Standard-setting companies periodically founder. They can forget as well as learn. Will security programs enhance reliability? Antiterrorism initiatives might reawaken interest in innovations like more crash-resistant aircraft design and tighter standards for hazardous facilities. But reliability tends to be about people controlling machines, and security about machines surveilling people. A one-sided concentration on thwarting attacks could drain resources from reducing other and sometimes greater risks. The goal of a high-reliability world seems more distant than ever.

Edward Tenner, Jf '72, a contributing editor of this magazine, is author of Why Things Bite Back: Technology and the Revenge of Unintended Consequences.
