
Oil Spill Aftermath: Complex systems hard to gauge

In the weeks since the Deepwater Horizon explosion, the political debate has fallen into predictably partisan and often puerile categories. Conservatives say this is Barack Obama's Katrina. Liberals say the spill proves the government should have more control over industry. But the real issue has to do with risk assessment. It has to do with the bloody crossroads where complex technical systems meet human psychology.

Over the past decades, we've come to depend on an ever-expanding array of intricate high-tech systems. These hardware and software systems are the guts of financial markets, energy exploration, space exploration, air travel, defense programs and modern production plants.

These systems, which allow us to live as well as we do, are too complex for any single person to understand. Yet every day, individuals are asked to monitor the health of these networks, weigh the risks of a system failure and take appropriate measures to reduce risks.

If there is one thing we've learned, it is that humans are not great at measuring and responding to risk when placed in situations too complicated to understand.

In the first place, people have trouble imagining how small failings can combine to lead to catastrophic disasters.

Second, people have a tendency to get acclimated to risk. As the physicist Richard Feynman wrote in a report on the Challenger disaster, as years went by, NASA officials got used to living with small failures. If faulty O-rings didn't produce a catastrophe last time, they probably won't this time, they figured. As things seemed to be going well, people unconsciously adjusted their definition of acceptable risk.

Third, people have a tendency to place excessive faith in backup systems and safety devices. More pedestrians die in crosswalks than when jaywalking. That's because they have a false sense of security in crosswalks and are less likely to look both ways.

On the Deepwater Horizon oil rig, a Transocean official apparently tried to close off a safety debate by reminding everybody the blowout preventer would save them if something went wrong. The illusion of the safety system encouraged the crew to behave in more reckless ways.

Fourth, people have a tendency to match complicated technical systems with complicated governing structures.

Fifth, people tend to spread good news and hide bad news. Everybody wants to be part of a project that comes in under budget, and nobody wants to be responsible for the reverse. For decades, millions of barrels of oil seeped from a drilling operation near the Guadalupe Dunes in California. A culture of silence settled upon all concerned, from front-line workers who didn't want to lose their jobs to executives who didn't want to hurt profits.

Finally, people in the same field begin to think alike, whether they are in oversight roles or not. The oil industry's capture of the Minerals Management Service is actually a misleading example, because that agency was so blatantly corrupt. Cognitive capture, in which regulators simply come to think like the industry they oversee, is more common and harder to detect.

In the weeks and hours leading up to the Deepwater Horizon disaster, engineers were compelled to make a series of decisions without any clear sense of the risks and in an environment that seems to have encouraged overconfidence.

Over the past years, we have seen smart people at Fannie Mae, Lehman Brothers, NASA and the CIA make similarly catastrophic risk assessments. So it seems important, in the months ahead, to focus not only on mechanical ways to make drilling safer, but also, more broadly, on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture -- to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest. This isn't just about oil.

THE NEW YORK TIMES
