Posner is Correct: From the oil spill to the financial crisis, why we don’t plan for the worst

June 7th, 2010

Take a look at this excellent WaPo piece by the inestimable Judge Richard Posner.

In all these cases, observers recognized the existence of catastrophic risk but deemed it to be small. Many other risks like this are lying in wait, whether a lethal flu epidemic, widespread extinctions, nuclear accidents, abrupt global warming that causes a sudden and catastrophic rise in sea levels, or a collision with an asteroid.

Such certainty about timing is rare; indeed, a key obstacle to taking preventive measures against unlikely disasters is precisely that they are unlikely to occur in the near future.

Of course, if the consequences of the disaster would be very grave, the fact that the risk is low is hardly a good reason to ignore it. But there is a natural tendency to postpone preventive action against dangers that are likely to occur at some uncertain point in the future (“sufficient unto the day is the evil thereof,” as the Bible says), especially if prevention is expensive, and especially because there is so much else to do in the here and now.

Our tendency to procrastinate is aggravated by three additional circumstances: when fixing things after the fact seems like a feasible alternative to preventing disaster in the first place; when the people responsible have a short time horizon; and when the risk is uncertain in the sense that no objective probability can be attached to it.

The BP oil leak reveals a similar pattern, though not an identical one. One difference is that the companies involved must have known that in the event of an accident on a deepwater rig, prompt and effective remedies for an oil leak would be unlikely — meaning that there was no reliable alternative to preventing an accident. But the risk of such an accident could not be quantified, and it was believed to be low because there hadn’t been many serious accidents involving deepwater drilling. (No one knew how low; the claim by BP chief executive Tony Hayward that the chance of such an accident was “one in a million” was simply a shorthand way of saying that the company assumed the risk was very small.)

But other forces were similar in the leak and the financial crisis. If deepwater oil drilling had been forbidden or greatly curtailed, the sacrifice of corporate profits and of consumer welfare (which is dependent on low gasoline prices) would have been great. The regulators who could have insisted on greater preventive efforts were afflicted with the usual short horizons of government officials. Elected representatives did not want to shut down deepwater drilling over an uncertain risk of a disastrous spill, and this reluctance doubtless influenced the response (or lack of it) of the civil servants who do the regulating.

The horizon of the private actors was foreshortened as well. Stockholders often don’t worry about the risks taken by the firms in which they invest, because diversified stock holdings can help insulate them. Managers worry more, but they are not personally liable for the debts of the firms they oversee and, more important, the danger to their own livelihood posed by seemingly small threats is not enough to discourage risk-taking. It seems that no one has much incentive to adopt or even call for safeguards against low-probability, but potentially catastrophic, disasters.

It would be nice to be able to draw up a complete list of disaster possibilities, rank them by their expected cost, decide how much we want to spend on preventing each one and proceed down the list until the total cost of prevention equals the total expected cost averted. But that isn’t feasible. Many of the probabilities are unknown. The consequences are unknown. The costs of prevention and remediation are unknown. And anyway, governments won’t focus on remote possibilities, however ominous in expected-cost terms.
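
The ranking exercise Posner describes is just expected-value bookkeeping: multiply each disaster's probability by its damage, sort, and fund prevention only where it costs less than the expected loss it averts. A minimal sketch, with entirely invented probabilities, damages, and prevention costs (none of them estimates), shows why the mechanics are trivial and the inputs are the hard part:

```python
# Illustrative only: every number below is an invented placeholder, not an estimate.
# Posner's point is that in practice most of these inputs are unknown.
hazards = [
    # (name, annual probability, damage if it occurs, annual prevention cost), in dollars
    ("lethal flu pandemic", 0.01,    5_000e9, 20e9),
    ("deepwater blowout",   0.02,       50e9,  2e9),
    ("asteroid >= 1 km",    1e-5,  100_000e9,  1e9),
]

# Rank by expected annual cost (probability times damage), highest first.
for name, p, damage, prevention in sorted(hazards, key=lambda h: h[1] * h[2], reverse=True):
    expected_loss = p * damage
    verdict = "fund" if prevention < expected_loss else "skip"
    print(f"{name:20s} expected loss ${expected_loss/1e9:6.1f}B/yr  "
          f"prevention ${prevention/1e9:5.1f}B/yr  -> {verdict}")
```

Even this toy version breaks down exactly where Posner says it does: when the probability and damage columns cannot actually be filled in.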

A politician who proposed a campaign of preventing asteroid collisions with Earth, for example, would be ridiculed and probably voted out of office. Yet, planetary scientist John S. Lewis has estimated that there is a 1 percent chance of an asteroid of one or more kilometers in diameter hitting the Earth in a millennium, and that such a hit would probably kill on the order of 1 billion people. That works out to 10,000 deaths per year, far exceeding the annual deaths from airplane crashes.
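
Lewis's figure is straightforward expected-value arithmetic: a 1 percent chance per thousand years of an impact that kills roughly a billion people averages out to about 10,000 deaths a year. A one-line check:

```python
# 1% chance per 1,000 years of an impact killing ~1 billion people (Lewis's estimate).
print(0.01 * 1_000_000_000 / 1_000)  # 10000.0 expected deaths per year
```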

There are many stubborn obstacles to effective disaster prevention, and I do not expect them to be overcome. We must brace for further crises, magnified by increases in world population (meaning more potential victims) and by the relentless march of technology, whether in oil extraction or financial speculation.

Yep.