I think I should probably agree with this piece in the NYT:
For all the criticism BP executives may deserve, they are far from the only people to struggle with such low-probability, high-cost events. Nearly everyone does. “These are precisely the kinds of events that are hard for us as humans to get our hands around and react to rationally,” Robert N. Stavins, an environmental economist at Harvard, says. We make two basic — and opposite — types of mistakes. When an event is difficult to imagine, we tend to underestimate its likelihood. This is the proverbial black swan. Most of the people running Deepwater Horizon probably never had a rig explode on them. So they assumed it would not happen, at least not to them.
On the other hand, when an unlikely event is all too easy to imagine, we often go in the opposite direction and overestimate the odds. After the 9/11 attacks, Americans canceled plane trips and took to the road. There were no terrorist attacks in this country in 2002, yet the additional driving apparently led to an increase in traffic fatalities.
If the crux of his piece is that it's difficult to assess the potential for an unprecedented catastrophe, he's right.
The yucky feeling I get comes from this part:
When the stakes are high enough, it falls to government to help its citizens avoid these entirely human errors. The market, left to its own devices, often cannot do so. Yet in the case of Deepwater Horizon, government policy actually went the other way. It encouraged BP to underestimate the odds of a catastrophe.
If you read on, he supports the point well enough. But I can't get past the idea that the government can clearly assess the risks of an industry where the industry itself cannot. It seems fundamentally illogical that, when we're talking about a bajillion-dollar operation like a deep-sea oil rig, the government would have a better handle on the safety of the asset than the actual company would.