Fire and Chaos at Sea
December 27, 2010 6:14 AM

The final hours of the Deepwater Horizon.
But this was a disaster with two distinct parts — first a blowout, then the destruction of the Horizon. The second part, which killed 11 people and injured dozens, has escaped intense scrutiny, as if it were an inevitable casualty of the blowout.
It was not.
David Barstow, David Rohde and Stephanie Saul report for the New York Times on the Deepwater Horizon disaster.

Related materials, all highly worth checking out: audio and photos documenting testimony from survivors, a video documenting the crew's response, and a powerful slideshow of images of the fire and collapse.

This is a major work of investigative journalism.
posted by spitbull (71 comments total) 31 users marked this as a favorite
 
Thanks spitbull, I'd missed this. After working offshore, this is all too real. The parallels between this and Piper Alpha are clear. I'll write more when I've had a chance to digest it, but for now I'm too choked up to comment further.

.
posted by arcticseal at 6:46 AM on December 27, 2010


Still reading it, but this seems to sum up the situation:
Their training sessions contemplated a blowout coming up through only the drilling pipe. This one, it seemed, was erupting from the whole well opening. “I had no idea it could do what it did,” Mr. Holloway said.
Human understanding of the forces of nature may be decent, but it still has a lot of holes in it.
posted by nomadicink at 6:46 AM on December 27, 2010


More, indicating the weakness of human response:
Ms. Fleytas, 23, had graduated from maritime school in 2008 and had only been on the Horizon for 18 months. This was her first well-control emergency. But she had been trained, she said, to immediately sound the general master alarm if two or more sensors detected gas. She knew it had to be activated manually. She also knew how important it was to get crew members out of spaces filled with gas.

Yet with as many as 20 sensors glowing magenta on her console, Ms. Fleytas hesitated. She did not sound the general master alarm. Instead she began pressing buttons that told the system that the bridge crew was aware of the alarms.

“It was a lot to take in,” she testified. “There was a lot going on.”

Her boss, Yancy Keplinger, was also on the bridge. The alarms, in addition to flashing magenta, were making a warning sound. Mr. Keplinger said he kept trying to silence the alarms so he could think about what to do next. “I don’t think anybody was trained for the massive detectors that were going off that night,” he said.

Ms. Fleytas and Mr. Keplinger had another powerful tool at their fingertips — the emergency shutdown system. They could have used it to shut down the ventilation fans and inhibit the flow of gas. They could have used it to turn off electrical equipment and limit ignition sources. They could have even used it to shut down the engines.

They did none of these things.

Ms. Fleytas’s lawyer, Tim Johnson, said that with so much gas, explosions were all but inevitable. “I don’t think anything she could have done would have changed the situation out there,” he said. Mr. Keplinger’s lawyer, Steve Gordon, said the bridge crew faced “an insurmountable situation.”

As with the general master alarm, the effectiveness of the Horizon’s emergency shutdown system relied on human judgment. Transocean had been warned that the human element — the need for crew members to act quickly and correctly under stress — made the shutdown system vulnerable. In 2002, a safety consultant specifically urged Transocean to consider changing the system “so that human input is not critical for success.” Transocean says that having an automatic system is less safe.

Ms. Fleytas said it never occurred to her to use the emergency shutdown system. In any event, she explained, she had not been taught how to use it. “I don’t know of any procedures,” she said.
posted by nomadicink at 6:54 AM on December 27, 2010 [3 favorites]


I'd like to believe that this would put the nail in the coffin of the "Accidents happen guys, there's nothing that could've been done to change this" inanity.
posted by Pope Guilty at 7:12 AM on December 27, 2010


And yes, this David Rohde is David "I Escaped From the Taliban" Rohde. He and his wife just released a book about both sides of their ordeal.
posted by grabbingsand at 7:13 AM on December 27, 2010 [3 favorites]


The most depressing thing about this is that if they automated a bit more, it seems so much could have been prevented. They had all the sensors and computer systems in place, and for the most critical functions, they relied on human judgement.

Of course, in a parallel world where the multiple high gas alarms set off the general alarm and activated the EDS, there'd be some poor developer in a Houston office explaining that this was the specification, even if the crew said it wasn't "that bad," and that it was highly unlikely all the sensors were reporting false alarms. After a thorough review and a lot of bonuses forgone, the recommendation would be to set the EDS to manual activation ...
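
Just to make the design question concrete, here is a minimal sketch of the kind of automatic trigger being described. The sensor names, the two-sensor rule, and the action list are my own illustration of what the article reports, not Transocean's actual control logic:

    from dataclasses import dataclass

    # Hypothetical sketch of the auto-trigger design the article describes.
    # Every name and action here is illustrative, not the rig's real code.

    @dataclass
    class GasSensor:
        zone: str
        gas_detected: bool

    def evaluate(sensors: list[GasSensor], auto_mode: bool) -> list[str]:
        """Return the actions the system would take for a given sensor state."""
        tripped = [s.zone for s in sensors if s.gas_detected]
        if len(tripped) < 2:  # the trained rule: two or more sensors detecting gas
            return []
        if auto_mode:  # the system "as originally designed"
            return ["SOUND_GENERAL_MASTER_ALARM",
                    "SHUT_DOWN_VENTILATION",       # stop drawing gas into enclosed spaces
                    "ISOLATE_IGNITION_SOURCES"]    # cut non-essential electrical equipment
        # Manual mode, how the Horizon was actually configured: flag the
        # alarms on the console and wait for a human to act.
        return ["LOG_AND_WAIT_FOR_OPERATOR"]

    # With 20 sensors tripped and auto_mode=False, the system does nothing
    # but wait -- exactly the hesitation the article describes.

What the sketch shows is how small the difference is: a single configuration choice separates "the system protects the rig" from "the system waits for whoever happens to be on the bridge."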
posted by geoff. at 7:18 AM on December 27, 2010 [16 favorites]


After reading the whole thing, it sounds like the perfect storm of greed and poor training. They took forever to activate the systems and the blowout preventer, often worrying about the cost and the investigation that would follow. The blowout preventer itself was behind on its maintenance schedule, and the owners had been told by the manufacturer of the preventer that it was behind schedule and potentially dangerous.

When the crew finally got the OK to activate the preventer, it didn't work, and no one knows, or probably ever will know, exactly why. All told there seems to have been a window of about 10 minutes where various systems could have been activated but weren't, due to an overwhelmed crew that hadn't trained for a situation that had, at the very least, been thought of. Mind you, the rig itself was considered an example of good safety measures, and the crew was highly experienced and considered among the best in the world. But they were behind schedule on this well, the company was breathing down their neck, and the crew seems to have misread and then ignored a crucial test.
posted by nomadicink at 7:23 AM on December 27, 2010 [2 favorites]


geoff.: The most depressing thing about this is that if they automated a bit more, it seems so much could have been prevented. They had all the sensors and computer systems in place, and for the most critical functions, they relied on human judgement.
I see what you mean, but I wonder if that really would have helped. Humans are constantly overriding automatic safety systems intended for their protection.

The Chernobyl disaster happened, more or less, because humans continuously ignored the correct predictions of imminent disaster the instruments made and even undermined some of the automatic systems that attempted to avert it. Planes and ships supposedly lost in the "Bermuda Triangle" often turned out to be piloted by humans who favored their intuition over their instruments and made remarks like "Both of my [dual, redundant] compasses are out."

Would it be so surprising if, in response to an automatic rig shutdown, a supervisor said "What the hell? Override that thing and let's get back to work!"
posted by Western Infidels at 8:00 AM on December 27, 2010 [4 favorites]


Wow, that was a gripping Monday morning read. Thanks.

They had all the sensors and computer systems in place, and for the most critical functions, they relied on human judgement.

And for really dubious reasons:

As originally designed, this system would also automatically trigger the general master alarm — the shrill warning that signaled evacuation of the rig — if it detected high levels of gas. Transocean, though, had set the system so that the general master alarm had to be activated manually. The change had the Coast Guard’s blessing, but Mike Williams, an electronics technician who maintained the system, testified that he had raised concerns about the setup’s safety.

“They did not want people woke up at 3 o’clock in the morning due to false alarms,” he said.

posted by mediareport at 8:03 AM on December 27, 2010 [1 favorite]


The Deepwater Horizon disaster and the mistakes and slow response by the crew reminded me of an incident that happened when Apollo 12 launched. Less than a minute after liftoff, it was struck by lightning twice, throwing its systems into disarray. Luckily, a controller on the ground understood what was going on, thanks to having seen a vaguely similar problem before, and radioed the astronauts how to handle it via an obscure switch on the craft. One of the astronauts remembered the switch from a previous training scenario and flipped it, preventing the mission from aborting.

It's nothing like the Deepwater Horizon situation, but the intense training the astronauts and ground crew did made a huge impact when something went wrong. Compare their response to that of one of the Deepwater crew, who was trying to turn off all the blaring alarms just so he could think about what to do next. We're drilling for energy in dangerous situations, using equipment and techniques which could kill people and screw up the environment, yet the training for potential disasters doesn't seem to be as good as it needs to be. Hopefully new procedures and techniques will be implemented.
posted by nomadicink at 8:09 AM on December 27, 2010 [5 favorites]


Also, what was up with only two lifeboats on the rig, both located in the same area? Having two large boats designed to evacuate everyone, as opposed to several smaller ones all around the rig, seems foolishly optimistic about how abandoning ship would actually go. It's like the designers planned for a neat and orderly departure.
posted by nomadicink at 8:13 AM on December 27, 2010


The thing that bugged the shit out of me:

"In a few hours, the drilling crew’s 21-day hitch would be done."

At some point in the development of the offshore industry (my recollection is 1989?), the standard crew tour was extended from the traditional 14 days to 21 days. They did this to save money. After 14 days of 12 hours on-12 hours off, my own brain turned into mush. The longest day of my life was the last day of the one 21-day offshore duty tour I participated in.

I wonder if the people who drove this schedule ever did 21 days offshore.

Meta-comment: this is a great story in the Times, but there is a little man in my head who wonders why it is published on the day the fewest people read the newspapers. This is not the first time I have noticed a story which embarrasses powerful people is published on a day that almost nobody is going to see it.
posted by bukvich at 8:14 AM on December 27, 2010 [31 favorites]


Mr. Keplinger said he kept trying to silence the alarms so he could think about what to do next.

One thing that really stood out for me was the bit about the alarms being so constant and loud that Keplinger and Fleytas couldn't think about what to do. I've always wondered about this tradeoff with alarm systems, at every level, between making sure everyone knows there is trouble and where it is, and overloading the stress receptors of the people who need to act. I can imagine it does not help control one's panic when there are screaming alarms going off in all directions. Alternatively, it numbs one to the alarms and makes it possible one might miss a crucial signal (the car alarm on the corner phenomenon).

No matter how well trained one is, the ability to act calmly and rationally in a true emergency is something only learned or uncovered through experience. Training is essential, and knowing how to override panic can be learned to some extent, but the ability to go coldly rational when everyone is screaming and alarms are going off and you think you'll probably be dead in a bit is another thing entirely. I've been in true life-and-death situations before and been shocked at how much edge immediately came off my usual ability to process information and think in straight lines.
posted by spitbull at 8:16 AM on December 27, 2010 [3 favorites]


One of my friends is in the coast guard. He regularly, regularly trains in pretty intense conditions- they have a room that fills with water and you're supposed to fix an engine while this is happening. Another friend is a firefighter- they have a building which fills with heat and smoke and you have to practice crawling on the ground - it's so hot your helmet will melt if you stand up.

You can dry train the situation a hundred times over, but emergencies require training in like conditions to really be effective.

It also helps that, unlike an oil corporation, when you have the single purpose of saving lives, regardless of financial cost, you don't have an incentive to NOT deploy systems the moment they're called for.

(Granted, that's at a smaller scale. Obviously, we see this thing fall apart in large disasters like Katrina when you have higher authorities deciding, "Well, maybe it isn't such a big deal after all..." while the city floods...)
posted by yeloson at 8:16 AM on December 27, 2010 [1 favorite]



After reading, two things bothered me:

1) It is subtle, but the wording in one or two sections suggests one of the three people writing the article thought that, had they been in the same position as certain parties, they would have saved the day. Not to say any of them would ever say or consciously believe that, but one of them has a subconscious overconfidence that manifested in the rhetoric of specific sentences.

2) There's a Hollywood stooge somewhere, sitting next to his heated pool behind his 9800 sq. ft. home, videoconferencing with two others doing the same, hashing out the marketing plan and the script duties for the blockbuster movie about this. At least one said "Heck, we got enough for the story board, the script will write itself. Michael, Bay-bee? How big of an explosion are we talking?"
posted by Bathtub Bobsled at 8:25 AM on December 27, 2010


"(Granted, that's at a smaller scale. Obviously, we see this thing fall apart in large disasters like Katrina when you have higher authorities deciding, "Well, maybe it isn't such a big deal after all..." while the city floods...)"

When you've got the governor refusing assistance while she 'thinks about it' for 24 hours and a mayor that refused to evacuate despite there being plans in place for it and equipment to hand (recalling the lot full of drowned school busses) that doesn't help matters either - nor does a levee board that skimped so adequate levels of kickbacks and payoffs could be maintained. (One must have priorities, after all.)

High-frequency, high-decibel alarms WILL cause you to freeze in semi-panic - an "OMG sonofabitch whatthehell do I do now" feeling. As spitbull describes it, it's almost as if you designed a scenario that would cause you to NOT be able to think. She was trained to react to a single warning light - but a reflexive reaction to a red light can be overridden by adding a dozen more lights and a half-dozen shrieking alarms. Probably the best bet would be to limit (just as a thought) the alarm volume to 80 dB - loud enough to really get your attention, not so loud as to be potentially fatally distracting.
posted by JB71 at 8:36 AM on December 27, 2010 [2 favorites]


Can we say hubris? This is yet one more story of humans swaggering on about invulnerability - about their power over nature. The 'state of the art' technology, a "Floating Hilton", fail-safe systems with checks and balances ... did they use the same PR firm as the Titanic?
posted by Surfurrus at 8:36 AM on December 27, 2010


The human error aspect reminds me of the Colgan Air crash in Buffalo.
posted by exogenous at 8:43 AM on December 27, 2010


Bathtub Bobsled: “It is subtle, but the wording in one or two sections suggests one of the three people writing the article thought that, had they been in the same position as certain parties, they would have saved the day. Not to say any of them would ever say or consciously believe that, but one of them has a subconscious overconfidence that manifested in the rhetoric of specific sentences.”

I don't know exactly which sections you're talking about, but isn't that kind of the point? If the facts cited in the article are correct, then some people made mistakes. It would seem to me to be impossible to write an article about people doing the wrong thing and yet conclude that you'd want to do exactly the same thing in their situation. I mean, it's not subconscious; the question is: were these inevitable accidents, did everyone involved make the best choices in the runup to them, or were they caused by bad decisions? The article seeks to show that there were bad decisions. So of course it's taking the stance that the writers of the article would have acted differently in their place. I don't see what's wrong with that. It's certainly not an arrogant or overconfident thing to say. Or would you prefer they preface the article with a statement saying they don't know what it's really like, and maybe it's just too hard to be one of the operators in the story and make the right decisions?
posted by koeselitz at 8:47 AM on December 27, 2010 [3 favorites]


What's always amazing to me, and I suppose this is how disasters of this magnitude occur, is how many people made exactly the wrong decision at the perfect time. In light of that fact, how anyone would ever choose a human trigger over automation really just boggles my mind.

Or, alternately, I guess we could survey the crew of the rig and find out whether they would prefer whatever they've been through over being awakened by false alarms a few times at 3am.

And also, once again you have people whose lives are in danger making choices which ultimately kill people in order to save money. It never fails to blow my mind.
posted by nevercalm at 9:01 AM on December 27, 2010 [3 favorites]


Meta-comment: this is a great story in the Times, but there is a little man in my head who wonders why it is published on the day the fewest people read the newspapers. This is not the first time I have noticed a story which embarrasses powerful people is published on a day that almost nobody is going to see it.

Sorry to derail, but do you have a cite for this? The NY Times and other papers (WaPo, e.g.) usually publish their Big Stories on Sundays because, I thought, Sunday circulations are generally higher - you get subscribers who are Sunday-only as well as the daily folks. And that's when the advertising circulars come out.

In any case, I saw this story online on Saturday, but resisted reading it until my dead-tree version arrived Sunday morning. Maybe it's just me.

I was also struck by the 21-day schedule - I don't think I could survive 12 hours on/12 hours off for three weeks. It would be all I could do to tie my shoes, let alone do a job that requires serious attention to detail.
posted by rtha at 9:07 AM on December 27, 2010


Wow. I just realized I hadn't been breathing while reading this story.

People have made good points about automation, training, and reaction to alarms. I have to add that it really seems like one of the problems was a vague division of responsibilities and chain of command, combined with poor communication. With the alarms ringing in three separate places, supposedly any one of the groups would be able to initiate emergency procedures. Instead it led to confusion and delays as people tried to communicate with other areas.
posted by happyroach at 9:27 AM on December 27, 2010


> he kept trying to silence the alarms so he could think about what to do next.

I was trained decades ago on a telephone switchboard -- you've seen them in black-and-white movies, a panel of quarter-inch phono jacks and sockets with a light and buzzer over each, and the operator has to plug in the cable to the lit socket, talk to the person, put the circuit on hold, move the cable to the desired extension, plug it in, and ring it. And you have to remember who's on which line, and be able to transfer calls manually when buzzed to do it.

Those had a big foot switch that silenced the buzzers, for exactly this reason, because once more than a few circuits were engaged the din overwhelmed the operator.

Surely this wasn't neglected in designing the new alarm system? A silence pedal or equivalent has been a well-understood feature for anything of this sort, to reduce the distraction.
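
For what it's worth, here is roughly what that foot switch does in software terms; in modern alarm-management jargon it's called silencing or shelving. All the names below are mine, purely illustrative:

    # Rough sketch of the "silence pedal" idea: silencing quiets alarms that
    # have already sounded, but any NEW alarm still makes noise.
    # Class and method names are invented for illustration.

    class AlarmPanel:
        def __init__(self):
            self.active = {}  # alarm_id -> acknowledged?

        def raise_alarm(self, alarm_id: str):
            if alarm_id not in self.active:
                self.active[alarm_id] = False
                self.sound_buzzer(alarm_id)  # a new alarm always annunciates

        def silence_all(self):
            # The foot switch: quiets the din without clearing anything.
            # The lights stay lit; only the audible part is acknowledged.
            for alarm_id in self.active:
                self.active[alarm_id] = True

        def sound_buzzer(self, alarm_id: str):
            print(f"BUZZ: {alarm_id}")

    panel = AlarmPanel()
    panel.raise_alarm("gas-sensor-03")
    panel.silence_all()                  # the operator can think again...
    panel.raise_alarm("gas-sensor-07")   # ...but a fresh alarm still buzzes

The design point is that acknowledging should never be the same operation as disabling: you want the din gone, not the detection.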
posted by hank at 9:29 AM on December 27, 2010 [2 favorites]


There is always the impulse to try and save the situation instead of just following procedure.
posted by Ad hominem at 9:29 AM on December 27, 2010 [7 favorites]


but there is a little man in my head who wonders...

I use tin foil to keep the little men out of my head.
posted by ericost at 9:31 AM on December 27, 2010


Koes -

I do see your point, but let's look at this paragraph:

But other than the two brief calls, each only seconds long, there were no communications or coordination among the bridge, the drill shack and the engine control room. The men in the engine control room did nothing.


Well NO SHIT. Seconds is all they had. Would it have been better if someone threw together a PowerPoint? Now, this is preceded by a reasonable evaluation of why certain people didn't use the EDS. Had this been a typical kick (albeit stronger), then they would have taken a step that would have cost the rig hundreds of thousands, if not millions, of dollars to reconcile after the fact.

In my opinion, this whole scenario turns into a particularly ass-backwards Pascal's wager. If you press the button and save the day, there's little to distinguish whether pressing the button was a good idea or a bad idea, since in both situations, the end result is the same... the rig's ok, but now they have to repair and redrill, and you are going to be under scrutiny and possibly lose your job. If the rig blows up, you'll be under scrutiny for not pressing the button, but you've already lost your job since... well, your job is now on fire and sinking into the Gulf of Mexico.

That deer in headlights hesitation is when a conscious entity recognizes that the wisdom of an action depends entirely upon what doesn't happen. The brain goes into a self-preserving loop, since no action seems the only logical action.

So the "men in the engine control room did nothing," is a pretty shitty thing to say. What communication could you possibly provide with the overwhelming sounds and alarms at that point?? The pure onslaught of information not only creates a sensory overload, but it also inhibit's the ability to categorize it and see what patterns emerge and what conclusions can be drawn. Along with attempting to think straight when you only have 2 minutes to do so how could you possibly know who exactly to call when the only info you can provide is:

"Shit is really fucked up down here, and we're in a position requiring you to make a decision in 30 seconds, based on information I need 1 minute to figure out, and another 45 seconds to explain to you."
posted by Bathtub Bobsled at 9:31 AM on December 27, 2010 [10 favorites]


2) There's a Hollywood stooge somewhere, sitting next to his heated pool behind his 9800 sq. ft. home, videoconferencing with two others doing the same, hashing out the marketing plan and the script duties for the blockbuster movie about this. At least one said "Heck, we got enough for the story board, the script will write itself. Michael, Bay-bee? How big of an explosion are we talking?"

Except they will, of course, bring in the Terrorists, as "a series of failures caused by stress, greed, and lack of training" is too difficult a concept to convey in a Hollywood blockbuster.
posted by daniel_charms at 9:47 AM on December 27, 2010


Can we say hubris? This is yet one more story of humans swaggering on about invulnerability - about their power over nature. The 'state of the art' technology, a "Floating Hilton", fail-safe systems with checks and balances ... did they use the same PR firm as the Titanic?

No, the PR firms used were the ones that talk about how safe, clean and great things will be once we get off of the stored sunlight and move onto splitting the Atom, in peace, for power.

The PR firm used for the Titanic was the one talking about the canal building on Mars.
posted by rough ashlar at 9:48 AM on December 27, 2010 [1 favorite]


When you've got the governor refusing assistance while she 'thinks about it' for 24 hours and a mayor that refused to evacuate despite there being plans in place for it and equipment to hand

Well, that's what I mean by larger scale, even. The teams trained to deal with emergencies have a pretty good plan of how fast decisions need to be made, and under what conditions, and most importantly - what REALLY matters during an emergency. Everyone else is wondering if it's "really an emergency" and has other priorities in mind:

The Coast Guard out in San Francisco dealt with an oil spill according to the procedure, exactly by the book, a few years back. But the oil company was underreporting the spill (surprise), and the local government didn't want to catch backlash for letting it happen (surprise), so they decided to blame the Coast Guard, and people lost their jobs.

...the people who were qualified and did exactly what they were supposed to do, despite bad info and people hampering them along the way.

It's a negative selection process for capability in a lot of ways.
posted by yeloson at 10:07 AM on December 27, 2010 [1 favorite]


Anecdata:

My friend was the night manager at a hotel that caught fire in Canada - early nineties IIRC. Procedure was to silence the fire alarm and check for a false alarm. 11 people died I think.
posted by sfts2 at 10:12 AM on December 27, 2010 [1 favorite]


spitbull: One thing that really stood out for me was the bit about the alarms being so constant and loud that Keplinger and Fleytas couldn't think about what to do. I've always wondered about this tradeoff with alarm systems, at every level, between making sure everyone knows there is trouble and where it is, and overloading the stress receptors of the people who need to act.

The thing that really bugs me about this story is that lessons like this have already been learned. A LOT of research into human performance and safety has been conducted by the nuclear industry. In particular, much effort has been put into the science of alarm prioritization (a sketch of the idea follows below). But even the simple lessons have been ignored, like:
  • not having operator action mandatory for safety systems
  • requiring that critical safety systems (like the blowout preventer) be tested and inspected regularly by an empowered regulatory agency
If the US held offshore drill operators to the same standards as its nuclear plant operators we would have two results: 1) an accident like this would never happen again, and 2) offshore oil rigs would become as expensive as nuclear plants, which is fine by me. I'd rather pay for the risk at the pump than have the government (and the environment) pay after an accident.
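
To sketch what alarm prioritization means in practice: rank everything, show the operator only the few alarms that demand action right now, and shelve the rest to a summary page. The priorities and alarm texts below are invented:

    from heapq import nsmallest

    # Toy illustration of alarm prioritization: the console surfaces only
    # the highest-priority alarms instead of 20 screaming at once.
    # Priorities and messages are invented.

    alarms = [
        # (priority, message) where a lower number means more urgent
        (1, "GAS: high concentration, drill floor"),
        (1, "GAS: high concentration, engine room intake"),
        (3, "VENT: fan 2 running in manual"),
        (4, "COMMS: telemetry link degraded"),
        (2, "POWER: generator load spike"),
    ]

    DISPLAY_LIMIT = 3  # the operator sees at most this many at once

    shown = nsmallest(DISPLAY_LIMIT, alarms)
    for priority, message in shown:
        print(f"[P{priority}] {message}")
    print(f"...plus {len(alarms) - len(shown)} lower-priority alarms shelved to a summary page")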
posted by Popular Ethics at 10:21 AM on December 27, 2010 [6 favorites]


Wikileaks: BP had a similar blow-out in Azerbaijan 18 months prior to the Gulf of Mexico incident. The crew escaped, and BP suppressed information about the explosion.

Other WikiLeaks cables state that BP stole $10bn worth of oil from Azerbaijan.

Don't worry about it, though - we all know that nothing of any real importance has come out of the leaked cables.
posted by kaibutsu at 10:25 AM on December 27, 2010 [14 favorites]


As far as circulation goes, here's an article from the NY Times regarding daily circulation. To sum up, the Times distributes about 1.4 million papers on Sunday, where the daily averages 950,000.

So it looks like Sunday is actually the BEST day to publish something like this. Interesting.
posted by ensign_ricky at 10:30 AM on December 27, 2010


If the US held offshore drill operators to the same standards as its nuclear plant operators...

More wikileaks: Giant British energy supplier RWE built a nuclear facility in Bulgaria with massive ongoing safety issues.
posted by kaibutsu at 10:31 AM on December 27, 2010 [3 favorites]


Bathtub Bobsled and a few others made very good points about the nature of the human errors. As they say, "hindsight is 20/20".

This is definitely, clearly, and directly the result of people being hesitant to act because of the consequences of being wrong - or hell, the consequences of being right. Shut down the operations, save the day, and still get fired because you cost the company several million dollars.

Automatic systems remove the human error, but only to an extent. The problem with automatic systems is that they can fail, too. False positives, and the system will be disabled by people who cannot afford to let them fail like that. False negatives, and the rig goes down in a flaming heap of steel. The end result in either case still comes out to be "human error", for failing to properly maintain the automatic systems.

The fundamental fix is to have people in place who have the authority to make decisions, without the fear of repercussions. You'll always have human error though.

Here's a side story. A friend of mine had a contract doing vegetation clearing with a regional oil operator (who is themselves a corporation put together by the big oil companies). The operating company was almost paranoid about safety. They demanded, rightly so, to be notified of any incident, no matter how small - even down to crew members getting into poison ivy. This oil field is very old, and is rife with unmarked, unknown, unrecorded, abandoned oil, water and gas lines. One day they hit an old rusty line that was buried about six inches underground. The line was abandoned, fortunately, but about five gallons of residual oil and water drained out of the line and onto the ground. They reported the incident, and their contract was immediately terminated.

That's how the operating company kept field incident reports to a minimum, and kept their safety record as clean as possible.
posted by Xoebe at 10:49 AM on December 27, 2010 [11 favorites]


So it looks like Sunday is actually the BEST day to publish something like this. Interesting.

Someone mentioned upthread that there are Sunday-only subscribers. But there are tons of people who buy the Times on Sunday even if they just read the magazine, or do the crossword. It is sort of a New York tradition to get up early, buy the Sunday Times and bagels, and leisurely read it.

It is officially a thing White People Like
posted by Ad hominem at 10:50 AM on December 27, 2010


All this does is gives us something to compare the next accident to. The only thing we learned from the Exxon Valdez is that oil company oligarchies will use the new regulations to find better ways to fuck people over.
posted by I love you more when I eat paint chips at 10:53 AM on December 27, 2010


There are several groups of people that look bad in this, but Andrea Fleytas comes off looking like the next best thing to a Joseph Hazelwood. Or maybe a Lieutenant Gorman. From shouting that "WE'RE ALL GONNA DIE!" to failing to sound the general alarm or engage other safety measures, she comes off as ineffectual, untrained, dithering, and (maybe) culpable. And yet, she was a 23-year old recent graduate of maritime school. I can't help but think that the reporters here got thrown information about her as a way to generate a fall guy.
posted by norm at 11:54 AM on December 27, 2010 [1 favorite]


From shouting that "WE'RE ALL GONNA DIE!" to failing to sound the general alarm or engage other safety measures, she comes off as ineffectual, untrained, dithering, and (maybe) culpable.

It's odd that you wrote that, considering that her failure to sound the general alarm was noted right before these paragraphs:
Her boss, Yancy Keplinger, was also on the bridge. The alarms, in addition to flashing magenta, were making a warning sound. Mr. Keplinger said he kept trying to silence the alarms so he could think about what to do next. “I don’t think anybody was trained for the massive detectors that were going off that night,” he said.

Ms. Fleytas and Mr. Keplinger had another powerful tool at their fingertips — the emergency shutdown system. They could have used it to shut down the ventilation fans and inhibit the flow of gas. They could have used it to turn off electrical equipment and limit ignition sources. They could have even used it to shut down the engines.

They did none of these things.
Several paragraphs later, it's noted that she didn't know how to shut down the systems as she'd never been trained on it. Yes, it's clear that Fleytas could have reacted better, but it's extremely clear that those higher up the chain of command, with more experience, also failed to react well. Your singling out of her seems odd.
posted by nomadicink at 12:21 PM on December 27, 2010 [2 favorites]


norm: "she was a 23-year old recent graduate of maritime school"

This is what slays me. We don't put the 23-year-old programmers in charge of ANYTHING, and the only equipment we have around is a bunch of desktop computers. This was one of the largest, most complicated machines produced in the history of mankind, and we had one fresh-faced kid at the con?
posted by Rat Spatula at 12:35 PM on December 27, 2010


I thought bukvich meant that it was published on the day after Christmas, not that it was a Sunday.
posted by jindc at 12:37 PM on December 27, 2010


It's also interesting that anything dealing with the captain includes statements by the captain's attorney.

...
posted by yeloson at 12:37 PM on December 27, 2010


Right or wrong, Ms. Fleytas is not singled out. The article was, however, extremely critical of the Captain. You didn't even have to read between the lines to get a very unflattering picture of him painted for you. The article's tone was very considerate and understanding when it came to discussing Ms. Fleytas' failure to do the right thing. As nomadicink noted, pains were taken to point out that her boss was RIGHT THERE with her.

That is the article's way of immediately providing her with a huge benefit of the doubt. It's pretty hard to blame an underling when the overling is standing above the same huge switchboard making the same mistakes.
posted by jsturgill at 1:01 PM on December 27, 2010 [1 favorite]


It's also interesting that anything dealing with the captain includes statements by the captain's attorney.

Not really if you read the first paragraph on page 10:
After the explosions, the chief engineer, Steve Bertone, raced from his room to the bridge. He did a quick survey of the rig’s condition. Its engines were dead. There was no power. The phones didn’t work. When he tried the handheld radios, they didn’t either. Meanwhile, he later wrote in a statement to the Coast Guard, Captain Kuchta was screaming at Ms. Fleytas for pushing the Horizon’s distress signal.
Googling Fleytas's name reveals that she tried to send out a distress call, got yelled at by the Captain, and ended up saying "I'm sorry". Get that: she's sending out a general distress call because the rig is exploding, and she winds up apologizing for doing so.

Throw in the fact that Capt. Kuchta was on the bridge talking to suits and showing off a video-game-type simulator when the explosion happened, plus the conflicting reports of his resistance to activating the blowout preventer, and it's surprising his lawyer lets him leave the house.
posted by nomadicink at 1:06 PM on December 27, 2010 [4 favorites]


Yes, it's clear that Fleytas could have reacted better, but it's extremely clear that those higher up the chain of command, with more experience, also failed to react well. Your singling out of her seems odd.

She was mentioned far more times than her boss, with more flavor to her text, and the article contained a reference to her attorney's statement. There's more than enough subtext to pick up on here. I don't think it's just me singling her out.

On the other hand, you could have also read my last sentence: I am highly skeptical that this 23-year old could have been quite as pivotal as the story reads. Remember, every human error on the scene means it's not a failure of procedure or policy, so it's totally in BP's interest to pin this on a low-ranking employee of the contractor they hired.
posted by norm at 1:07 PM on December 27, 2010


The captain stuck out more to me, as being waaaay in over his head as the captain of an oil rig.
posted by nomadicink at 1:14 PM on December 27, 2010


norm, there is no way to "pin" this on any single individual. There was a cascade of human errors with different (individual and committee) agents: the casing cement job was faulty, the leak-off test was erroneous, the interpretation of the condition of the hole after the cement job and the leak-off test was fouled up, the blowout preventer failed, &c

All these events are necessary before you get oil and gas and fire all over the rig, which was the central subject of the article. There may be sixty or seventy (or more) individual contributing human errors in this disaster.
posted by bukvich at 1:30 PM on December 27, 2010


I'm definitely not saying that the Captain doesn't come off as pretty terrible too, but his worst actions seemed to have been post-brink; the suggestion I'm reading is that the bridge controllers had three or four ways to mitigate the disaster and just blundered right past.
posted by norm at 1:32 PM on December 27, 2010


And sure, I get that no one is getting pinned with sole responsibility here, but there is a natural tendency, reinforced by corporate thinking, to find people to blame for these things. So Hazelwood got the blame in Alaska, which was a hell of a diversion from the fact that Alyeska/BP had what amounted to a fake response plan filed which compounded the disaster tenfold or so.

And let me say this again: I don't buy any effort to pin this on Fleytas, or Captain Kuchta, or the drillers alone. What makes sense to me is that there was a system of perverse incentives in place to get that well drilled on schedule, ignoring warning signs that, in retrospect, sound like klaxons.
posted by norm at 1:42 PM on December 27, 2010 [4 favorites]


..."a series of failures caused by stress, greed, and lack of training" is too difficult a concept to convey in a Hollywood blockbuster.

The China Syndrome
posted by Kirth Gerson at 1:45 PM on December 27, 2010


When you've got the governor refusing assistance while she 'thinks about it' for 24 hours

A reference to refusing to cede state authority to the Feds? That was a stupid bit to throw into the discussion of a totally different subject here, unless you know nothing re: how unusual and suspect/dubious a request that was.
posted by raysmj at 1:49 PM on December 27, 2010


What makes sense to me is that there was a system of perverse incentives in place to get that well drilled on schedule, ignoring warning signs that, in retrospect, sound like klaxons.

Well, when you base schedules on quarterly profits and not on the laws of physics, eventually you find out that one trumps the other, and demand does not always instantly produce solutions.
posted by yeloson at 1:49 PM on December 27, 2010


Otherwise, might help to check your facts.
posted by raysmj at 1:54 PM on December 27, 2010 [4 favorites]


If there are 20 blinking lights suggesting extremely high levels of explosive gas, logic would suggest to just push the damn alarm and deal with the chance of a false alarm later. I can't believe that a) this alarm wasn't automated and b) the alarm wasn't activated.
posted by nickheer at 3:12 PM on December 27, 2010


Wow. A fantastic piece of journalism -- one for the history books.

Quite a few things come to mind while reading this. I'll rattle them off one by one, "stream of consciousness"-style:
  • Every person deserves a fair legal defense... HOWEVER, the lawyers mentioned in the article (and their clients) come across as complete scumbags. I would feel a whole lot better about this ordeal if somebody would have the balls to say "You're right, I should have pressed the alarm," even if there were extenuating circumstances, which there almost certainly were. I'd be a lot more inclined to forgive that person...
  • This entire incident comes across as so, very.... American. I only need to think back to the debate between the design philosophy of Boeing vs. Airbus jets, where modern Airbus planes have an extensive number of safety interlocks that prevent the pilot from deliberately endangering the aircraft, while Boeing jets will sound an alarm, but happily allow the pilot to accelerate the plane to the point where it breaks apart. Guess which one the macho American pilots prefer? "Safety culture" simply does not exist in America outside of unionized workplaces.
  • Which brings me to my next point... it is extremely obvious that employees at every level were scared to death of escalating issues to their managers, or of taking action without explicit prior approval. Having worked in a few such companies myself, I'll wager a guess that the majority of employees/managers on the rig were ex-military. This isn't necessarily a bad thing, but the strict hierarchy that it inspired prevented the employees from acting independently, or thinking for themselves. By the time the captain was ready to issue orders, it was too late to save the rig, despite there being multiple opportunities to do so along the way.
  • In every major disaster post-mortem I can think of, there is some sort of engineering failure that led to the disaster (duh). However, in each of these, there is either a failsafe that was deliberately disengaged (Chernobyl), or a manager who ignored a report about the exact engineering failure that caused the accident (Challenger, Columbia). I won't defend Chernobyl's design, but even the poorly-designed and poorly-constructed reactor shouldn't have exploded if it was being operated under reasonable conditions by competent personnel.
  • Transocean/BP did not maintain their blowout preventer. They need to be dragged over the coals for that, and legislation needs to be enacted that makes them legally responsible to do so (with the penalties necessarily exceeding the cost savings of neglecting the maintenance). The FAA requires aircraft to be inspected and maintained -- safety devices on ships/oil rigs should be no different.
  • In many of these major accidents, there is a tendency to blame the computer (or an operator), when neither or both were at fault. The June 2009 accident on the Washington Metro immediately springs to mind (NTSB report here -- a fantastic way to waste an afternoon). In short, two (computer-driven) trains collided, because a recently-installed piece of hardware caused the computer on the first train to not detect the second train standing still on the tracks. Immediately, Metro discontinued computer-controlled operation of their trains (and has not resumed to this day). Ultimately, the NTSB concluded that the nature of the failure was not with the computer, and that a human operator would have made the exact same mistake, given that the computer/human were both presented with the same information about the status of the track ahead, since the stopped train was located just around a blind curve. Even though the computerized train control system operated flawlessly, Metro has not reinstated its use, and continues to rely on human operators, which are almost certainly less safe. A great example of humans being spooked by automated systems, despite considerable proof that those automated systems are almost certainly safer than their manual counterparts. The human operator of the second train took a lot of heat too, as he had been involved in a number of past incidents. Fortunately, he had strong union representation, his case was allowed to be heard, and he kept his job, once it was determined that he did not contribute to the accident.
posted by schmod at 5:19 PM on December 27, 2010 [4 favorites]


I'm with you nickheer. Fire alarms are loud and annoying, but I never freeze like a deer in headlights when I hear one. I calmly exit the building (or remove the batteries, as my pomme frites are smoking, whatever).

You can make all the excuses you want -- and I certainly understand the perverse incentives of losing your job by sounding the alarm -- but that aside, I see no excuse for this. I won't blame one person. I blame them all... Correction - I mostly blame BP corporate, given the information available to me.

Nevertheless, I'm a little taken aback by the amount of "you couldn't push the alarm, either, if you had to do it with so many alarms, wahhhh" in this thread. I have no doubt that large numbers of people/personalities would not do well in this situation. However, I would expect a large percentage of properly screened employees trained in this very specific situation to handle it better than they did.
posted by mbatch at 5:26 PM on December 27, 2010


More wikileaks: Giant British energy supplier RWE built a nuclear facility in Bulgaria with massive ongoing safety issues.

RWE are German. Britain doesn't have a monopoly on crappy corporations.

I think what we'll see in the Gulf will be processes implemented that are similar to those in the Norwegian and British sectors. They will not be perfect, but they will be an improvement. We will still see humans finding new and inventive ways to screw things up, as no system is foolproof nor resistant to outside pressures such as the profit motive. These were good people, but subject to two conflicting objectives - one to be safe, the other not to waste time/money.

One factoid from when I worked in the North Sea. We had a day of training on Norwegian safety legislation, and the telling line in the Norwegian docs is that "operators shall make all efforts to ensure safety"; the equivalent UK docs had the line "operators shall make all reasonable efforts to ensure safety" after lobbying by the industry arguing that it imposed an unfair burden on their operations.

In that one word is a lot of profit and a Piper Alpha or Deepwater Horizon.
posted by arcticseal at 6:11 PM on December 27, 2010 [4 favorites]


However, I would expect a large percentage of properly screened employees trained in this very specific situation to handle it better than they did.

Part of the problem is that they weren't trained in this very specific situation:
The paralysis had two main sources, the examination by The Times shows. The first was a failure to train for the worst. The Horizon was like a Gulf Coast town that regularly rehearsed for Category 1 hurricanes but never contemplated the hundred-year storm. The crew members, though expert in responding to the usual range of well problems, were unprepared for a major blowout followed by explosions, fires and a total loss of power.

They were also frozen by the sheer complexity of the Horizon’s defenses, and by the policies that explained when they were to be deployed. One emergency system alone was controlled by 30 buttons.
posted by nomadicink at 6:21 PM on December 27, 2010


mbatch: "I'm with you nickheer. Fire alarms are loud and annoying, but I never freeze like a deer in headlights when I hear one. I calmly exit the building (or remove the batteries, as my pomme frites are smoking, whatever)."

That's fine, and you're right - as long as you've got only a few alarms to deal with. Remember, alarm signals are deliberately designed to grab and hold your attention. Throw up multiple alarms, however, and analytical thinking goes out the window in a fog of "hey, what? 20 simultaneous alarms?! That shouldn't be happening! I know intuitively how they interlink - but where do I start?!".

Which I've faced, albeit in completely non-life-threatening circumstances. Similar to what hank relates, the first step is to silence the current alarms - ideally, new alarms will still sound, and even more ideally you have someone whose job it is to eyeball those new alarms, ensure the senior officer is generally aware of them, and then RA them too. Then you can think, start analysing your little mental map of the sensor and alarm network, and begin to figure out what's going on to cause the problem.

And to be totally honest, when faced with an overwhelming panel of disparate but interrelated alarms, your first thought is "alarm system failure" and your second thought is "even if it's not, hitting the Big Red Button is a big responsibility". Your first attempt at analysis will almost always be to figure out where the common point is in the alarm system - both logically in the alarm chain (e.g. "5 of those alarms brings up one of these alarms") and physically (e.g. "OK, so they meet / run adjacent here, so a single failure here could cause all that") - and attempting to verify that that is in fact the issue. "Shit, all those alarms are real!" runs parallel, but second, to that.

In other words, attending to a panel full of alarms has as much relevance to resetting a smoke detector as flying through hyperspace does to crop-dusting…

Screening for the right type of people does help, yes - but remember you're after a very odd combination; people who can handle the boring mundanities for hundreds of days while reacting perfectly to the extraordinary the one time it happens, and enough of them to staff a control room 24/7. And, ideally, policies, responsibilities, and outcomes are clear - but I've never worked in or even heard of an organisation where they are. There are always conflicting policies to ensure everyone's arse is covered, conflicting responsibilities to cover your own or someone else's arse, and conflicting outcomes depending upon whose arse ends up getting covered.

(Besides - how many people, when faced with the single signal of a building fire alarm, sit there and wonder "Is that the alert, evacuate, or all-clear sound? Should I wait for the next one, see what everyone else is doing, or risk going outside and looking like a doofus?" ;-)
posted by Pinback at 8:06 PM on December 27, 2010


Another example of the conflicting priorities on rig crews. The two buttons used to activate the shear rams from the driller's shack usually had a cover on them to prevent accidental activation; I saw on several rigs that this cover had a label on top saying "Think Mortgage", as the crew knew that if they activated the shear rams without good reason, they were going to be out of a job. That kind of thinking gives you pause, and goes some way towards explaining the hesitation during that 9-minute period.

The 21 day hitches were largely phased out in the North Sea during the late 90s, but you can still do them if you get an exemption. I once did a hitch where I spent 3 weeks on Rig A, went in, caught a chopper to Rig B for 3 weeks that same afternoon, went in to the beach after that and caught the crew change for Rig A for another 3 week stretch - 9 weeks in total. At the end of it I was dog tired and certainly not operating at peak performance.

It's a well established fact that more accidents occur on crew-change days; people are tired at the end of their hitch, or already mentally halfway home and so not concentrating. Add to that short shifts as the night crew rotates to days, and accidents are much more likely.

I miss the work offshore as it was interesting, I miss the time off, and the money was good, but I don't kid myself that I'm not happier being home most nights with Mrs arcticseal.
posted by arcticseal at 9:21 PM on December 27, 2010 [2 favorites]


My takeaway from all this is that we, culturally, as a society, make it almost impossible for lower-echelon people to make definitive decisions. We don't have any incentives for low-ranking people to make decisions that might work. Actually, we (and I'm talking about "we" as a society here) seem to incentivize indecision and buck-passing.

Case in point: I used to work in a hospital. When I nurse makes a mistake (like giving too much insulin to a patient, for example), she's supposed to correct the mistake (give oral glucose or D-50), inform the patient's doctor and then submit herself for the appropriate punishment. There is no leniency for even simple mistakes either, since the hospital wants to appear very harsh on mistake-makers for liability purposes.

The effect this has is that nurses have an enormous incentive to conceal mistakes, rather than correcting them. Nobody wants to be punished or fired for making a routine mistake (and medication mistakes are quite routine), so the result is that nobody ever makes mistakes, ever, even when they do.

I'm trying to think of what we, as a society, could do to change this situation, but I'm coming up blank. Some kind of cultural sea-change where we move away from blame and towards a more general acceptance of responsibility? Some kind of system where people are guaranteed their jobs even when they make mistakes?

Neither one of those things is going to fly in our current culture, so beyond a massive cultural shift I have no idea how to fix it.
posted by Avenger at 9:43 PM on December 27, 2010 [1 favorite]


when a* nurse makes a mistake
posted by Avenger at 9:44 PM on December 27, 2010


Actually, my whole argument can be summed up thus: "Here in the United States, we give people enormous incentive to hide their mistakes and blame others. There's no way to fix this without becoming different people."
posted by Avenger at 9:45 PM on December 27, 2010 [3 favorites]


I get it, pinback, I do.. but you still missed my point. I wasn't really arguing that a single smoke detector is anything like 20+ alarms going off. What I was arguing is that there are a lot of namby-pamby, skittish, and touchy people out there that can't handle a smoke detector with any amount of decorum..

In the same way that there are some (few) people that can handle, say, the intensity of S&R training or, say, being a Navy SEAL. I suspect this number of people is getting smaller by the day as our society grows more coddled and entitled. Regardless, my comparison was meant to suggest that there are people who could handle 20+ alarms in a calm and relaxed manner, and there are even more people who could be trained to do so. These are the sorts of people they should be hiring.... and, of course, training.

Nevertheless, in our political climate, the cost of this disaster for BP is probably smaller, in the very very long run, than having well screened and highly trained personnel at all worksites, or, say, safety measures that meet regulations, or, maybe, a corporate (effective) policy that doesn't condone straight-up theft.
posted by mbatch at 11:37 PM on December 27, 2010


there is a natural tendency, reinforced by corporate thinking, to find people to blame for these things

Corporations aren't the only ones who pull this stunt.
posted by Blazecock Pileon at 3:48 AM on December 28, 2010


I'm trying to think of what we, as a society, could do to change this situation, but I'm coming up blank. Some kind of cultural sea-change where we move away from blame and towards a more general acceptance of responsibility? Some kind of system where people are guaranteed their jobs even when they make mistakes?

I'm not normally one to advocate unionization, but this is something that Unions are really good at doing. Union workers are better protected, both from physical accidents, and from egregious terminations due to minor mistakes.

The trick, however, is finding the right balance. There are plenty of unions that have gotten a bit too "cushy," and there are some that have worked to protect genuinely incompetent workers (much to the detriment of their many talented members). Performance should be rewarded, but workers shouldn't lose their livelihood over a minor mistake.
posted by schmod at 6:38 AM on December 28, 2010


This is what slays me. We don't put the 23-year-old programmers in charge of ANYTHING, and the only equipment we have around is a bunch of desktop computers. This was one of the largest, most complicated machines produced in the history of mankind, and we had one fresh-faced kid at the con?

Err, yes. You'll find something similar in all sorts of maritime/naval/military situations.
posted by Jahaza at 8:12 AM on December 28, 2010


I have more than a passing interest in these sorts of accounts because many of the same dynamics that led to this disaster also contribute to mishaps in the OR. For the same reason I also read NTSB reports about aircraft accidents and similar literature; I think it helps make my OR a safer place.

There are a number of things that keep coming up in these narratives. The first is that people are often reluctant to believe their instruments and when alarms go off they start trying to troubleshoot the instrument rather than make sure there isn't a problem. This is the sort of thing that leads pilots into a graveyard spiral when they refuse to believe their plane is not flying straight and level. And I see it all the time in the OR, where residents are so accustomed to false alarms that their first action on hearing an alarm is to silence it rather than check the patient, a hard habit for me to break them of. I remember specifically an incident in the heart room a few years ago when almost every patient monitor started beeping simultaneously. The resident started feverishly trying to silence the alarms and had no idea what was going on, when a simple glance at the heart (either over the drapes in the surgical field or via the echocardiography probe we had in place) revealed that the patient was fibrillating. The surgeon had it figured out and was calling for the defibrillator paddles immediately and things were back to normal in a few seconds, but the point was well made for my resident. In fact the whole topic of medical device alarms, especially in ICUs and ORs is a big and complex one, and there is no consensus on the best way to manage them. Based on the article, it appears that the people on the bridge were behaving just like the resident, as people above have noted.

Another dynamic seen here that occurs in many serious accidents is that a chain of failures is necessary before something happens. A series of incidents that individually can be dealt with soon spiral out of control (see aviation reference above) with bad results. For that reason I make it a policy to not proceed with a case if there is even a minor problem that cannot be explained or corrected before we start (you can't do this in an emergency, of course, which is one reason emergency surgery is slightly more risky). Before about 1980, when oxygen monitoring became standard when administering an anesthetic, one of the main causes of death or serious injury during anesthesia was failure of the oxygen supply. Since then that problem has become almost unheard of. Now many people take the reliability of the oxygen supply for granted, but I will refuse to do a case without a functioning oxygen monitor rather than forge the first link in a potentially dangerous chain of events. The same goes for other malfunctioning equipment, missing or unexpected lab results, and so on.

Then there is the group dynamic at work in which people are unwilling to be the first to call attention to a problem if there are others around who can stick their necks out instead. There is a body of research indicating that witnesses are less likely to report a crime if there are others around; there was a surveillance video shown on the science program Nova several years ago in which some kids set a fire in a candy store as a distraction while they stole some candy. In the video the other customers looked at the fire but then went about their business until it was nearly too late to escape. Unfortunately I can't find web links for these. In the OR this behavior can occur when there is uncertainty as to what procedure is to be done on a patient and the staff is accustomed to deferring to the surgeon rather than speaking up. This has caused serious problems. In response the main accreditation organization for US hospitals has come up with a protocol to prevent these mistakes (PDF poster).

I find this whole field fascinating and appreciate you posting this article. For anyone who wants more narratives like this, I recommend the book Set Phasers on Stun, about mishaps in a variety of situations.
posted by TedW at 8:26 AM on December 28, 2010 [6 favorites]


This looks like what you're talking about, TedW: Kid Lights Fire In Store To Steal Candy
posted by mkdg at 10:16 AM on December 28, 2010 [3 favorites]


That is exactly what I was looking for! Thanks, mkdg.
posted by TedW at 1:56 PM on December 28, 2010


schmod, don't your vaunted Airbus computer controls have a history of flying passenger planes into hillsides and forests?
posted by vsync at 10:29 PM on December 29, 2010




This thread has been archived and is closed to new comments