Flight 1549
On the 15th of January 2009, US Airways Flight 1549 struck a flock of Canada geese as it was climbing away from New York City’s LaGuardia Airport. The plane lost all engine power. The pilots, Chesley Sullenberger and Jeffrey Skiles, were forced to ditch in the Hudson River off Midtown Manhattan. Boats rescued all 155 passengers and crew. Only five people were seriously injured.
An official from the National Transportation Safety Board described the event as “the most successful ditching in aviation history.”
The captain, Chesley Sullenberger, was showered with honours. President-elect Barack Obama said that he was proud of Sullenberger’s “heroic and graceful job in landing the damaged aircraft”.
Was the pilot a hero?
I wouldn’t have kept my nerve in a situation like that. But, according to Matthew Syed in his book Black Box Thinking, in a television interview months after the event Sullenberger was sage enough to say:
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . .
We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.
His point was simple: learn from your mistakes.
Wishful thinking
It is obvious that we should learn both from our own mistakes and from those of the people around us, but few organisations do.
In most cultures failure and blame are deeply interlinked. It is a brave man who is prepared to put his hand up and admit to his failures.
Should we blame those who fail?
There are a whole host of reasons for failure:
- Deliberately ignoring an instruction
- A mistake caused by lack of attention
- Poor training
- A complicated or faulty process
- Unforeseeable events
- An unsuccessful experiment
In how many cases is the failure blameworthy? Is it fair or wise to blame people regardless of the cause?
Lessons learnt
Senior managers who clamour for “lessons learnt” after a problem don’t help the situation. The authors (the people being clamoured at) will only be interested in minimising their personal exposure. All they will produce is a vague list of truisms and first-order reasons for failure:
- Staff error
- Lack of communication
- Poorly captured requirements
- A lack of relevant experience
- Poor sponsor engagement
A list so sterile and banal that it is useless.
You can only learn if you stop thumping the table and institutionalise the process, making it normal for people to ask “why?”, not “who?”
Institutionalised learning
Some organisations have made admission of mistakes the norm:
Stop the line:
Operations staff at Toyota are allowed to stop the production line if they find anything that isn’t right. An idle production line is very expensive, so the urge to fix the failure, and find the cause so it doesn’t happen again, is strong.
Embrace the messenger:
A Fortune article tells the story of Alan Mulally, who became chief executive of Ford in 2006. Mulally introduced a “traffic light” system to weekly meetings: executives reported progress as red, amber or green. For four weeks, all Mulally saw was a sea of green. He challenged his team: “We are going to lose $18 billion this year, so is there anything that’s not going well?” His question was met with silence.
A week later Mark Fields, Ford’s North American President, produced a red report, admitting that a vehicle launch would be delayed. The other executives around the table assumed Mulally would flip and that this would be the end of Fields’ career. But Mulally applauded him and asked the rest of the group what they could do to help. After that, the weekly staff meetings were full of colour.
You have a problem; you are not the problem. ~ Alan Mulally
Challenge teams to fail:
At X Development, a research-and-development facility founded by Google, teams are urged to try to kill their own projects by hunting for their weaknesses first. Better to fail quickly than to ignore the big issues with a project and sink millions into it.
Incentivise reporting:
In the US, NASA runs the Aviation Safety Reporting System to capture airline incidents. Reported incidents are treated in confidence and anonymised before they are analysed.
If the Federal Aviation Administration investigates an incident and the pilot has already reported it, that is treated as a factor in the pilot’s favour. It shows the pilot has a “constructive safety attitude”.
Failing to learn
How you capture the failures doesn’t really matter. But if you don’t encourage people to come forward and talk about problems, you implicitly create a culture where it isn’t acceptable to speak up or make a mistake.
Where will that take you?
Learn from the mistakes of others. You don’t have time to make them all yourself.
~ Eleanor Roosevelt
Philippe Touati says
Great article on lessons to learn. The approach described here is very similar to the one taken by Bridgewater. It requires a different culture: a culture of radical transparency.
James, you should describe other examples.
James Lawther says
Thank you Philippe, I will have a look and see what else I can find.
Steve Brett says
Blame the process, not the people. This needs to be driven from the top through actions. If people fear the repercussions, they will stop issues bubbling up.
James Lawther says
Unfortunately it is easier and more instinctive to blame the people. Thanks for reading.
Emma Robbins says
This is a very well observed piece, as usual!
I just wish the government would listen to this sentence…
“Better to fail quickly than ignore the big issues with a project and sink millions into it.”
James Lawther says
Ah, but if they did that, Emma, politicians would have to admit they didn’t have all the answers, and I’d have nothing to write about…