In 1974, TWA Flight 514 was flying into Washington Dulles Airport. As it came in to land, the air traffic controller said “cleared for approach”. The flight crew thought it meant one thing (we will guide you in), but the control tower meant something ever so slightly different (guide yourself in). It was a simple miscommunication.
Flight 514 flew straight into the side of Mount Weather, 510 meters above sea level.
92 people died.
There wasn’t a single survivor, not one, all because of a misunderstanding over a couple of words.
The accident that should never have happened
If you have ever worked in accident prevention you will know about the accident pyramid. It is a simple idea:
- For every fatal accident there are a few serious accidents
- For every serious accident there are several minor accidents
- For every minor accident there are lots of near misses
The causes of all these incidents are the same, but the outcome depends on a myriad of other circumstances.
So the theory goes that if you want to avoid fatalities, you should get obsessive about the near misses. Every time you have one, report it, remove the cause, and you will reduce the chances of a fatality.
Flouting the law
I’d love to tell you this was my idea, but it was first put forward by H. W. Heinrich in 1931.
The people responsible for air safety at the Federal Aviation Administration in the 1970s knew all about Heinrich’s pyramid. They also knew exactly how to use it. They had even mandated that every near miss be reported; it was the law.
But the law was flouted.
During the investigation into flight 514 it became clear that this wasn’t the first time that Air Traffic Control’s instruction had been misunderstood. Six weeks earlier a United Airlines flight had very narrowly missed the same fate, brushing the tree tops on the same hillside, for exactly the same reason.
Yet the pilots of the United Airlines flight had kept quiet.
Why? Didn’t they care about safety?
Lack of trust in the authorities
It wasn’t that the pilots didn’t care, just that they didn’t trust the FAA. Not for a second.
Why not?
Because, as well as being responsible for collecting data on near misses and accident reporting, the FAA were also responsible for policing the aviation industry. They had the power to fine, imprison and revoke the licences of pilots they caught doing wrong.
This meant that a pilot had two options after a near miss:
- Speak up and risk losing his livelihood
- Shut up and hope nobody else had the same misfortune.
What would you do? Human nature at work.
How do you remove fear?
NASA came to the rescue. They took on responsibility for collecting near-miss information using the “Aviation Safety Reporting System”, a system that removes fear and blame:
- Pilots report incidents, but NASA has no power to punish them.
- NASA strips all the incriminating details from the reports that they produce. Pilots remain anonymous.
- If a pilot is investigated by the FAA and he can show he reported the incident then it counts in his favour. The FAA sees the report as evidence of a “constructive safety attitude”.
NASA’s reporting system has a strapline on its website:
Confidential Voluntary Non-Punitive
It says it all.
The moral of this story
Like all good stories, this one has a moral, and it is worth spelling out.
Your process, policy or procedure might well have the best intentions in the world. But unless it allows for human nature and makes it in people’s best interests to comply, they categorically won’t.
Something to bear in mind when you design your next process.
Adrian Swinscoe says
Hi James,
NASA seems to have got to the heart of the (dis)incentive that stops people participating in a system that is designed to help them. A process with a human heart and soul. Nice.
Adrian
maz iqbal says
Hello James
Really enjoyed reading and learning from this post. And it reminds me of one of Deming’s 14 principles: “Drive out fear!” The system put in place by NASA did just that.
maz
James Lawther says
Thanks Maz. I suspect a lot of what I write comes back to one of those 14 principles, one way or another.
James
Karen Jones says
What a totally horrific story and a fantastic point. Thank you for posting it – Karen
Jamie says
Terrific. Did that air traffic controller get punished for not communicating properly?
James Lawther says
Now that is an interesting question. I don’t know, but I guess the answer should depend on how you feel about blame cultures.
Thanks for your comment
James
Duncan McDougall says
Funnily enough, the Royal Navy got round this more than sixty years ago. They had, and still have, a number of formal safety reporting systems, but in the background is another one called ANYMOUSE – which, as the name implies, is anonymous. People are trusted not to abuse it by putting in spurious reports (and they generally do not, because they understand the consequences of crying ‘wolf’ too often).
I believe the US Air Force has a similar system, which also lets people blow the whistle if their superiors are faking things like efficiency reports. I’d be interested to know if anybody has any information about this.
James Lawther says
Thanks for the comment Duncan. I hadn’t heard of that.
I had a quick search on the US Air Force and all I could find was this: http://connection.ebscohost.com/c/articles/19609453/anymouses-anniversary
James