World War 2
During the Second World War over 60 million people were killed, roughly 3% of the world's population. It was a hazardous time.
Amongst the hardest hit were the bomber crews. The Eighth Air Force suffered half of all the U.S. Army Air Forces' casualties. The British fared no better. The chances of surviving the war as a member of the RAF's Bomber Command were only marginally better than even.
If flying bombing raids wasn't dangerous enough, getting back onto the runway carried risks of its own. Pilots of the Boeing B-17 Flying Fortress suffered a series of runway crashes, accidentally retracting the landing gear after they touched down.
Human error
There was no obvious mechanical failure, so accident investigators blamed these incidents on pilot (or human) error.
It wasn't only Flying Fortresses that had the problem. There were stories of pilots of P-47 Thunderbolts and B-25 Mitchells making the same mistake.
Nobody would deliberately retract the landing gear while still hurtling across the tarmac, so why the pilots did it was anybody's guess. Perhaps their attention wandered once they realised they were almost home.
Design error
The authorities asked Alphonse Chapanis, a military psychologist, to explain the behaviour. He noticed that the accidents only happened to certain planes and not others. There were thousands of C-47 transport planes buzzing about, yet their pilots never suffered from such fatal inattention.
Once he inspected the cockpits of the different planes, the cause became clear. On the B-17 the controls for the flaps and the undercarriage sat next to one another and had the same style of handle. Pilots who retracted the undercarriage while the wheels were on the ground were actually trying to retract the flaps. They simply pulled the wrong lever.
In the C-47 the two controls looked very different and were positioned well apart from each other.
The solution
Once he identified the cause, Chapanis developed an equally simple solution. Circular rubber disks were stuck to the levers for the undercarriage and triangles were stuck to the levers for the flaps. When a pilot touched the rubber he could feel the difference, and the crashes stopped.
To err is human
The pilots knew perfectly well which lever to pull. The reports called it "human error", but laying the blame on the pilots was never going to stop the crashes. Fixing the design did.
We all make mistakes. It is in our nature. Don’t fight it, fix it.
Image by Joe A. Kunzler
John Hunter says
These types of process improvement combine data, an understanding of the process and an understanding of people. I love seeing such system improvements that create more reliable processes using an understanding of data and of the system in question.
James Lawther says
Thanks for the comment, John; it appealed to me as well. I think there is so much done in the name of "process improvement" that fails to take the human into account.
Dennis Gershowitz says
Good reminder… I would also add that it is important to consider all the key types of personas and their different needs. My experience is that a faulty process fails many of us, but the improvement may not have considered the differences among us.
James Lawther says
That is a very interesting perspective, Dennis. If you have some examples, I'd love to hear them.