There are some jobs you really wouldn’t want…
Perhaps the most dangerous in history was to be a crew member of a World War II bomber (I don’t suppose it matters which side).
You shouldn’t believe everything you read on the internet, but according to some of the more reliable sources, during World War II:
- Over 12,000 Bomber Command aircraft were shot down
- 55,500 aircrew died
- The life expectancy of a Lancaster bomber was 3 weeks
- Tail-gunners were lucky if they survived four missions
A terrible waste of lives…
But the insistence was that bombing was critical to winning the war, so all this destruction led to a very pressing question:
How do you cut the number of casualties?
Arguments raged:
- One line of thought was that the bombers should be more heavily armoured, reducing the damage wrought by anti-aircraft batteries and enemy fighter planes
- Another was that they should be lightly armoured so they could fly fast enough to avoid the lethal bullets in the first place
What was the solution?
The Hungarian-born mathematician Abraham Wald solved the problem by plotting the bullet holes found on battle-scarred planes on a diagram. It looked something like this:
The wings and fuselage presented the biggest target and were most likely to be hit, so those were the areas that would most benefit from being armour-plated.
A fairly straightforward deduction to make.
But Abraham Wald disagreed. His belief was exactly the opposite: the armour plating should be placed in the areas that hadn’t been hit.
He argued that the sample was biased.
What is sample bias?
Any sampling method that systematically favors some outcomes over others is biased. All the analysis was carried out on planes that returned, not those that were shot down.
Being hit in the areas that were undamaged was far more likely to prove fatal. The unmarked areas were the ones in need of armour plating.
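The logic of the story is easy to see in a small simulation. The sketch below is purely illustrative (the areas, survival odds, and hit counts are invented assumptions, not historical data): hits land uniformly across the aircraft, but hits to the engine and cabin are more often fatal, so the planes that make it home show damage mostly on the wings and fuselage.

```python
import random

random.seed(42)

AREAS = ["wings", "fuselage", "engine", "cabin"]
# Hypothetical chance of surviving a single hit in each area (illustrative only)
SURVIVAL = {"wings": 0.95, "fuselage": 0.90, "engine": 0.40, "cabin": 0.30}

def fly_mission():
    """A plane takes a few hits in uniformly random areas;
    it returns home only if it survives every hit."""
    hits = [random.choice(AREAS) for _ in range(random.randint(1, 5))]
    survived = all(random.random() < SURVIVAL[area] for area in hits)
    return hits, survived

all_hits = {area: 0 for area in AREAS}       # what actually happened
observed_hits = {area: 0 for area in AREAS}  # what the analysts could count

for _ in range(10_000):
    hits, survived = fly_mission()
    for area in hits:
        all_hits[area] += 1
        if survived:               # only returning planes can be inspected
            observed_hits[area] += 1

print("all hits:     ", all_hits)
print("observed hits:", observed_hits)
```

Although every area is hit roughly equally often, the observed counts are dominated by wing and fuselage damage: the engine and cabin hits are under-represented precisely because those planes rarely came back, which is Wald’s point.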
A tragic story but why tell it?
It highlights a number of points:
- Data is more useful than speculation
- The Pareto principle holds true — some things are always more important than others.
- Visual analysis (plotting where things have gone wrong) is a useful ploy.
- You should always worry about sample bias.
But perhaps the most important point…
Always apply a little common sense: hits to the cabin and engine are less dangerous than hits to the fuselage? Really?
john.allen says
True. I still fly as a crew chief on a B-25. The old quote: the bullet comes in one way and goes out the other, no protection. The Pareto test is good. You have to remember the bulk of most armour was around the pilot.
James Lawther says
Thanks for the comment John
Adrian Swinscoe says
James,
I think the same logic could be applied to complaints – those that are received/given and those that are not.
Adrian
John Hunter says
I first heard this example from George Box years ago. It has stayed with me ever since. Using data requires thinking. It is often hard to get people to use data more in the first place. But even when you do, the way it is used still leaves a great deal to be desired. Examples like this help people realize that critical thinking is necessary to understand what the data is showing you.
James Lawther says
Thanks for your comment John, we do appear to be wired to jump to conclusions rather than think things through
Vance Butler says
To comment on John Hunter’s point: people need to realize there is a difference between data and information. They are not the same, but not mutually exclusive either; you need both, and one informs the other. Sometimes it can be hard to get users of data to separate the two.
James Lawther says
Yes Vance, and I often find that for many the solution is simply more data, thanks for your comment
maz iqbal says
Hello James,
Excellent illustration of the biases in our thinking, and of how real thought means staying with the challenge-question longer than most of us are willing to. Or put differently, sitting in the question rather than jumping in with the first answer. Some say that our natural wiring is not designed for this kind of analytical thinking.
It occurs to me that it is worth listening to Karl Popper here. Specifically, operating from the standpoint that all knowledge is provisional: not true so much as not yet shown to be false. And that the scientific way to deal with any hypothesis is to seek to refute it, not to prove it.
All the best
maz
James Lawther says
I like the idea of all knowledge being provisional Maz, that is very astute.
David Cary says
I am finishing a book. As lead in to how we often misinterpret data, I reference the Wald story.
I would like to use the image you used. It sounded like you might have created it. If so, may I use it in my book?
If not, could you tell me where you got it?
Thanks,
James Lawther says
I mocked it up David, there are other versions on the net but I didn’t want to infringe anybody’s copyright. Help yourself.
Good luck with the book, let me know when you publish.
JL