Avoiding Overwhelm
In their book Meltdown, Clearfield and Tilcsik show how complicated systems fail. They discuss nuclear accidents, financial disasters and medical mishaps, showing how we can avoid catastrophe by reducing complexity.
The more complex our systems become, the harder they are to understand, and so the harder they are to fix when they break. Add to that complexity our desire to link and automate everything, and it is easy to see how unexpected events can cause horrific domino effects. Unforeseen events can rapidly spiral out of control. Think about the global financial crash of 2008 or the incident at Three Mile Island and you will understand exactly what I mean.
Rationalising alarms
One way to stop events from running away from you is to rationalise warnings and controls. Remove all the unnecessary alarms, warning bells and whistles. This sounds counterintuitive, but it keeps people focused on what is important, rather than distracting them with things that aren’t.
A real-world example
Clearfield and Tilcsik discuss the cockpit of an airliner.
Imagine being an airline pilot faced with one of these nightmare scenarios:
- A fire in the engine.
- Landing gear that isn’t down when coming in to land.
- A plane that is about to stall aerodynamically.
- An engine that stops running.
How would you like to be alerted? Which alarms and warnings would you like triggered?
My instinctive reaction is that I want the whole nine yards when any of them occurs. I’d like to be in no doubt at all that something is wrong. They all sound dire to me.
Yet that is not what happens. The engineers at Boeing only turn on all the warnings when the aeroplane is about to stall. If that happens, alarms sound, red lights flash and a red text message appears on the pilot’s screens. If that isn’t enough, the control sticks also start to shake violently.
The pilots are left under no illusion that something bad is about to happen. And it would be bad. If an aircraft stalls, it will drop out of the sky. The lengths the engineers have gone to are quite impressive: they capture the pilots’ undivided attention.
But that isn’t the clever bit. What is really interesting is the fact that nothing else triggers all the alarms. An engine fire sets off warning lights and text messages whilst a bell sounds. The control sticks, however, don’t start to shake. An engine fire is (I imagine and hope never to find out) a serious event on an aircraft, but it isn’t as pressing an issue as a stall. The plane won’t plummet out of the sky that instant. Pilots have a little time to react.
Less serious incidents trigger less serious alerts with fewer noises or lights. Amber text messages instead of red.
The logic is simple — don’t burden pilots with more information than they need. The designers resisted the desire to add more and more warnings. Instead, they set about reducing complexity and pared back the alerts so that the pilots’ attention is always drawn to the most pressing issue. This reduces overwhelm and increases the probability of a safe landing.
Attention to detail
The rigour that goes into the design process is very impressive.
Boeing engineers work very hard to trigger the right reactions, which sometimes requires an even more nuanced approach.
For example, an engine that quits during the early portion of the takeoff roll requires quick reactions by the pilots to stop the airplane on the runway, so the warnings include red warning lights, a red text message, and a synthetic voice shouting “ENGINE FAIL.”
A few seconds later, as the plane accelerates, there isn’t enough runway to stop, so the airplane automatically inhibits all these warnings except the text message. This is done to avoid triggering the pilots into trying to stop the plane when that cannot be done.
And if an engine fails while the plane is in stable cruising flight, it only sets off amber lights, a beep sound, and an amber text message.
Chris Clearfield and András Tilcsik
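The tiered, phase-dependent logic the authors describe can be sketched as a simple decision rule: pick the least intrusive alert that still prompts the right action. The following is a toy illustration only; the event names, flight phases and alert tiers are invented for the sketch and are not Boeing’s actual system.

```python
# Toy model of tiered, phase-aware alerting (illustrative only).
from enum import Enum

class Alert(Enum):
    MASTER = 3    # red lights, siren, stick shaker: demands immediate action
    WARNING = 2   # red lights, bell, red text
    CAUTION = 1   # amber lights, beep, amber text
    ADVISORY = 0  # text message only

def alert_for(event: str, phase: str) -> Alert:
    """Return the least intrusive alert that still prompts the right action."""
    if event == "stall":
        return Alert.MASTER            # only a stall gets everything
    if event == "engine_fail":
        if phase == "early_takeoff":   # still room to stop: full warning
            return Alert.WARNING
        if phase == "late_takeoff":    # too late to stop: inhibit to text only
            return Alert.ADVISORY
        return Alert.CAUTION           # stable cruise: amber, not red
    if event == "engine_fire":
        return Alert.WARNING           # serious, but no stick shaker
    return Alert.ADVISORY

print(alert_for("engine_fail", "late_takeoff").name)  # ADVISORY
```

The point of the sketch is the shape of the logic, not the details: the same event maps to different alert levels depending on what the pilots can usefully do about it at that moment.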
What happens in your world?
How are your monthly management meetings and risk and control sessions? Are they pared right back, so that you are left in absolutely no doubt what the issues are? Or does your organisation prefer the “more is more” approach, throwing as much information at you as it can? Just in case…
Any fool can make something complicated. It takes genius to make it simple
Woody Guthrie
Try the book Meltdown