Normalizing Deviance

Smoke from one of the shuttle Challenger’s solid-rocket boosters was the first visual evidence of a problem. A myriad of management and human factors issues within NASA allowed important safeguards to be ignored, leading to an attempted launch in weather too cold to guarantee the boosters’ integrity.

Humans are really good at rationalizing. We do it all the time, every time we cut corners, break rules or ignore evidence in pursuit of a successful outcome. We continue because it often has no consequences. The thing about rationalizing, though, is that it can change our behavior. What once seemed wrong starts to feel normal, and outcomes that may be mostly due to good luck start to feel more like skill and deep understanding. In aviation, rationalizing can result in disaster.

In her book “The Challenger Launch Decision,” which examined the shuttle Challenger disaster and its root causes, sociologist Dr. Diane Vaughan coined the term “normalization of deviance.” She defined it as “the gradual process through which unacceptable practice or standards become acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.”

You may recall that NASA officials chose to repeatedly fly the space shuttle despite knowing of a design flaw involving the booster rockets’ O-rings and their behavior in cold weather. The groupthink that led NASA to accept the risk was the defining case of normalized deviance.

Seven years after Vaughan’s book was published, the same mode of failure struck NASA again with the shuttle Columbia. After 22 years of shuttle flights, it was well known that foam blocks breaking off the external tank during launch would strike and often damage the orbiter’s thermal shielding. There had never been a serious consequence, so it came to be viewed as a maintenance issue rather than a safety issue. This second normalization of deviance also ended in catastrophe.

Normalizing

Academic safety experts primarily write about the normalization of deviance as an issue that affects corporate safety cultures and organizations, but it is just as applicable to individuals.

Humans are great at rationalizing our actions and normalizing our own deviations. One of our behavioral norms is to drive safely and follow traffic laws, but sometimes we make exceptions. On a busy interstate, I might decide to drive over the speed limit by rationalizing that it is safer to stay with the flow of traffic (and maybe it is). But when the other lanes become empty, am I still driving above the posted limits? If I am, my SOP of following the speed limit has been subverted. My rationalized excuse is gone, but the behavior remains. I am headed down the slippery slope of normalization of deviance. Worse, I haven’t even asked myself whether I was justified in breaking the law in busy traffic or just in a hurry and looking for an excuse.

In a post-accident safety presentation, “The Cost of Silence: Normalization of Deviance and Groupthink,” NASA’s Chief of Safety and Mission Assurance stated, “There is a natural tendency to rationalize shortcuts under pressure.” And pilots are almost always under some form of pressure. Staying ahead of weather, managing fuel, minimizing headwinds and arriving prior to nightfall are all pressures that might lead us to cut corners in the name of safety. And there’s nothing like a good, juicy rationalization.

Rationalizing

Checklist discipline is an area where individual pilots can develop bad habits. We may be creatures of efficiency and discipline, but we don’t like unnecessary tasks. Checking pitot heat on a CAVU day seems redundant. Checking aircraft lights for a day flight seems unnecessary. It becomes pretty easy to disengage, stop reading the excessive details and subsequently miss critical items. It doesn’t help that checklists can be overburdened with details plugged in by lawyers to protect the manufacturer from liability rather than to improve pilot safety.

Checklist discipline is completely lost when pilots start normalizing deviance by blowing past items that appear unnecessary. In a 2014 fatal runway overrun in Bedford, Mass., the crew of a Gulfstream IV must have been so comfortable pencil-whipping their checklists that they missed four (!) separate opportunities to remove the gust lock that remained in place on takeoff. That the gust lock appears on the checklist four times is both a testament to how important it is to disengage it and an explanation of why it may have been ignored: If you do it right the first time on most flights, the other three times seem annoyingly redundant.

Desensitizing ourselves to accept greater risk can be insidious, building for years before disaster strikes. Sidney Dekker, a professor at the Safety Science Innovation Lab at Griffith University in Brisbane, wrote about this in “The Field Guide to Understanding ‘Human Error.’” He describes the normalization of deviance as “drift.” He says that Murphy’s law is wrong: “What can go wrong usually goes right and then we draw the wrong conclusion: that it will go right again and again, even if we borrow a little more from our safety margins.” NASA came to a similar conclusion: “The lack of bad outcomes reinforces the rightness of trusting past success instead of objectively assessing risk.”

In other words, when outcomes are successful, it reinforces the natural human tendency to focus on the results and assume the steps leading to the outcome were correct. By extension, if a shortcut was taken, it worked. We conclude that it will likely work again.

Luck ≠ Proficiency

Another challenge is discerning where experience, personal growth, learning and gains in proficiency cross the line into deviance. Nearly all of us are continuously learning our limits and those of the aircraft we fly.

Take a pilot landing a Cessna 206 in a 29-knot crosswind at full gross. This is well beyond the demonstrated crosswind component of the aircraft, but the pilot has 2000 hours in 206s and this particular aircraft has VGs installed, so it has a bit more rudder authority than a stock Cessna. A plane at full gross has a bit more inertia so it is less apt to get tossed around by gusts. After the pilot safely lands the plane, he thinks he’s the next incarnation of Bob Hoover and can handle this combo just fine. Was it skill or luck? My guess is that 95 percent of pilots will take the ego stroke before considering that it might have been good luck, that maybe the wheels touched down right as the gusty wind backed down. Rationalization was just reinforced and deviance normalized.

The danger is starting to think, based on a good outcome, that we can do 29-knot crosswinds in any 206 with any load, or without VGs.

Inoculating Yourself

Sidney Dekker says the best practice for avoiding drift, the slide into the normalization of deviance, is to stay chronically uneasy. As pilots, that means maintaining a skeptical and questioning attitude about our own competence and discipline. The problem for us as individuals is that preventing our own deviance from becoming normalized is very difficult when we are the subject of the insidious behavior modification. It is hard to objectively see risks when our history of safe flights desensitizes us. It just reinforces that our decisions, or shortcuts, are safe enough, our skills are up to the task and our past performance provides evidence for a safe outcome.

While NASA’s findings after its shuttle disasters were written for an organization or team, they are just as pertinent for individuals. Below, I have paraphrased and adapted the most important conclusions from NASA:

Beware the illusion of invulnerability. (When NASA engineers raised the possibility of O-ring blow-by, it was said the risk “was true of every other flight we have had.”) If you feel invulnerable, are you really that good all the time or just lucky? Would a new pilot accept the risk you are about to accept? If not, why not?

Prove to yourself you ARE safe rather than seeking proof that you are NOT. Be skeptical of your own thinking. Are you just assuming you are safe? What is your evidence?

Check your rationalization. What is novel about this flight that you are ignoring? What is not routine? What are the most likely modes of failure? If the weather is changing (which it usually is), assume it will be worse. Instead of approaching your flight with confidence, assume it will go bad. Be prepared to meet your incompetence.

Listen to skeptics. “Are you really flying in this weather?” If others cast doubt on your proposed flight because it makes them uncomfortable, ask yourself why. Are you really that much better than the person questioning your judgment? Is your plane really up to the task? Do they consider things you may have skipped over in your delusion of confidence?

Self-censor. Are you about to violate a FAR, your own personal minimums, or some reasonable limit, for the sake of expediency? Maybe you should reconsider.

Don’t accept silence as agreement. Are you making assumptions or have you actually checked?

Don’t ignore dissent. If the briefer is saying VFR is not advised, maybe you should file or possibly cancel.

Listen to experts. Bounce your plans off others who are more experienced or perhaps more cautious. You might learn something.

I wince a little about all this nervous-Nellie advice to live in doubt and skepticism because I have never enjoyed listening to buzzkill from wet-blanket, fun-police types treating me like a student pilot and standing in the way of my plans. That said, they deserve some consideration because there is plenty of evidence they are a voice of experience and reason and maybe, just maybe, they have a point I am missing.

It would be nice if circumstances leading to accidents were more obvious, if warning bells or lights went off every time we became overly self-satisfied with our skills or complacent with our decisions and actions. But it doesn’t work that way. The best we can do is to recognize that as we take shortcuts, or as we accept greater aviation challenges commensurate with our increasing skills, we may also be normalizing some of our deviance. Even if we did just nail that 29-knot crosswind landing, it may not be wholly due to our greater skill and proficiency; a bit of luck may have played a part. None of us are Bob Hoover.


Cockpit Rationalizations

There are a lot of areas common to pilots that are ripe for rationalization. Some examples:

SOP: “Sump the tanks before each flight.”

Rationalization: “I sumped the right one; I’m sure the other one is fine.”


SOP: “Check the oil before each flight.”

Rationalization: “I just changed the oil last week, and I haven’t flown since then.”


SOP: “Calculate fuel burn for the intended trip using most recent weather data.”

Rationalization: “It always takes 20 gallons for this flight, I’ll have enough.”


SOP: “Weigh all the payload, and calculate the weight and balance before each flight.”

Rationalization: “It wasn’t out of CG last time with a similar load, so why do the calculation?”


SOP: “Field is IFR, obtain an IFR clearance.”

Rationalization: “I am only 5 miles from the airport; it will be safer to land before it gets worse.”


Aviation Is Ripe for Drift

Normalization of deviance can be found as a root or contributing cause in numerous infamous industrial and transportation accidents, including many aircraft mishaps. It is relatively easy to identify after the fact, much like lining up the holes in the Swiss cheese or following the links-in-a-chain accident models: once we know the outcome, it is easy to work backward to the root causes. After an accident, we ask, “How did we get here?”

Uncovering normalization of deviance and exposing it to light prior to the accident is much more challenging. An organization can hire outside auditors and evaluators. The primary way an individual uncovers the gradual drift from the norm is through critical self-examination.

We pilots are a pretty disciplined lot all in all, and we generally don’t just flagrantly stop following rules and norms. I would venture to say that when we do, we do it slowly, often with good rationale and lots of reinforcement.

That is what makes it hard to recognize the slippery slope that we may be on, or break that link in the chain, to prevent an otherwise-inevitable accident.


This article originally appeared in the December 2019 issue of Aviation Safety magazine.

Mike Hart, Contributor
Mike Hart is an Idaho-based flight instructor and proud owner of a 1946 Piper J-3 Cub and a Cessna 180. He also is the Idaho liaison to the Recreational Aviation Foundation.