On November 28, 2016, a LAMIA Bolivia Avro RJ-85, registration CP-2933, performing charter flight LMI-2933 from Santa Cruz (Bolivia) to Medellin (Colombia) with 68 passengers and 9 crew, crashed, killing 71 people. The flight had been chartered to carry the Brazilian football team Chapecoense to play the finals of the 2016 Copa Sudamericana. Three players, one flight attendant, one technician and one journalist survived.
It quickly became evident that the plane had suffered fuel starvation. The flight plan was unofficially leaked to news channels, evidencing several no-go issues, the most relevant being that the Total EET (04 HR 22 MIN) had the same value as the ENDURANCE (04 HR 22 MIN). When a Bolivian Aviation Authority officer questioned the flight's dispatcher about the issues, he asked her to let it pass, arguing they would complete the flight in less time, as they had done before, and eventually the flight was authorized.
Main wreckage of LAMIA Bolivia Avro RJ-85 CP-2933, crashed on approach to SKRG-MDE (Photo: AP/Luis Benavides)
This is the continuation of The Organizational Influences behind the aviation accidents & incidents
Normalization of Deviance
“Social normalization of deviance means people within the organization become so accustomed to the deviation that they don’t consider it deviant, despite the fact that they far exceed their own rules of elementary safety.” Diane Vaughan, 1996 – Challenger Accident Investigation (The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago, IL: University of Chicago Press, 1996)
Normalization of deviance is the gradual process by which, in the absence of immediate adverse consequences, the unacceptable becomes acceptable. It refers to unnoticed failures that do not cause immediate harm and that permeate everyday work, becoming routine behaviour. In other words, “The shortcut slowly but surely over time becomes the norm.” Chris Sharber, 2015 (first officer and flight simulator instructor, Boeing 777 fleet, at the United Airlines Training Center in Denver)
The term can be applied as legitimately to the human factors risks in airline operations as it was to the Challenger accident, where it was first used.
“Normalization of deviance breaks the safety culture, substituting a slippery slope of tolerating more and more errors and accepting more and more risks, always in the interest of efficiency and on-time schedules. This toxic thinking often ends with a mindset that demands evidence that these errors would destroy the vehicle, instead of demanding proof that the shuttle is safe and not being harmed. The boundaries are soon pushed to extremes without understanding where and why the original limits were established.” (Westgard JO, Westgard S. It’s Not Rocket Science: Lessons from the Columbia and Challenger Disasters. Guest Essays. Available at: http://www.westgard.com)
It is invisible and insidious, common and pernicious. People tend to ignore or misinterpret the deviations as an innocuous part of the daily job. If the deviations also save time and resources and reduce costs, they can even be encouraged by managers and supervisors. However, the more often deviations occur without apparent consequences, the more complacent the system becomes.
Normalization of deviance can lead to Groupthink (1), which can be defined as “… a mode of thinking that persons engage in when they are deeply involved in a cohesive in-group, when concurrence-seeking becomes so dominant that it tends to override critical thinking or realistic appraisal of alternative courses of action.” Irving L. Janis, 1982
“There are eight symptoms of groupthink. Not all of them need to be present for the process to influence decisions:
1. Illusion of Invulnerability
2. Belief in Inherent Morality of the Group
3. Collective Rationalization
4. Out-Group Stereotypes
5. Self-Censorship: for example, a no-go item became a “recommendation”
6. Illusion of Unanimity: Silence is interpreted as agreement
7. Direct Pressure on Dissenters
8. Self-Appointed Mindguards: Subject matter experts excluded from decision briefs and meetings
There is a natural tendency to rationalize shortcuts under pressure. The lack of bad outcomes reinforces the rightness of trusting past success instead of objectively assessing risk.”
Moreover, when the outcomes are successful, this reinforces the natural tendency people have to focus on the results and to assume that the process that led to them was correct, even when there is evidence that it wasn’t.
With time, deviations lead to near-misses. But instead of seeing near-misses as alarm signals, people tend to ignore or misinterpret them; therefore, they are often not evaluated or, worst of all, are seen as a symptom of resilience. The big problem is that “if conditions change, even slightly, and luck does not intervene, the near-miss becomes an accident.”
“Accidents are initiated by the unexpected interaction of multiple small, often seemingly unimportant, human errors, deviations or violations, technological failures or bad business decisions, and are culminated by these latent conditions combining with enabling conditions. Near misses arise from the same preconditions but in the absence of enabling conditions they produce only small failures and thus go undetected or are ignored. Multiple near misses precede (and foreshadow) every disaster and business crisis, and most of the misses are ignored or misread.”
In the LAMIA accident discussed above, an enabling condition could have been an unexpected landing delay: an abnormal indication in the cockpit of another plane caused that plane to receive priority to land while LMI-2933 was sent into a holding pattern.
“Whether an enabling condition transforms a near-miss into disaster generally depends on chance, thus it makes little sense to try to predict or control all the possible enabling conditions. Instead, companies should focus on identifying and fixing latent conditions before circumstances allow them to produce an accident.”
Recognizing and learning from normalization of deviance and from near-misses takes more than just paying attention. “It actually runs contrary to human nature.”
“Research suggests seven strategies that can help organizations recognize near-misses and root out the latent errors behind them:
- Heed high pressure
- Learn from deviations
- Uncover root causes
- Demand accountability
- Consider worst-case scenarios
- Evaluate projects at every step
- Reward owning up
Two forces conspire to make learning from near misses difficult: Cognitive biases make them hard to see, and, even when they are visible, leaders tend not to grasp their significance. Thus, organizations often fail to expose and correct latent errors even when the cost of doing so is small—and so they miss opportunities for organizational improvement before disaster strikes. This tendency is itself a type of organizational failure—a failure to learn from “cheap” data. Surfacing near misses and correcting root causes is one of the soundest investments an organization can make.”
Intentional Noncompliance (3)
“Flight crews engage in intentional noncompliance — and sometimes self-justify this behaviour — out of a variety of motivations. “Maybe it’s a bad SOP. Maybe there are competing priorities. Maybe it just doesn’t work. It’s not functional. … It’s not that important. It doesn’t really matter. I might [take a] shortcut just because I’m trying to save time,” he said. “[Or pilots rationalize], ‘I just don’t like it. I like the way we did it before. I’ve got a better way of doing things. I think this is a bad idea. I’m just not going to do it.’” These acts occur with a perceived lack of consequences. The LOSA Collaborative’s latest data analysis suggests that acts of intentional noncompliance occur on between 40 and 60 percent of flights, or about half, on average.”
The categories and subcategories for both types of noncompliance with SOPs can be summarised as follows:
- Compliant: Unintentional errors
- Risky: Intentional act. Risk is underestimated or believed justified
- Reckless: Intentional disregard of significant risk
- Gross negligence
- Criminal act
Procedural Intentional Non-Compliance (PINC) (4)
“Procedural Intentional Non-Compliance (PINC) is one of the most frequent contributors to aviation accidents.
“PINCs are often the result of well-meaning pilots trying to do their job but willfully taking risks to achieve what should be the secondary goal, “completing the mission”… However, when your efforts to get there include fudging the rules, you do raise risk.
PINCs raise risks, and there are a lot of PINCs happening every day. But if you are in a position to do so, you can take a straightforward series of steps that are critical to preventing PINCs in your organization: (1) gain commitment, (2) budget and develop the resources and (3) ensure performance management.
…Everyone learns early in life about the two sets of rules to live by: the formal rules, written or stated, and the real rules, those the game is actually played by. When there is a significant difference between the two, the real rules become the standard. The solution is to establish and maintain a universal commitment to the formal rules, that is, flight operations manuals, procedures, etc. That emphasis must start at the very top of the organization.
If the Chief Executive Officer (CEO) of an organization is truly committed to safety, the safety program is set up to succeed. A safety-committed CEO is the chief enforcement officer. Anything less leaves the door open for informal rules and the resultant PINCs.
The commitment from top management makes it possible to expect appropriate behaviour from all those involved in the operation.
No PINCs are permitted. Period. With that understanding as a starting point, it becomes the manager’s responsibility to get the necessary resources into play.
Budget and develop resources
Aviation professionals tend to be highly service-oriented. They naturally push themselves and their equipment to get the job done, so it is critically important that their leaders and managers give them the appropriate resources. If they don’t have the appropriate resources, they will stretch the ones they have. The results of these heroic efforts populate accident investigation files.
The most important resources are enough people, time and equipment. So are the guidelines for using them: effective policies, standards and procedures. Those are critical in ensuring the quality and continuity of organizational and individual performance and the avoidance of PINCs.
Some aviation managers say vague policies and procedures create the flexibility they need to get the job done. Wrong! That approach sends a loud and clear message: safety is a variable; service is an absolute. That sets the stage for people to push. Lives are lost and hills are littered with aircraft wreckage as a result of crews pushing. Weak policies and procedures send the wrong message.
On the other hand, Standard Operating Procedures (SOPs) must also establish clear guidelines for the use of judgment, in a way that continues to assure safety while being flexible enough to adjust to unique service needs. Some aviation managers make a case for absolute SOPs that leave no wiggle room for judgment. They are the enforcers, unwilling to take responsibility for using common sense. Overly rigid guidelines prevent the use of judgment and decision making to get the job done safely.
If people are expected to make informed and collaborative decisions that are biased to the safe side, it is critical to have a comprehensive set of operational policies, standards and procedures. Once those are in place, it’s up to the team to perform… top to bottom.
Since safety starts at the top, operational managers must not only be the champions of proper performance, they must be the models. “Do as I say, not as I do” is not an option.
They must always catch people doing things right, and routinely and publicly praise people for taking the time and care to follow and implement proper procedures. By doing this they create a culture of co-responsibility. Co-responsibility is basic to effective crew resource management: each member is co-responsible for the rest of the team’s performance. This applies to ground and scheduling operations too.
From a managerial perspective, each PINC deserves unique attention and action. There are a few things to consider:
- A PINC is a deliberate violation of an established policy, standard or practice
- A PINC often raises risk
- A PINC perpetrator is likely to commit future PINCs
- If other members of the organization are aware of a PINC event and they see no negative consequences, they may correctly assume management doesn’t take SOPs seriously
Therefore, contrary to the old axiom “praise publicly and punish privately”, the consequences of a PINC should be emphasized: the floggings should be public. This approach provides positive public reinforcement of proper behaviours, as people act to avoid such public embarrassment. (From the blogger: please note and remember, the author is talking about violations and intentional non-compliance, not about error)
The University of Texas found that crews who intentionally deviate from standard operating procedures are almost twice as likely to commit additional errors with consequential results. PINCs are a disease. Unchecked, they will infect an entire operation. That infection can have extreme consequences. Sadly, the price of PINCs is paid by innocent people. The antidote to PINCs is discipline.”
- The Cost of Silence: Normalization of Deviance and Groupthink. Terry Wilcutt, Hal Bell. National Aeronautics and Space Administration (NASA). Senior Management ViTS Meeting, November 3, 2014.
- How to Avoid Catastrophe. Catherine H. Tinsley, Robin L. Dillon, Peter M. Madsen. Harvard Business Review, April 2011.
- Normalization of Deviance. Wayne Rosenkrans. Flight Safety Foundation’s AeroSafety World, June 2015.
- Discipline as Antidote. Peter V. Agur Jr. Flight Safety Foundation’s AeroSafety World, February 2007.
- The Organizational Influences behind the aviation accidents & incidents
- LaMía CP2933 accident in Colombia, preliminary report
Recognizing and learning from normalization of deviance and from near-misses runs contrary to human nature. Therefore, it requires constant effort, reinforcement and supervision, a zero-tolerance policy and well-defined consequences. Managerial commitment is indispensable.
But what happens when the pilot in command is also a manager and co-owner of the airline?
In that case, the Aviation Authority MUST prove more than ever that it deserves such a title.
By Laura Duque-Arrubla, a medical doctor with postgraduate studies in Aviation Medicine, Human Factors and Aviation Safety. In the aviation field since 1988, Human Factors instructor since 1994. Follow me on Facebook: Living Safely with Human Error, and Twitter: @dralaurita. Human Factors information almost every day.