When the error comes from an expert: The Limits of Expertise

“On Aug 3rd, 2016, an Emirates Airlines Boeing 773 was performing flight EK-521 from Thiruvananthapuram (India) to Dubai (United Arab Emirates) with 282 passengers and 18 crew. As the flight neared Dubai, the crew received the automatic terminal information service (ATIS) Information Zulu, which included a windshear warning for all runways.

The Aircraft was configured for landing with the flaps set to 30 and a selected approach speed of 152 knots (VREF + 5) indicated airspeed (IAS). The Aircraft was vectored for an area navigation (RNAV/GNSS) approach to runway 12L. Air traffic control cleared the flight to land, with the wind reported to be from 340 degrees at 11 knots, and to vacate the runway via taxiway Mike 9.

Emirates B773 crashed at Dubai on Aug 3rd, 2016. Photo from Malaysian Wings Forum page

During the approach, at 0836:00, with the autothrottle system in SPEED mode, as the Aircraft descended through a radio altitude (RA) of 1,100 feet, at 152 knots IAS, the wind direction started to change from a headwind component of 8 knots to a tailwind component. The autopilot was disengaged at approximately 920 feet RA and the approach continued with the autothrottle connected. As the Aircraft descended through 700 feet RA at 0836:22, and at 154 knots IAS, it was subjected to a tailwind component which gradually increased to a maximum of 16 knots.

At 0837:07, 159 knots IAS, 35 feet RA, the pilot flying (PF) started to flare the Aircraft. The autothrottle mode transitioned to IDLE and both thrust levers were moving towards the idle position. At 0837:12, 160 knots IAS, and 5 feet RA, five seconds before touchdown, the wind direction again started to change to a headwind.

As recorded by the Aircraft flight data recorder, the weight-on-wheels sensors indicated that the right main landing gear touched down at 0837:17, approximately 1,100 meters from the runway 12L threshold at 162 knots IAS, followed three seconds later by the left main landing gear. The nose landing gear remained in the air.

At 0837:19, the Aircraft runway awareness advisory system (RAAS) aural message “LONG LANDING, LONG LANDING” was annunciated.

At 0837:23, the Aircraft became airborne in an attempt to go around and was subjected to a headwind component until impact. At 0837:27, the flap lever was moved to the 20 position. Two seconds later the landing gear lever was selected to the UP position. Subsequently, the landing gear unlocked and began to retract.

At 0837:28, the air traffic control tower issued a clearance to continue straight ahead and climb to 4,000 feet. The clearance was read back correctly.

The Aircraft reached a maximum height of approximately 85 feet RA at 134 knots IAS, with the landing gear in transit to the retracted position. The Aircraft then began to sink back onto the runway. Both crewmembers recalled seeing the IAS decreasing and the Copilot called out “Check speed.” At 0837:35, three seconds before impact with the runway, both thrust levers were moved from the idle position to full forward. The autothrottle transitioned from IDLE to THRUST mode. Approximately one second later, a ground proximity warning system (GPWS) aural warning of “DON’T SINK, DON’T SINK” was annunciated.

One second before impact, both engines started to respond to the thrust lever movement showing an increase in related parameters.

At 0837:38, the Aircraft aft fuselage impacted the runway abeam the November 7 intersection at 125 knots, with a nose-up pitch angle of 9.5 degrees, and at a rate of descent of 900 feet per minute. This was followed by the impact of the engines on the runway. All three landing gear were still in transit to the retracted position.” (See: Going around with no thrust. Emirates B773 accident at Dubai on August 3rd, 2016, interim report)

Emirates B773 crashed at Dubai on Aug 3rd, 2016. Photo from Bureau of Aircraft Accidents Archives B3A
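
The excerpt describes a reported surface wind of 340 degrees at 11 knots and a headwind component that decayed into a tailwind of up to 16 knots during the final approach. For readers who want to see how along-runway and cross-runway components are derived from a reported wind, here is a minimal sketch in Python; the runway heading used is a rounded assumption for illustration only, not a figure taken from the report.

```python
import math

def wind_components(runway_heading_deg, wind_from_deg, wind_speed_kt):
    """Return (headwind, crosswind) components in knots.

    A positive headwind value opposes the landing direction;
    a negative value is a tailwind component."""
    angle = math.radians(wind_from_deg - runway_heading_deg)
    headwind = wind_speed_kt * math.cos(angle)
    crosswind = wind_speed_kt * math.sin(angle)
    return headwind, crosswind

# Tower-reported surface wind from the excerpt: from 340 degrees at 11 kt.
# The runway heading (about 120 degrees for a runway designated 12) is an
# assumed round figure for illustration, not a value quoted in the report.
hw, xw = wind_components(120, 340, 11)
print(f"along-runway component: {hw:+.1f} kt (negative = tailwind)")
print(f"cross-runway component: {xw:+.1f} kt")
```

A shift of only a few knots from headwind to tailwind, computed this way at successive moments of the approach, increases groundspeed and float in the flare, which is why the wind history matters so much in this sequence of events.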

*********

This post is the continuation of Multitasking in Complex Operations, a real danger

RETHINKING CREW ERROR (1)

“The vast majority of airline accidents are attributed to flight crew error. However, the great majority of commercial pilots has received strict training, is checked with punctual regularity, operates advanced safety technology and is highly experienced. They do their job according to a flight operations manual and checklists that prescribe carefully planned procedures for almost every conceivable situation, normal or abnormal, they will encounter. How can all this expertise co-exist with the pilot error that we are told is a factor in more than half of airline accidents?” (Darby, Rick & Setze, Patricia. “Factors in Vulnerability”. Aviation Safety World, May 2007, 53-54)

Why do very experienced professional pilots make errors?

“This well-known fact is widely misinterpreted, even by experts in aviation safety. Certainly, if pilots never made mistakes the accident rate would go down dramatically, but is it reasonable to expect pilots not to make mistakes? For both scientific and practical reasons, this expectation is not reasonable.”

“The accident rate for major airline operations in industrialized nations is already very low. This impressive record has been accomplished by developing very reliable systems, by thorough training, by requiring high levels of experience for captains, and by emphasizing safety. However, this accident rate can be further reduced substantially through a better understanding of the underlying causes of human error, better ways of managing human error, and a change in how we think about the causes of error.”

“It is all too easy to say, because crew errors led to an accident, that the crew was the problem: they should have been more careful or more skilful. This “blame and punish” mentality or even the more benign “blame and train” mentality does not support safety—in fact, it undermines safety by diverting attention from the underlying causes.”

“Admittedly, in general aviation many accidents do show evidence of poor judgment or of marginal skill. This is much less common in airline operations because of the high standards that are set for this type of operation. Nonetheless, much of this discussion of airline operations also has implications for general aviation.”

“There are two common fallacies about pilot error:

  1. Fallacy 1: Error can be eliminated if pilots are sufficiently vigilant, conscientious, and proficient.

The truth is that vigilant, conscientious pilots routinely make mistakes, even in tasks at which they are highly skilled. Helmreich and his colleagues have found that on average airline crews make about two errors per flight leg and even more on challenging flights (Helmreich, Klinect, & Wilhelm, 1999; Klinect, Wilhelm, & Helmreich, 1999). And this is, if anything, an undercount because of the difficulty in observing all errors.

  2. Fallacy 2: If an accident crew made errors in tasks that pilots routinely handle without difficulty, that accident crew was in some way deficient—either they lacked skill, or had a bad attitude, or just did not try hard enough.

But the truth is that the most skilful, conscientious expert in the world can perform a procedure perfectly a hundred times in a row and then do something wrong on the 101st trial. This is true in every field of expertise—medicine, music, and mountain climbing just as much as aviation (Reason, 1990).”
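
The point about the 101st trial can be made concrete with a simple, purely illustrative calculation. The per-task error probability below is an assumed number, not a figure from the Helmreich or Reason studies cited above; the point is only that even a very small per-trial error rate makes an eventual slip very likely over enough repetitions.

```python
# Probability of at least one error in n independent repetitions of a task,
# given a small per-trial error probability p: 1 - (1 - p) ** n.
# p is an assumed illustrative value, not a measured pilot error rate.
p = 0.001  # one slip per thousand executions of a highly practiced task

for n in (10, 100, 1000):
    p_at_least_one = 1 - (1 - p) ** n
    print(f"{n:>4} repetitions -> P(at least one error) = {p_at_least_one:.1%}")
```

With these assumed numbers, the chance of at least one error grows from about 1% over 10 repetitions to roughly 63% over 1,000, which is why "perform it perfectly every time" is not a realistic standard even for highly skilled experts.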

“Something called “hindsight bias” must also be highlighted. After an accident, everyone knows the outcome of the flight. The thorough investigation by the accident investigation authorities reveals many details about what happened leading up to the accident. Armed with this information, it is easy for everybody to say the crew should have handled things differently. But the crew in that airplane did not know the outcome. They may not have known all of the details later revealed, and they certainly did not realize how the factors were combining to create the conditions for an accident.”

“Experts do what seems reasonable, given what they know at the moment and the limits of human information processing. Errors are not de facto evidence of lack of skill or lack of conscientiousness.

In some accidents, crews may not have had access to adequate information to assess the situation and make prudent decisions on how to continue. Many bits and pieces of information may be available to the crew, who weigh the information as well as they can. But the question remains whether crews always have enough information, in time, to decide and to be absolutely certain that the decision is correct.”

“It is ironic that in some wind shear accidents the crew was faulted for continuing an approach even though an aircraft landed without mishap one minute ahead of the accident aircraft. Both crews had the same information, both made the same decision, but for one crew luck ran the wrong way. We do not like to admit that any element of luck still pertains to airline safety—and in fact, the element of chance in airline operations has been reduced enormously since the 1930s, as described by Ernest Gann in Fate is the Hunter (1984). But there are still a few accidents in which we should admit that the crew made decisions consistent with typical airline practice and still met disaster because risk cannot be completely eliminated.”

“Tension and tradeoffs between safety and mission completion are inherent in any type of real-world operation. Modern airlines have done an extraordinary job of reducing risk while maintaining a high level of performance. Nevertheless, some small degree of risk will always exist. The degree of risk that is acceptable should be a matter of explicit public discussion, which should guide policy. What we must not do is tell the public they can have zero risk and perfect performance—and then say when a rare accident occurs: “it was the crew’s fault”, neglecting to mention that the accident crew did what many other crews had done before.”

“If the investigation of an accident or incident reveals explicit evidence of deliberate misconduct the pilot obviously should be held accountable. If the investigation reveals a lack of competence the pilot obviously should not fly again unless retrained to competency. But with these rare exceptions, identifying “pilot error” as the probable cause of accidents is dangerous because it encourages the aviation community and the public to think something was wrong with the crew and that the problem is solved because the crew is dead or can be fired (or retrained in less serious cases).”

“Rather than labeling a probable cause, it is more useful to identify the contributing factors, including the inherent human vulnerability to characteristic forms of error, to characterize the interplay of those factors, and to suggest ways errors can be prevented from escalating into accidents. If probable cause must be retained, it would in most cases be better to blame the inherent vulnerability of conscientious experts to make errors occasionally rather than to blame crews for making errors.”

“To improve aviation safety we must stop thinking of pilot errors as the prime cause of accidents, but rather think of errors as the consequence of many factors that combine to create the conditions for accidents. It is easy in hindsight to identify ways any given accident could have been prevented, but that is of limited value because the combination of conditions leading to accidents has a large random component. The best way to reduce the accident rate is to develop ways to reduce vulnerability to error and to manage errors when they do occur.”

Emirates B773 crashed at Dubai on Aug 3rd, 2016. Aerial overview of the accident site. Photo from The Aviation Herald

ERROR SITUATIONS (2)

“The naïve view is that pilots who make an error are somehow less expert than others. That view is wrong. The pilot who makes an error – as seen in hindsight – typically does not lack skill, vigilance or conscientiousness. He or she is behaving expertly, in a situation that may involve misinformation, lack of information, ambiguity, rare weather phenomena or a range of other stressors, in a possibly unique combination.”

“No one thing “causes” accidents. Accidents are produced by the confluence of multiple events, task demands, actions taken or not taken, and environmental factors. Each accident has unique surface features and combinations of factors.”

Human cognitive processes are by their nature subject to failures of attention, memory and decision-making. At the same time, human cognition, despite all its potential vulnerability to error, is essential for safe operations.

“Computers have extremely limited capability for dealing with unexpected and novel situations, for interpreting ambiguous and sometimes conflicting information, and for making value judgments in the face of competing goals. Technology helps make up for the limitations of human brainpower, but by the same token, humans are needed to counteract the limitations of aviation technology.”

“Airline crews routinely deal with equipment displays imperfectly matched to human information-processing characteristics, respond to system failures and decide how to deal with threats ranging from unexpected weather conditions to passenger medical emergencies. Crews are able to manage the vast majority of these occasions so skillfully that what could have become a disaster is no more than a minor perturbation in the flow of high-volume operations.”

“But on the rare occasions when crews fail to manage these situations, it is detrimental to the cause of aviation safety to assume that the failure stems from a deficiency of the crews. Rather, these failures occur because crews are expected to perform tasks at which perfect reliability is not possible for either humans or machines. If we insist on thinking of accidents in terms of deficiency, that deficiency must be attributed to the overall system in which crews operate.”

“Six overlapping clusters of error situations have been described:

  • Inadvertent slips and oversights while performing highly practiced tasks under normal conditions
  • Inadvertent slips and oversights while performing highly practiced tasks under challenging conditions
  • Inadequate execution of non-normal procedures under challenging conditions
  • Inadequate response to rare situations for which pilots are not trained
  • Judgment in ambiguous situations
  • Deviation from explicit guidance or SOP

However, error is NOT just part of doing business; it must still be reduced, and to reduce it, the factors associated with it must be understood as well as possible.”

“Uncovering the causes of flight crew error is one of the investigators’ biggest challenges because human performance, including that of expert pilots, is driven by the confluence of many factors, not all of which are observable in the aftermath of an accident. Although it is often impossible to determine with certainty why accident crewmembers did what they did, it is possible to understand the types of error to which pilots are vulnerable and to identify the cognitive, task and organizational factors that shape that vulnerability”. (Carl W. Vogt, 2007, in his Foreword to the book The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Burlington, VT: Ashgate.)

“Studies have shown the most common cross-cutting factors contributing to crew errors (3):

  • Situations requiring rapid response
  • Challenges of managing concurrent tasks
  • Equipment failure and design flaws
  • Misleading or missing cues normally present
  • Plan continuation bias
  • Stress
  • Shortcomings in training and/or guidance
  • Social/organizational issues”

Emirates B773 crashed at Dubai on Aug 3rd, 2016. Photo from Bureau of Aircraft Accidents Archives B3A

EXPERIENCED PILOTS’ ERRORS (4)

“Studies show that almost all experienced pilots, operating in the same environment in which the accident crews were operating and knowing only what the crews knew at each moment of the flight, would be vulnerable to making similar decisions and taking similar actions.”

“The skilled performance of experts is driven by the interaction of moment-to-moment task demands, availability of information and social/organizational factors with the inherent characteristics and limitations of human cognitive processes. Whether a particular crew in a given situation makes errors depends as much, or more, on this somewhat random interaction of factors as it does on the individual characteristics of the pilots.”

“The two most common themes seen in aviation accidents are Continuation Bias, a deep-rooted tendency of crews to continue their original plan of action even when changing circumstances require a new plan, and situations that lead to Snowballing Workload, a workload that builds on itself and increases at an accelerating rate.”

Continuation bias

“Too often crew errors are attributed to complacency or intentional deviations from standard procedures, but these are labels, not explanations. To understand why experienced pilots sometimes continue ill-advised actions, it is important to understand the insidious nature of plan continuation bias, which appears to underlie what pilots call “press-on-itis”. This bias results from the interaction of three major components: social/organizational influences, the inherent characteristics and limitations of human cognitive processes, and incomplete or ambiguous information.”

“Safety is the highest priority in all commercial flight operations, but there is an inevitable trade-off between safety and the competing goals of schedule reliability and cost-effectiveness. To ensure conservative margins of safety, airlines establish written guidelines and standard procedures for most aspects of operations.”

“Yet considerable evidence exists that the norms of actual flight operations often deviate considerably from these ideals. When standard operating procedures are phrased not as requirements but as strong suggestions, which may appear to tacitly approve of bending the rules, pilots may, perhaps without realizing it, place too much importance on costs and scheduling.”

“Also, pilots may not understand why guidance should be conservative; that is, they may not recognize that the cognitive demands of recovering a plane from an unstabilized approach severely impair their ability to assess whether the approach will work out. For all these reasons, many pilots, not only the few who have accidents, may deviate from procedures that the industry has set up to build extra safety into flight operations. Most of the time, the result of these deviations is a successful landing, which further reinforces the deviant norms.”

“As pilots amass experience in successfully deviating from procedures they unconsciously recalibrate their assessment of risk toward taking greater chances.”

“Another inherent and powerful cognitive bias in judgment and decision making is expectation bias: when someone expects one situation, she or he is less likely to notice cues indicating that the situation is not quite what it seems. Human beings become less sensitive to cues that reality is deviating from their mental model of the situation.”

“Expectation bias is worsened when crews are required to integrate new information that arrives piecemeal over time in incomplete, sometimes ambiguous, fragments. Human working memory has extremely limited capacity to hold individual chunks of information, and each piece of information decays rapidly from working memory. Further, the cognitive effort required to interpret and integrate this information can reach the limits of human capacity to process information under the competing workload of flying an approach.”

Snowballing Workload

“Errors that are inconsequential in themselves have a way of increasing crews’ vulnerability to further errors and of combining with happenstance events – with fatal results. Abnormal situations can produce acute stress, and acute stress narrows the field of attention (tunnel vision) and reduces working memory capacity. The combination of a high workload with many other factors, such as stress and/or fatigue, can severely undermine cognitive performance.”

“A particularly insidious manifestation of snowballing workload is that it pushes crews into a reactive, rather than proactive, stance. Overloaded crews often abandon efforts to think ahead of the situation strategically, instead simply responding to events as they occur, without considering whether that response is going to work out.”

Implications and countermeasures

“Labelling crew errors simply as “failure to follow procedures” misses the essence of the problem. All experts, no matter how conscientious and skilled, are vulnerable to inadvertent errors. The basis of this vulnerability lies in the interaction of task demands, limited availability of information, sometimes conflicting organizational goals and random events with the inherent characteristics and limitations of human cognitive processes. Even actions that are not inadvertent are the consequence of the same interaction.”

“Almost all airline accidents are system accidents. Human reliability in the system can be improved if pilots, instructors, check pilots, managers and the designers of aircraft equipment and procedures understand the nature of vulnerability to error.”

“For example, monitoring and checklists are essential defenses, but in snowballing workload situations, when these defenses are most needed, they are most likely to be shed in favor of flying the airplane, managing systems and communicating.”

“Monitoring can be made more reliable by designing procedures that accommodate the workload and by training and checking monitoring as an essential task rather than a secondary one.”

“Checklist use can be improved by explaining the cognitive reasons why effectiveness declines with extensive repetition, and by showing how this can be countered by slowing the pace of execution to be more deliberate and by pointing to or touching items being checked.”

“Inevitable variability in skilled performance must be accepted. Just because skilled pilots normally perform a task without difficulty does not mean they should be expected to perform that task without error 100% of the time.”

“Plan continuation bias is powerful, although it can be countered once acknowledged. One countermeasure is to analyze situations explicitly, stating the nature of the threat, the observable indications of the threat and the initial plan for dealing with it.”

“Questions such as “What if our assumptions are wrong? How will we know? Will we know in time?” are the basis for forming realistic backup plans and implementing them in time, before snowballing workload limits the pilots’ ability to think ahead.”

“Airlines should periodically review normal and non-normal procedures, looking for design features that could induce error. Examples of correctable design flaws are checklists conducted during periods of high interruption, critical items that are permitted to “float” in time, and actions that require the monitoring pilot to be head-down during critical periods, such as taxiing near runway intersections.”

“Operators should carefully examine whether they are unintentionally giving pilots mixed messages about competing goals such as SOP adherence versus on-time performance and fuel costs. If a company is serious about SOP adherence, it should publish, train and check those criteria as hard-and-fast rules rather than as guidelines. Further, it is crucial to collect data about deviations from those criteria (LOSA & FOQA) and to look for organizational factors that tolerate or even encourage those deviations.”

“These are some of the ways to increase human reliability on the flight deck, making errors less likely and helping the system recover from the errors that inevitably occur. This is hard work, but it is the way to prevent accidents. In comparison, blaming flight crews for making errors is easy but ultimately ineffective.”

To be continued in Pilot performance in emergencies: why can be so easy, even for experts, to fail

REFERENCES

The previous paragraphs were excerpted from:

  1. Dismukes, R. K. (2001). Rethinking crew error: Overview of a panel discussion. In R. Jensen (Ed.), Proceedings of the 11th International Symposium on Aviation Psychology. Columbus, OH: Ohio State University.
  2. Darby, Rick & Setze, Patricia. “Factors in Vulnerability”, a book review of The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents, by Dismukes, R. K., Berman, B. A., & Loukopoulos, L. D. (2007), Burlington, VT: Ashgate. Aviation Safety World, May 2007, 53-54.
  3. Dismukes, R. K., Berman, B., & Loukopoulos, L. D. (2006, April). The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Presented at the 2006 Crew Resource Management Human Factors Conference, Denver, Colorado.
  4. Berman, B. A. & Dismukes, R. K. (2006) Pressing the approach: A NASA study of 19 recent accidents yields a new perspective on pilot error, Aviation Safety World, December 2006, 28-33.
  5. United Arab Emirates, General Civil Aviation Authority, Air Accident Investigation Sector. Accident Preliminary Report: Runway Impact During Attempted Go-Around. Dubai International Airport. 3 August 2016. Boeing 777-300 operator: Emirates. AAIS Case No: AIFN/0008/2016

FURTHER READING

  1. Pilot performance in emergencies: why can be so easy, even for experts, to fail
  2. Multitasking in Complex Operations, a real danger
  3. Speaking of going around
  4. Going around with all engines operating
  5. Normalization of Deviance: when non-compliance becomes the “new normal”
  6. The Organizational Influences behind the aviation accidents & incidents

**********************

By Laura Duque-Arrubla, a medical doctor with postgraduate studies in Aviation Medicine, Human Factors and Aviation Safety. In the aviation field since 1988, Human Factors instructor since 1994. Follow me on Facebook at Living Safely with Human Error and on Twitter @dralaurita. Human Factors information almost every day.

5 thoughts on “When the error comes from an expert: The Limits of Expertise”

  1. The only question that needs to be highlighted is… why make the decision to go around with 3,000 meters of runway ahead?

    Answer: the Airline (without a union to protect workers) has fired, and continues to fire, people for long landings. Put that into your human factors equation, and you find the dog is wagged by the tail all day long at EK. Not an environment that leads to the safest decisions under any circumstances. They regularly use QAR data to impugn individuals. Not smart!


    1. That is one of many links in the error chain, but it doesn’t explain why they tried to go around with no thrust. This is not an article about the Emirates B777 accident, and it doesn’t intend to analyze or discuss its causes; I only used this accident as a recent example of how easy it is, even for experts, to fail. Experts make mistakes not because they are stupid or negligent or unworthy, as people tend to think and say, but because of the many human cognitive limitations. That is the intention of this article: to explain some of these human cognitive limitations.
      On the other hand, as you know, no one thing “causes” accidents but the concatenation of multiple factors, where each may be necessary but none alone sufficient. Organizational factors are among these factors, and of huge importance. (There are at least two articles on this blog about that 🙂 please feel free to read them.) Thanks a lot for your comment.


  2. What comes to my mind about errors made by these professionals is the fact that pilots have been trained to go around when not stabilized on the approach, regardless of the altitude and speed.
    When we look deeply into the Asiana accident in San Francisco and this accident in Dubai, we see that both crews tried to execute a go-around very close to the ground without enough speed. As they had a very high angle of attack at that moment, the aircraft crashed against the ground with high thrust on their engines.
    Something needs to be done regarding instruction about going around very close to the ground and at low speed.


    1. It’s very probable there will be some recommendations about flight training, scanning procedures, workload management during the go-around, especially by the Pilot Monitoring, and airplane state awareness by the flight crew…
      Recently, there have been many accidents while going around; it is a maneuver that is not risk-free. Last January 16th, there was another one at Bishkek (Kyrgyzstan).
      But, again, the objective of the article is not to discuss the Emirates B777 accident; it is just an example 🙂 The objective of the article is to acknowledge how easy it is for experts to fail. Thanks a lot for your comment, Javier.

