Why Would a Flight Crew Make a Wrong Decision?


Editor’s Note: CAT magazine presents Guest Commentary on important issues facing the community. The opinions expressed are the author’s own.

This commentary is offered by Naveed Kapadia, whose career includes research and development of flight crew training to enhance safety for a major European airline group, business development for Airways Aviation, and service as an easyJet flight officer. He earned a master’s degree in air transport management from City University of London and was an MSc bursary winner of the Royal Academy of Engineering. He has also volunteered as an ambulance crew member during the current UK health crisis.

He poses the challenge: do we need a global refresh on how we train crews for decision-making?

We are used to scrutinising accidents and serious incidents, but we almost never investigate with the same tenacity and vigour when things go right. Why do we deprive ourselves of equally important learning opportunities when the flight crew makes the right call? We need to encourage stakeholders to proactively look at what went well and to celebrate success with similar importance. Enhancing insight and experience is key to minimising errors.

Given the recent focus on preventing skill fade during the pandemic shutdown and the potential challenges of ‘restarting operations’, perhaps we need to revisit and reassess our current thinking on how we prepare flight crews to make effective decisions.

No flight crew makes a wrong decision deliberately. People can be oblivious to the things that affect their actions, and rarely recognise what really shapes their decision-making. If perception is about being aware of what surrounds us, then judgement is about using those perceptions to reach conclusions. People reach different conclusions and make different decisions because they perceive things differently. Therefore, we can consider that all intentions are positive until proven otherwise.

In 1972, 2,365 fatalities were recorded, versus just 50 in 2017 (Figure 1). There was a worrying upsurge in 2018, when 586 fatalities were recorded across 11 fatal accidents, according to the ICAO Safety Report (2019 edition). A concerning point is that decision-making may have been, directly or indirectly, one of the contributory factors in these events.

Figure 1. Number of Fatalities Involving Large Aeroplane Passenger and Cargo Operations Worldwide Combined, 1970–2018. (Source: EASA Annual Safety Review 2019)

Looking further into the detail (Figure 2) for the years between 2008 and 2018, there appears to be a huge delta between the number of accidents and serious incidents within European member states and the rest of the world. Why have European states seemingly managed to curtail the trend better than other regions? And will these statistics be affected by the current economic crisis and the added financial pressures the airlines are facing?

Figure 2. Number of Fatal Accidents and Fatalities Involving Large Aeroplane Passenger and Cargo Operations, EASA MS and Rest of the World, 2008–2018. (Source: EASA Annual Safety Review 2019)

The Pressures of Economics

The coronavirus global health crisis has triggered the most severe economic recession in nearly a century. The International Air Transport Association (IATA) financial outlook expects airlines to lose $84.3 billion in 2020, a net profit margin of −20.1%. Revenues will fall 50% to $419 billion in 2020. In 2021, losses are expected to be cut to $15.8 billion as revenues rise to $598 billion.

“Financially, 2020 will go down as the worst year in the history of aviation. On average, every day of this year will add $230 million to industry losses. In total that’s a loss of $84.3 billion. It means that – based on an estimate of 2.2 billion passengers this year – airlines will lose $37.54 per passenger. That’s why government financial relief was and remains crucial as airlines burn through cash”, stated Alexandre de Juniac, IATA’s Director General and CEO.
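The quoted figures hang together, give or take rounding; as a quick check (noting that 2020 had 366 days, and that the per-passenger figure implies IATA’s unrounded passenger estimate was nearer 2.25 billion than the rounded 2.2 billion):

\[
\frac{\$84.3\,\text{bn}}{366\ \text{days}} \approx \$230\,\text{m per day},
\qquad
\frac{\$84.3\,\text{bn}}{\$37.54\ \text{per passenger}} \approx 2.25\,\text{bn passengers}
\]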

Figure 3. Source: OECD (2020), OECD Economic Outlook No. 107 (Edition 2020/1)

The Organisation for Economic Co-operation and Development (OECD) economic outlook (Figure 3) presents two possible scenarios, one where the virus continues to recede and remains under control and another where a second wave of rapid contagion erupts later in 2020, slowing the recovery even further.

Operational efficiency and cost reduction will be high on the agenda for most global aviation operations. Concerns with skill fade, recency and safety compliance will exert enormous pressure on an organisation’s resources, with the temptation to focus only on the bare minimum and perhaps cut corners. The toll on the emotional and physical resilience of the entire ecosystem will be formidable. Tight budgetary controls against the backdrop of a dire economic recovery may drive behaviours that need to be closely monitored and managed.

Experience and Expertise

Gary Klein, PhD, a cognitive psychologist who has spent years observing how people make decisions, believes that Recognition-Primed Decision Making (RPDM) is how we are wired. ‘Naturalistic’ RPDM seeks to explain how good decisions are made under extreme time pressure and huge amounts of uncertainty.

Dr. Klein (Senior Scientist at MacroCognition, Washington DC) and his team observed firefighters and realised that they followed a unique way of deciding their next course of action, rather than a ‘rational’ decision-making process. The expert firefighters gained their expertise through experience, which they used for pattern matching once they knew what goals were to be achieved. They did not go through an evaluation process to assess multiple options but intuitively knew what to do. The situation generated the cues that helped them recognise the patterns that led to actions. Those actions were vetted by mental simulation using mental models. If there were anomalies, the firefighters went back to reassess the situation for what they might have missed, and reworked the entire loop until it matched the mental model well enough for the actions to be committed. The firefighters invariably worked together to form a common understanding, which also played to their strengths in times of disaster.
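The loop Klein describes can be summarised in a few lines of code. The sketch below is a toy illustration only, not a validated cognitive model, and every name in it (Pattern, Situation, the firefighting cues) is hypothetical:

```python
"""Minimal sketch of the Recognition-Primed Decision (RPD) loop as
summarised above: size up the situation, pattern-match against an
experience bank, vet the first workable action by mental simulation,
and reassess on anomaly."""
from dataclasses import dataclass

@dataclass
class Pattern:
    cues: frozenset      # cues that identify this typical situation
    action: str          # what the expert would intuitively do
    expected: frozenset  # what the expert expects to see as a result

@dataclass
class Situation:
    cues: set            # what has been noticed so far
    facts: set           # everything potentially observable

    def observe(self) -> frozenset:
        return frozenset(self.cues)

    def reassess(self) -> None:
        # Going back to look again surfaces more of what is really there.
        self.cues |= self.facts

def rpd_decide(situation: Situation, experience: list[Pattern],
               max_passes: int = 5) -> str | None:
    for _ in range(max_passes):
        cues = situation.observe()
        # Pattern matching: take the first pattern that fits, rather
        # than ranking every alternative ('satisficing', not optimising).
        match = next((p for p in experience if p.cues <= cues), None)
        if match is None:
            situation.reassess()          # no match: what did we miss?
            continue
        # Mental simulation: would the expected outcome hold up?
        anomalies = match.expected - (cues | situation.facts)
        if anomalies:
            situation.reassess()          # anomaly: rework the loop
            continue
        return match.action               # good enough: commit
    return None                           # fall back to deliberate analysis

# Usage: a scene whose decisive cue only appears on reassessment.
experience = [Pattern(frozenset({"smoke", "low_heat"}),
                      "ventilate_and_attack",
                      frozenset({"fire_contained"}))]
scene = Situation(cues={"smoke"},
                  facts={"smoke", "low_heat", "fire_contained"})
print(rpd_decide(scene, experience))      # -> ventilate_and_attack
```

Note how the loop commits to the first workable option instead of comparing alternatives; that is exactly where it departs from the conventional analytic model discussed below.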

In a larger team dynamic, more complexities are involved. Practising together to form a basis of common ground, and so avoid the breakdown of assumptions, is essential to establishing a sound ‘experience bank’ for future pattern matching. Unlike firefighters, a commercial airline multi-crew flight deck very rarely operates with the same crew pairing, and it is largely impractical to cultivate a collective mindset primed to handle non-normal events in a similar manner. With differing levels of expertise, it is no surprise that, more often than not, there is a higher degree of miscommunication and a lack of coordination when managing the startle of non-normal scenarios.

Furthermore, studies have highlighted that people do not consciously choose whether or not to use RPDM. The choice comes in how much confidence people have in their intuitive decisions, and how much time and energy they put into mentally simulating courses of action.

“Flight crews need to be able to adapt skilfully by having intuitive skills.” Image credit: NASA.

So how do people gain confidence? The consensus among researchers is that confidence develops via experience. During pilot certification processes, the required technical knowledge is assessed and assumed to have been developed in academic and flight training programmes. But the question remains: how do we build enough relevant experience to facilitate intuitive decisions?

There is a danger that hoping to achieve expertise via training alone may slow the development of skills, depriving the novice of the critical lesson of differentiating a poor outcome from a poor decision. Poor outcomes are different from poor decisions.

Researchers have warned against the temptation to train people to think ‘like experts’ and instead recommend focusing on developing and building an experience bank. Novices often look for typical outcomes, whereas experts are aware of the good and the great. Beginners keep coming up against events that were not anticipated and are surprised when expectations are violated; experts, on the other hand, are surprised by the absence of an expected event. Furthermore, from the general cognitive point of view, the decision-making process is seen as selecting choices or courses of action from alternatives, and is understood to include gathering information, estimating likelihood, deliberating, selecting and reasoning.

The conventional decision-making process involves logical thinking, probabilities and statistical methods. In a ‘naturalistic’ setting, however, it is not analytical: people focus on quickly sizing up the situation and on simulating how the case might play out, drawing on personal experience and anecdotal storytelling.
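To make the contrast concrete, the conventional analytic model can be sketched as an expected-utility comparison across alternatives. The diversion scenario and all of its numbers below are invented purely for illustration:

```python
# The 'conventional' analytic approach: score every alternative by
# probability-weighted utility and pick the best. Contrast this with
# the RPD sketch earlier, which commits to the first workable option.

def expected_utility(outcomes: dict[str, float],
                     probs: dict[str, float]) -> float:
    """Probability-weighted sum of the utility of each possible outcome."""
    return sum(probs[o] * utility for o, utility in outcomes.items())

# Hypothetical diversion decision, utilities on an arbitrary scale.
options = {
    "continue_to_destination": {"weather_clears": 10.0, "weather_closes_in": -20.0},
    "divert_to_alternate":     {"weather_clears": 4.0,  "weather_closes_in": 2.0},
}
probs = {"weather_clears": 0.6, "weather_closes_in": 0.4}

for name, outcomes in options.items():
    print(f"{name}: EU = {expected_utility(outcomes, probs):+.1f}")
best = max(options, key=lambda name: expected_utility(options[name], probs))
print("analytic choice:", best)  # -> divert_to_alternate (+3.2 vs -2.0)
```

The analytic model needs explicit probabilities and a full set of alternatives before it can choose, which is precisely what a dynamic, time-pressured flight deck often cannot supply.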

In circumstances where the situation is dynamic, with less chance of feedback and less opportunity to match patterns due to a lack of frequency and fewer chances for trial and error, experience does not necessarily translate into expertise. Additionally, people with significant expertise can see the world differently; they notice problems more readily and know what to do to fix them. But this can quickly lead to viewing problems in a stereotypical manner, and explaining inconsistencies away may miss signs of mayhem to come. A fresh perspective is therefore always needed, and a multi-crew flight deck can play a collaborative role in sense-checking understanding and positively challenging situational awareness for a safer outcome.

Flight crews should plan to adapt and expect to improvise rather than trying to figure everything out in advance. Adaptation in today’s complex operational setting remains critical. Flight crews need to be able to adapt skilfully by having intuitive skills. Here again, a relevant bank of experience is helpful for intuitive decision-making. Plan for adaptation, search for early signs of problems, expect to revise goals, and resist the urge to continue down a certain path due to emotional attachment. It is never too late to make the right decision.

The Continuous Phenomenon: Culture

While decisions and behaviour may be the vital elements of a decision-making process, cultural emphases and preferences in life also influence how people believe, think and behave. Culture can act as a filter, helping people to simplify incoming information and make sense of their surroundings. Other factors, such as knowledge, experience, mood state, environment, fatigue, stress and social factors, also contribute to how information is processed while making decisions. Cultural diversity is not a condition that assumes mutual understanding; it is a continuous phenomenon that occurs when people from different cultures, influenced by their core values, think, communicate and behave. Cultural competence is the ability to appreciate different cultures so that we can understand, communicate and quickly align our thought processes towards common goals. It supports effective communication and ensures that the common understanding of the challenge at hand does not erode. The absence of cultural competence is therefore a potential threat that must be tackled head-on, and organisations should consider the acquisition of cultural competence in their decision-making training.

Past as Prologue?

With these theoretical concepts and ideas in mind, consider a selection of past accidents and serious incidents to ascertain where we may be heading next:

19 August 1980 - Saudi Arabian Airlines Flight 163, a Lockheed L-1011 TriStar, departed Riyadh en route to Jeddah. The pilots returned to Riyadh after an uncontrolled fire developed in the C3 cargo compartment of the aircraft. The flight landed, taxied clear of the runway and came to a stop on an adjacent taxiway, but no decision was made to evacuate the aircraft. No common communication frequency was established between the flight crew, the fire and rescue teams and the tower, further delaying the decision on whether or not to evacuate. While parked on the taxiway, the aircraft was destroyed by the fire, killing all 301 occupants. The final accident report pointed to various contributory factors, including several questionable decisions by the Captain and inadequate training of the fire and rescue teams. Among other findings, a review of the SOPs and enhanced flight crew CRM training were recommended to improve standards and safety of flight. Source: Dreifus, E., 1980. Lessons Learned from Civil Aviation Accidents. Lessonslearned.faa.gov. Available at: https://lessonslearned.faa.gov/Saudi163/AircraftAccidentReportSAA.pdf

4 February 2015 - A TransAsia Airways ATR 72 experienced a loss of control during the initial climb and crashed into the Keelung River near Songshan Airport, Taipei, Taiwan. The final accident report concluded that the crew did not respond to the stall warnings in a timely and effective manner, having shut down the operative engine in error. The aircraft stalled and continued to descend during the attempted engine restart; the remaining altitude and time to impact were not enough to restart the engine and recover the aircraft. Improved regulatory oversight of training, including CRM, by the Civil Aeronautics Administration was recommended, as there were various red flags along the way that should have been picked up and addressed. Source: Taiwan Transport Safety Board

9 June 2019 - A British Airways Boeing 747 was en route from London Heathrow to Phoenix, US. On reaching top of climb, the aircraft experienced unreliable airspeed indications, resulting in overspeed warnings and activation of the stall warning system. In recovering, the crew carried out the unreliable airspeed procedure but also carried out the stall warning procedure, which was not required. The problem was believed to have been caused by a fault with the right Air Data Computer (ADC), although this could not be replicated because the fault codes were lost following the ADC’s arrival in the avionics workshop. The Quick Reference Handbook procedure applicable at the time noted that overspeed warnings and ‘AIRSPEED LOW’ alerts ‘may occur erroneously or simultaneously’. Stall warnings were not mentioned specifically, as the aircraft manufacturer considered that crews would understand these to be included. The crew involved, however, decided to react to the stall warning when it occurred: the ‘AIRSPEED LOW’ alert is a specific warning, and since the stall warning was not mentioned separately in the procedural note, the crew considered that operation of the stick shaker should not be treated as erroneous. This was confirmed to them when the stick shaker ceased as they reduced pitch, as they would expect after a genuine stall warning. Eventually the QRH datums were attained and there was no further stick shaker activation. This incident highlights the importance of clear, unambiguous information being readily available to crews at times of high workload when dealing with potentially critical incidents. As a result, the aircraft manufacturer is providing additional information as part of its published unreliable airspeed procedure, and the operator is reviewing its maintenance procedures following the accidental erasure of fault codes on the right ADC during the post-incident inspection. Source: UK AAIB Field Investigation (AAIB Bulletin: 5/2020 G-BNLN EW/C2019/06/03)

4 November 2019 - A Jetstar Airways A320-200 was operating from Sydney, New South Wales, to Sunshine Coast, Queensland, Australia. On final approach to runway 18, below 900 ft, a proximity event occurred with an Aero Commander 500 departing Sunshine Coast Airport from runway 36. As this was outside Sunshine Coast ATC tower operating hours, the airspace was Class G (uncontrolled) and pilots of aircraft in the vicinity of the airport were communicating on the common traffic advisory frequency (CTAF). The two aircraft paths converged; there was no resolution advisory or aural alert, as all traffic was marked as traffic advisory only. Once airborne and aware of the conflict, the Aero Commander pilot conducted a right turn, increasing the separation and averting a potential collision. The two aircraft passed each other with a recorded separation of 0.7 nm horizontally and 265 ft vertically. Important radio broadcasts on the CTAF regarding each other’s positions and intentions were not heard by the Jetstar flight crew or the Aero Commander pilot, leading both to continue using reciprocal runways. Complacency and loss of situational awareness of surrounding traffic in the vicinity of uncontrolled airspace have been highlighted as a broad safety concern by the Australian Transport Safety Bureau, according to which insufficient communication between pilots operating in the same area is the most common cause of safety incidents near non-controlled aerodromes. Source: Australian Transport Safety Bureau (ATSB Transport Safety Report Aviation Occurrence Investigation AO-2019-062 Final – 4 June 2020)

Connecting the Dots

Many organisations enforce procedures after an error or violation has been made, rather than trying to understand the reason for the error and looking for proactive ways to prevent it from ever taking place again. Unless we improve our insights into effective decision-making, we are more than likely to keep having such events.

Researchers have highlighted human error as a symptom of trouble deeper in the system, and have warned against the idea that any human decision can be called into question once the event has occurred. We have all seen how easy it is to connect the dots looking backwards.

Fuelled by technological advances, decision-making is further complicated by over-reliance on, and incorrect use of, automation. On the one hand, automation makes situational awareness and workload management more efficient. But it has also directly contributed to eroded skills in areas such as manual flying, upset prevention and recovery, and challenging the status quo.

“Do we need to be worried about the ever-increasing experience gap within modern cockpits?” Image credit: BAA Flight School.

We now have more insight into decision-making skills, and a wider perspective, than ever before, but this does not seem to have translated into new training methodologies. More often than not, we can now acquire very detailed flight parameters and associated data right up to the moment an event occurs. We have seen remarkable technological advances in aircraft types and their operation, a deeper understanding of CRM, appreciation of cultural nuances and formidably experienced crews with relevant expertise. Safety-enhancing operations, rules and regulations have been refined and implemented. Independent safety review boards are genuinely interested in averting the next disaster. Regulators are looking at the trends to offer their guidance and support. And yet all this has merely changed the type of accidents. The new entrants have unique skills, but also new learning preferences and requirements.

Studies have also suggested that firefighters, first responders, soldiers and senior executives who make critical life-and-death decisions intuitively make the right decision based on their experience. Is it possible to do the same within the aviation industry?

We need to improve our understanding of errors in decision-making. What does it really mean in the context of the aviation system, involving all disciplines: engineering, cabin crew, air traffic control, ground operations and the senior leadership team, amongst many others?

Where Do We Go from Here?

A good way to reach a decision is to use a ‘yes/no’ response to the question at hand, especially when the risk-to-reward ratio must be weighed without the luxury of time. However, making decisions is rarely as simple as a yes-or-no assessment. Further complexities, such as wider choices, experience, expertise, skills, cultural biases and technological advances, need to be explored. Why would a flight crew make a wrong decision? A pilot error in decision-making should be regarded as a system error and managed as such.

Do we resort to acceptable outcomes rather than an ideal scenario? Do we need to be worried about the ever-increasing experience gap within modern cockpits? How confident are we with the training and assessment of new commanders? Are we losing a very valuable safety net from our industry?

Technological advances and the evolving demographics of crews, with varied backgrounds and cultures, learning preferences, ages, genders and experience, are more than likely to change the nature of causal factors. Yet we have not quite evolved when it comes to enhancing the decision-making capabilities of our crew members.

We have considered competencies, evidence-based training and the Alternative Training and Qualification Programme (ATQP), with varying degrees of success. How effective are the predictable training scenarios used during annual training events? Surely training for a startle-induced scenario is different from being skilled at expected events in controlled environments? How effective, relevant and prepared do we feel to adequately and proactively address future crew training deficiencies and the learning preferences of every crew member? How can we enhance the experience bank and insights of our new generation of flight crew?

Do we need a global refresh on how we train our crew for decision-making? I think we do. We have a unique opportunity now to reflect and refresh our thinking, informed by more relevant and modern research involving all aspects of the aviation community.

Some Recommendations

We need further research and study to appreciate the complexity of flight crew cognitive behaviours while problem-solving and making decisions like experts. The resilience of our training methodologies is not a destination but a continuously evolving domain. This will require proactive and relevant intervention, informed by meaningful research and based on a flexible, dynamic approach. Decision-making and its contributory factors need further detailed analysis in the aviation context. All stakeholders, including regulators, airlines, training organisations, airframe manufacturers, air traffic controllers, human performance specialists, psychologists and behavioural analysts, should contribute.

A global mentorship programme for inexperienced colleagues. Demonstrating humility and vulnerability is not a bad thing; allowing people to be vulnerable and ask questions is empowering, and it opens up the opportunity for full disclosure. We have an opportunity now to redefine learning with purpose and to harness the valuable experience within our industry via a well-defined mentorship programme. Global coordination and collaboration will be key to success. Nor does it need to be limited to inexperienced First Officers: it can also be a platform for new commanders to share and learn.

A single global sharing platform for lessons learned. We need a consistent global platform to share errors as well as the good decisions that led to safer outcomes. We have several regional forums and outlets where valuable lessons are being shared, but we seem to lack the cohesion of a common platform that would offer wider benefits. The benefit of hindsight offers a great way to learn from our mistakes, so that others in similar positions can make better-informed decisions.

Join the call for action to review and assist in forming a global coalition to eradicate errors in decision-making. It begins with us.
