Threat and error management

[Figure: Threat and error management model]

In aviation safety, threat and error management (TEM) is an overarching safety management approach that assumes that pilots will naturally make mistakes and encounter risky situations during flight operations. Rather than trying to eliminate these threats and errors, its primary focus is on teaching pilots to manage them so they do not impair safety. Its goal is to maintain safety margins by training pilots and flight crews to detect and respond to events that are likely to cause damage (threats) as well as mistakes that are most likely to be made (errors) during flight operations. [1]


TEM allows crews to measure the complexities of a specific organization's operational context (the threats and errors encountered by pilots vary with the type of flight operation) and to record human performance in that context. [2] TEM also considers technical (e.g. mechanical) and environmental issues, and incorporates strategies from crew resource management (CRM) to teach pilots to manage threats and errors.

The TEM framework was developed in 1994 by psychologists at the University of Texas, based on the investigation of accidents involving high-capacity Regular Public Transport (RPT) airlines. [3] However, an evaluation method was needed to identify threats and errors during flight operations and to add information to existing TEM data. [4] [5] A Line Operations Safety Audit (LOSA) serves this purpose: a highly trained observer identifies and collects safety-related information on crew performance, environmental conditions, and operational complexity. [5] [6] LOSA data are used to assess the effectiveness of an organization's training program and to determine how trained procedures are implemented in day-to-day flights.

Importance of TEM

Threat and error management is an important element in the training of competent pilots who can effectively manage in-flight challenges. [1] [7] Many strategies (e.g. training, teamwork, reallocating workload) have been developed to mitigate stress, fatigue, and error. Flight crew training has traditionally stressed operational procedures and technical knowledge, with less emphasis placed on nontechnical skills, leaving those skills isolated from real-world operational contexts. [4] Safety training, including TEM, is important because a crew's nontechnical (safety) knowledge contributes more to effective error management than familiarity with operations gained through experience. [8] Candidates who are shortlisted during selection and training processes must demonstrate analytical and coordination capabilities. [9] Possessing these nontechnical skills allows pilots and crew members to carry out their duties efficiently and effectively.

Components of TEM

The following components are methods that help provide data for TEM.

LOSA observation training

Training for LOSA experts includes two sessions: education in procedural protocols, and education in TEM concepts and classifications. [10] In both sessions, a LOSA trainee is taught to collect data first and code them later. The observer must also exhibit "LOSA etiquette": the ability, after a flight, to discuss with the pilot why an error or threat was not detected. The pilot, in turn, is asked for his or her opinion on what safety issues could have had an adverse impact on the operation. The LOSA trainee then records the pilot's specific responses and codes performance using behavioral markers. The recording proceeds in the following order: a) record visible threats; b) identify error types, the crew's responses, and specific outcomes; and c) rate the crew using CRM behavioral markers. [11]
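The structure of such an observation record can be sketched informally. The following Python snippet is illustrative only and is not part of any LOSA specification; the field names and example values are assumptions chosen to mirror the recording order described above.

```python
# Illustrative sketch only (not from the cited sources): one way a single
# LOSA observation could be coded, following the order described above:
# visible threat, error type, crew response, outcome, and a CRM
# behavioural-marker rating on the observers' 4-point scale.
from dataclasses import dataclass
from enum import IntEnum


class Rating(IntEnum):
    POOR = 1
    MARGINAL = 2
    GOOD = 3
    OUTSTANDING = 4


@dataclass
class ObservationRecord:
    threat: str         # visible threat, e.g. "late runway change by ATC"
    error_type: str     # e.g. "procedural", "communication", "handling"
    crew_response: str  # how the crew detected and handled the event
    outcome: str        # e.g. "inconsequential", "undesired aircraft state"
    crm_rating: Rating  # behavioural-marker rating


# Example record a hypothetical observer might log after a flight segment.
record = ObservationRecord(
    threat="late runway change by ATC",
    error_type="procedural",
    crew_response="cross-checked FMS entries and re-briefed the approach",
    outcome="inconsequential",
    crm_rating=Rating.GOOD,
)
print(record)
```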

Finally, observers rate a pilot's overall performance on a 4-point Likert scale: 1) poor, 2) marginal, 3) good, and 4) outstanding. The data are then quantified and tabulated, as exemplified by the following format: [10]

Planning and execution of performance

Task | Task Description | Comments | Rating
Monitor cross-check | Active monitoring of crews | Situational awareness maintained | Outstanding
SOP briefing | Carried out necessary briefings | Thorough understanding of procedures |
Contingency Management | Communicate strategies | Good management of threats and errors |

Identified Threats | Managed | Mismanaged | Frequency (N)
Air Traffic Control | 17 | 2 | 19
Airline Operational Pressure | 9 | 0 | 9
Weather | 6 | 6 | 12

Frequency is the total number of threats that occurred and is denoted by N.
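As an illustration of how such tabulated counts can be summarized, the short snippet below uses the example figures from the table above; the variable names are ours, not part of the LOSA protocol.

```python
# Illustrative only: summarising managed vs. mismanaged threat counts
# from the example table above. N is the total frequency per category.
threat_counts = {
    "Air Traffic Control": {"managed": 17, "mismanaged": 2},
    "Airline Operational Pressure": {"managed": 9, "mismanaged": 0},
    "Weather": {"managed": 6, "mismanaged": 6},
}

for threat, counts in threat_counts.items():
    n = counts["managed"] + counts["mismanaged"]  # frequency N
    mismanaged_share = counts["mismanaged"] / n
    print(f"{threat}: N={n}, mismanaged {mismanaged_share:.0%}")
```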

Categories of the LOSA

LOSA identifies three main categories that must be recorded: threats, errors, and undesired aircraft states.
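As an informal illustration, observed events could be tagged with these categories as shown below; the category names follow the TEM framework, while the event descriptions are hypothetical.

```python
# Minimal illustrative sketch: tagging observed events with the three
# LOSA recording categories. Event descriptions are hypothetical.
from enum import Enum


class LosaCategory(Enum):
    THREAT = "threat"
    ERROR = "error"
    UNDESIRED_AIRCRAFT_STATE = "undesired aircraft state"


observations = [
    ("windshear reported on final approach", LosaCategory.THREAT),
    ("checklist item skipped during descent", LosaCategory.ERROR),
    ("approach continued while unstabilized", LosaCategory.UNDESIRED_AIRCRAFT_STATE),
]

for description, category in observations:
    print(f"{category.value}: {description}")
```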

Safety change process

Safety change process (SCP), which is part of LOSA, is a formal mechanism that airlines can use to identify active and latent threats to flight operations. [12] It is a guideline that specifies in detail what constitutes an imminent threat to current operations and who or what is causing the threat. In the past, SCP data were based on the investigation of accidents or incidents, experience, and intuition, but SCP now focuses more on the precursors to accidents. [12] There are several steps involved in conducting an SCP: [12]

Safety Change Process (SCP) model:
1. Collect safety issues (LOSA expert)
2. Conduct detailed analysis of risks/data
3. Identify improvement strategies
4. Risk analysis
5. Funding of changes
6. Apply changes to operations
7. Observe the impact of changes
8. Revise any changes
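A minimal sketch of the process, assuming only the step wording shown in the model above, is given below; the cyclic wrap-around from step 8 back to step 1 is our reading of the model, not a statement from the cited sources.

```python
# Hedged sketch: the eight SCP steps expressed as an ordered sequence.
# The cyclic wrap-around is an assumption based on the model's layout.
SCP_STEPS = [
    "Collect safety issues (LOSA expert)",
    "Conduct detailed analysis of risks/data",
    "Identify improvement strategies",
    "Risk analysis",
    "Funding of changes",
    "Apply changes to operations",
    "Observe the impact of changes",
    "Revise any changes",
]


def next_step(index: int) -> int:
    """Return the index of the step that follows, wrapping back to the start."""
    return (index + 1) % len(SCP_STEPS)


index = 0
for _ in range(len(SCP_STEPS)):
    print(f"{index + 1}. {SCP_STEPS[index]}")
    index = next_step(index)
```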

An unnamed airline conducted baseline observations from 1996 to 1998 using the defined SCP and LOSA data to improve its safety culture, and the results were positive. The crew error-trapping rate increased significantly to 55%, meaning that crews detected about 55% of the errors they themselves caused. [12] A 40% reduction in errors related to checklist performance and a 62% reduction in unstabilized approaches (which can lead to tailstrikes, controlled flight into terrain, and runway excursions) were also observed. [12] Proper review and management of SCP and LOSA data can prevent further disasters in flight operations.
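For clarity, the error-trapping rate cited above is simply the proportion of crew-induced errors that the crew itself detected. A worked example with hypothetical counts:

```python
# Worked example with hypothetical counts: an error-trapping rate of 55%
# means the crew detected 55 out of every 100 errors it committed.
errors_committed = 100   # hypothetical number of crew errors observed
errors_trapped = 55      # hypothetical number detected by the crew

trapping_rate = errors_trapped / errors_committed
print(f"Error-trapping rate: {trapping_rate:.0%}")  # prints 55%
```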


References

1. Dekker, Sidney; Lundström, Johan (May 2007). "From Threat and Error Management (TEM) to Resilience". Journal of Human Factors and Aerospace Safety: 1. Retrieved 6 October 2015.
2. Maurino, Dan (18 April 2005). "Threat and Error Management (TEM)" (PDF). Coordinator, Flight Safety and Human Factors Programme - ICAO. Canadian Aviation Safety Seminar (CASS): 1. Retrieved 6 October 2015.
3. Banks, Ian. "Threat & Error Management (TEM) SafeSkies Presentation" (PDF). Retrieved 19 October 2015.
4. Thomas, Matthew (2004). "Predictors of Threat and Error Management: Identification of Core Nontechnical Skills and Implications for Training Systems Design". The International Journal of Aviation Psychology. 14 (2): 207–231. doi:10.1207/s15327108ijap1402_6. S2CID 15271960. Retrieved 24 October 2015.
5. Earl, Laurie; Murray, Patrick; Bates, Paul (2011). "Line Operations Safety Audit (LOSA) for the management of safety in single pilot operations (LOSA:SP) in Australia and New Zealand". Aeronautica (Griffith University Aerospace Strategic Study Centre) (1): 2.
6. Thomas, Matthew (2003). "Operational Fidelity in Simulation-Based Training: The Use of Data from Threat and Error Management Analysis in Instructional Systems Design" (PDF). Proceedings of SimTecT2003: Simulation Conference: 2. Retrieved 19 October 2015.
7. Martin, Wayne L. (2019). "Crew Resource Management and Individual Resilience". Crew Resource Management. Elsevier. pp. 207–226. doi:10.1016/b978-0-12-812995-1.00007-5.
8. Thomas, Matthew; Petrilli, Renee (January 2006). "Crew Familiarity: Operational Experience, Non-Technical Performance, and Error Management" (PDF). Aviation, Space, and Environmental Medicine. 77 (1). Retrieved 25 October 2015.
9. Sexton, J. Bryan; Thomas, Eric; Helmreich, Robert (March 2000). "Error, Stress, and Teamwork in Medicine and Aviation: Cross Sectional Surveys". British Medical Journal. 320 (7237): 745–749. doi:10.1136/bmj.320.7237.745. PMC 27316. PMID 10720356.
10. Earl, Laurie; Bates, Paul; Murray, Patrick; Glendon, Ian; Creed, Peter (2012). "Developing a Single-Pilot Line Operations Safety Audit: An Aviation Pilot Study". Aviation Psychology and Applied Human Factors. 2: 49–61. doi:10.1027/2192-0923/a000027. hdl:10072/49214. Retrieved 24 October 2015.
11. Leva, M.C.; et al. (August 2008). "The advancement of a new human factors report – 'The Unique Report' – facilitating flight crew auditing of performance/operations as part of an airline's safety management system". Ergonomics. 53 (2): 164–183. doi:10.1080/00140130903437131. PMID 20099172. S2CID 32462406.
12. "Line Operations Safety Audit (LOSA)" (PDF). ICAO Journal (First Edition): 25–29. 2002. Retrieved 18 November 2015.