Threat and error management

[Figure: Threat and error management model]

Threat and error management (TEM) is an overarching safety management approach that assumes pilots will naturally make mistakes and encounter risky situations during flight operations. Rather than trying to avoid these threats and errors, its primary focus is on teaching pilots to manage them so that they do not impair safety. Its goal is to maintain safety margins by training pilots and flight crews to detect and respond to events that are likely to cause damage (threats) as well as to the mistakes that are most likely to be made (errors) during flight operations. [1]

TEM allows crews to measure the complexities of a specific organization's context (the threats and errors encountered by pilots vary depending on the type of flight operation) and to record human performance in that context. [2] TEM also considers technical (e.g. mechanical) and environmental issues, and incorporates strategies from Crew Resource Management (CRM) to teach pilots to manage threats and errors.

The TEM framework was developed in 1994 by psychologists at the University of Texas, based on the investigation of accidents involving high-capacity Regular Public Transport (RPT) airlines. [3] An evaluation method was subsequently needed to identify threats and errors during flight operations and to add information to existing TEM data. [4] [5] The Line Operations Safety Audit (LOSA) serves this purpose: it involves the identification and collection of safety-related information on crew performance, environmental conditions, and operational complexity by a highly trained observer. [5] [6] LOSA data are used to assess the effectiveness of an organization's training program and to determine how trained procedures are implemented in day-to-day flights.

Importance of TEM

Threat and error management is an important element in the training of competent pilots who can effectively manage in-flight challenges. [1] Many strategies have been developed (e.g. training, teamwork, workload reallocation) to address stress, fatigue, and error. Flight crew training traditionally stressed operational procedures and technical knowledge, with less emphasis on nontechnical skills, which consequently became isolated from real-world operational contexts. [4] Safety training such as TEM is important because a crew's nontechnical (safety) knowledge contributes more to effective error management than familiarity with operations gained through experience alone. [7] Candidates who are shortlisted during selection and training processes must demonstrate analytical and coordination capabilities. [8] Possessing these nontechnical skills allows pilots and crew members to carry out their duties efficiently and effectively.

Components of TEM

The following components are methods that help provide data for TEM.

LOSA observation training

Training for LOSA observers includes two sessions: education in procedural protocols, and in TEM concepts and classifications. [9] In both sessions, a LOSA trainee is taught to gather data first and code them later. The trainee must also exhibit "LOSA etiquette", the ability to discuss with the pilot after a flight why he or she was not able to detect an error or threat. The pilot, in turn, is expected to offer his or her opinion on what safety issues could have had an adverse impact on the operation. The LOSA trainee then records the pilot's specific responses and codes performance using behavioral markers. The recording proceeds in the following order: a) record visible threats; b) identify error types, crew responses, and specific outcomes; and c) rate the crew using CRM behavioral markers. [10]
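
To make this recording order concrete, the sketch below shows one way an observer's notes could be structured in software. It is a minimal illustration in Python; the class and field names (ObservationRecord, Threat, Error, crm_markers) are hypothetical and are not part of any published LOSA tool.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # Hypothetical structures mirroring the recording order described above:
    # a) visible threats, b) error types with crew responses and outcomes,
    # c) CRM behavioral-marker ratings of the crew.

    @dataclass
    class Threat:
        description: str      # e.g. "convective weather on approach"
        managed: bool         # whether the crew managed the threat

    @dataclass
    class Error:
        error_type: str       # e.g. "procedural" or "communication"
        crew_response: str    # e.g. "trapped", "exacerbated", "failed to respond"
        outcome: str          # specific outcome, e.g. "inconsequential"

    @dataclass
    class ObservationRecord:
        flight_id: str
        threats: List[Threat] = field(default_factory=list)
        errors: List[Error] = field(default_factory=list)
        crm_markers: Dict[str, int] = field(default_factory=dict)  # marker -> rating 1-4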

Finally, observers record the pilot's overall performance on a 4-point Likert scale: 1) poor, 2) marginal, 3) good, and 4) outstanding. The data are then quantified and tabulated, as in the following example format: [9]

Planning and execution of performance

Task                    Task Description                 Comments                               Rating
Monitor cross-check     Active monitoring of crews       Situational awareness maintained       Outstanding
SOP briefing            Carried out necessary briefings  Thorough understanding of procedures
Contingency Management  Communicate strategies           Good management of threats and errors

Identified Threats            Managed  Mismanaged*  Frequency (N)
Air Traffic Control           17       2            19
Airline Operational Pressure  9        0            9
Weather                       6        6            12

Frequency, denoted by N, is the total number of times each threat occurred (managed plus mismanaged occurrences).
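
As a worked illustration of the tabulation, the short Python sketch below recomputes the frequency column from the managed and mismanaged counts in the example table above; the dictionary name and layout are illustrative only.

    # Example counts taken from the table above: (managed, mismanaged) per threat.
    threat_counts = {
        "Air Traffic Control": (17, 2),
        "Airline Operational Pressure": (9, 0),
        "Weather": (6, 6),
    }

    for threat, (managed, mismanaged) in threat_counts.items():
        n = managed + mismanaged               # frequency N
        mismanaged_pct = 100 * mismanaged / n  # share of occurrences mismanaged
        print(f"{threat}: N={n}, mismanaged {mismanaged_pct:.0f}%")

    # Output:
    # Air Traffic Control: N=19, mismanaged 11%
    # Airline Operational Pressure: N=9, mismanaged 0%
    # Weather: N=12, mismanaged 50%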

Categories of the LOSA

LOSA identifies three main categories that must be recorded: threats, errors, and undesired aircraft states.

Safety change process

The safety change process (SCP), which is part of LOSA, is a formal mechanism that airlines can use to identify active and latent threats to flight operations. [15] It is a guideline that communicates in detail what constitutes an imminent threat to current operations and who or what is causing it. In the past, SCP data were based on accident and incident investigations, experience, and intuition; today the SCP focuses more on the precursors to accidents. [15] Conducting an SCP involves several steps. [15]

An unnamed airline conducted baseline observations from 1996 to 1998 using the defined SCP and LOSA data to improve its safety culture, and the results were positive. The crew error-trapping rate rose significantly to 55%, meaning that crews were able to detect about 55% of the errors they committed. [15] A 40% reduction in errors related to checklist performance and a 62% reduction in unstabilized approaches (which can lead to tailstrikes, controlled flight into terrain, or runway excursions) were also observed. [15] Proper review and management of SCP and LOSA data can help prevent further accidents in flight operations.
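
To clarify the arithmetic behind that figure, the error-trapping rate is simply the share of crew-committed errors that the crew itself detects. The Python sketch below is a hypothetical illustration; the counts (110 detected out of 200 committed) are invented to reproduce a 55% rate and are not data from the cited study.

    def error_trapping_rate(errors_detected: int, errors_committed: int) -> float:
        """Fraction of crew-committed errors that the crew itself detected."""
        return errors_detected / errors_committed

    # Hypothetical counts chosen only to illustrate a 55% trapping rate.
    rate = error_trapping_rate(110, 200)
    print(f"Error-trapping rate: {rate:.0%}")  # -> Error-trapping rate: 55%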

See also

Controlled flight into terrain
Aviation safety
Swiss cheese model
Chain of events (accident analysis)
Single-pilot resource management
Effects of fatigue on safety
NextGen Data Communications
Mike Monroney Aeronautical Center
Civil Aerospace Medical Institute
Maritime Resource Management
Aviation accident analysis
Environmental causes of aviation stress
Impact of culture on aviation safety
Pilot fatigue
Stress in the aviation industry
NOTECHS
SHELL model
The LOSA Collaborative

References

  1. Dekker, Sidney; Lundström, Johan (May 2007). "From Threat and Error Management (TEM) to Resilience" (PDF). Journal of Human Factors and Aerospace Safety: 1. Retrieved 6 October 2015.
  2. Maurino, Dan (18 April 2005). "Threat and Error Management (TEM)" (PDF). Coordinator, Flight Safety and Human Factors Programme - ICAO. Canadian Aviation Safety Seminar (CASS): 1. Retrieved 6 October 2015.
  3. Banks, Ian. "Threat & Error Management (TEM) SafeSkies Presentation" (PDF). Retrieved 19 October 2015.
  4. Thomas, Matthew (2004). "Predictors of Threat and Error Management: Identification of Core Nontechnical Skills and Implications for Training Systems Design" (PDF). The International Journal of Aviation Psychology. 14 (2). Retrieved 24 October 2015.
  5. Earl, Laurie; Murray, Patrick; Bates, Paul (2011). "Line Operations Safety Audit (LOSA) for the management of safety in single pilot operations (LOSA:SP) in Australia and New Zealand" (PDF). Aeronautica (Griffith University Aerospace Strategic Study Centre) (1): 2.
  6. Thomas, Matthew (2003). "Operational Fidelity in Simulation-Based Training: The Use of Data from Threat and Error Management Analysis in Instructional Systems Design" (PDF). Proceedings of SimTecT2003: Simulation Conference: 2. Retrieved 19 October 2015.
  7. Thomas, Matthew; Petrilli, Renee (January 2006). "Crew Familiarity: Operational Experience, NonTechnical Performance, and Error Management" (PDF). Aviation, Space, and Environmental Medicine. 77 (1). Retrieved 25 October 2015.
  8. Sexton, J. Bryan; Thomas, Eric; Helmreich, Robert (March 2000). "Error, Stress, and Teamwork in Medicine and Aviation: Cross Sectional Surveys" (PDF). British Medical Journal. 320 (7273): 745–749. Retrieved 25 October 2015.
  9. Earl, Laurie; Bates, Paul; Murray, Patrick; Glendon, Ian; Creed, Peter (2012). "Developing a Single-Pilot Line Operations Safety Audit: An Aviation Pilot Study" (PDF). Aviation Psychology and Applied Human Factors. 2: 49–61. doi:10.1027/2192-0923/a000027. Retrieved 24 October 2015.
  10. Leva, M.C.; et al. (August 2008). "The advancement of a new human factors report – 'The Unique Report' – facilitating flight crew auditing of performance/operations as part of an airline's safety management system". Ergonomics. 53 (2): 164–183. doi:10.1080/00140130903437131.
  11. Edward; et al. (February 2015). "National Aeronautics and Space Administration threat and error model applied to pediatric cardiac surgery: Error cycles precede 85% of patient deaths". The Journal of Thoracic and Cardiovascular Surgery. 149 (2): 496–507.e4. doi:10.1016/j.jtcvs.2014.10.058.
  12. Kearns, Suzanne; Sutton, Jennifer (April 2013). "Hangar Talk Survey: Using Stories as a Naturalistic Method of Informing Threat and Error Management Training". Human Factors. 55 (2): 267–77. doi:10.1177/0018720812452127. PMID 23691823.
  13. Thomas, Matthew; Ferguson, Sally (July 2010). "Prior Sleep, Prior Wake, and Crew Performance During Normal Flight Operations". Aviation, Space, and Environmental Medicine. 81 (7).
  14. Drury, Arthur; Ferguson, Sally; Thomas, Matthew (August 2011). "Restricted sleep and negative affective states in commercial pilots during short haul operations". Accident Analysis and Prevention. 45: 80–84. doi:10.1016/j.aap.2011.09.031.
  15. "Line Operations Safety Audit (LOSA)" (PDF). ICAO Journal (First Edition): 25–29. 2002. Retrieved 18 November 2015.