Pilot error

1994 Fairchild Air Force Base B-52 crash, caused by flying the aircraft beyond its operational limits. Here the aircraft is seen in an unrecoverable bank, a split second before the crash. This accident is now used in military and civilian aviation environments as a case study in teaching crew resource management.
Actual flight path (red) of TWA Flight 3 from departure to crash point (controlled flight into terrain). Blue line shows the nominal Las Vegas course, while green is a typical course from Boulder. The pilot inadvertently used the Boulder outbound course instead of the appropriate Las Vegas course.
Departure/destination airports and crash site location of Varig Flight 254 (major navigational error leading to fuel exhaustion). The flight plan was later shown to 21 pilots of major airlines. No fewer than 15 pilots committed the same mistake.
Map of the 2001 Linate Airport runway collision caused by taking the wrong taxiing route (red instead of green), as the control tower had not given clear instructions. The accident occurred in thick fog.
The Tenerife airport disaster now serves as a textbook example. Due to several misunderstandings, the KLM flight tried to take off while the Pan Am flight was still on the runway. The airport was accommodating an unusually large number of commercial airliners, resulting in disruption of the normal use of taxiways.
The "three-pointer" design altimeter is one of the most prone to being misread by pilots (a cause of the UA 389 and G-AOVD crashes).

In aviation, pilot error generally refers to an action or decision made by a pilot that is a substantial contributing factor leading to an aviation accident. It also includes a pilot's failure to make a correct decision or take proper action. [2] Errors are intentional actions that fail to achieve their intended outcomes. [3] The Chicago Convention defines the term "accident" as "an occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." [4] Hence the definition of "pilot error" does not include deliberate crashing (and such crashes are not classified as accidents).

The causes of pilot error include psychological and physiological human limitations. Various forms of threat and error management have been implemented into pilot training programs to teach crew members how to deal with impending situations that arise throughout the course of a flight. [5]

Accounting for the way human factors influence the actions of pilots is now considered standard practice by accident investigators when examining the chain of events that led to an accident. [5] [6]

Description

Modern accident investigators avoid the words "pilot error", as the scope of their work is to determine the cause of an accident, rather than to apportion blame. Furthermore, any attempt to incriminate the pilots does not consider that they are part of a broader system, which in turn may be accountable for their fatigue, work pressure, or lack of training. [6] The International Civil Aviation Organization (ICAO), and its member states, therefore adopted James Reason's model of causation in 1993 in an effort to better understand the role of human factors in aviation accidents. [7]

Pilot error is nevertheless a major cause of air accidents. In 2004, it was identified as the primary reason for 78.6% of disastrous general aviation (GA) accidents, and as the major cause of 75.5% of GA accidents in the United States. [8] [better source needed] There are multiple factors that can cause pilot error; mistakes in the decision-making process can be due to habitual tendencies, biases, or a breakdown in the processing of incoming information. For aircraft pilots, in extreme circumstances these errors are highly likely to result in fatalities. [9]

Causes of pilot error

Pilots work in complex environments and are routinely exposed to high amounts of situational stress in the workplace, inducing pilot error which may result in a threat to flight safety. While aircraft accidents are infrequent, they are highly visible and often involve significant numbers of fatalities. For this reason, research on causal factors and methodologies of mitigating risk associated with pilot error is exhaustive. Pilot error results from physiological and psychological limitations inherent in humans. "Causes of error include fatigue, workload, and fear as well as cognitive overload, poor interpersonal communications, imperfect information processing, and flawed decision making." [10] Throughout the course of every flight, crews are intrinsically subjected to a variety of external threats and commit a range of errors that have the potential to negatively impact the safety of the aircraft. [11]

Threats

The term "threat" is defined as any event "external to flight crew's influence which can increase the operational complexity of a flight." [12] Threats may further be broken down into environmental threats and airline threats. Environmental threats are ultimately out of the hands of crew members and the airline, as they hold no influence on "adverse weather conditions, air traffic control shortcomings, bird strikes, and high terrain." [12] Conversely, airline threats are not manageable by the flight crew, but may be controlled by the airline's management. These threats include "aircraft malfunctions, cabin interruptions, operational pressure, ground/ramp errors/events, cabin events and interruptions, ground maintenance errors, and inadequacies of manuals and charts." [12]

Errors

The term "error" is defined as any action or inaction leading to deviation from team or organizational intentions. [10] Error stems from physiological and psychological human limitations such as illness, medication, stress, alcohol/drug abuse, fatigue, and emotion. Error is inevitable in humans and is primarily related to operational and behavioral mishaps. [13] Errors range from an incorrect altimeter setting or a deviation from the flight course to more severe errors such as exceeding maximum structural speeds or forgetting to extend landing or takeoff flaps.

Decision making

Reasons for inadequate reporting of accidents include staff being too busy, confusing data entry forms, lack of training and education, lack of feedback to staff on reported data, and punitive organizational cultures. [14] Wiegmann and Shappell developed three cognitive models to analyze approximately 4,000 pilot factors associated with more than 2,000 U.S. Navy aviation mishaps. Although the three cognitive models differ slightly in the types of errors they describe, all three lead to the same conclusion: errors in judgment. [15] The three categories are decision-making, goal-setting, and strategy-selection errors, all of which were highly related to primary accidents. [15] For example, on 28 December 2014, AirAsia Flight 8501, which was carrying seven crew members and 155 passengers, crashed into the Java Sea due to several fatal mistakes made by the captain in poor weather conditions. In this case, the captain chose to exceed the maximum climb rate for a commercial aircraft, causing a critical stall from which he was unable to recover. [16]

Threat and error management (TEM)

TEM involves the effective detection of, and response to, internal or external factors that have the potential to degrade the safety of an aircraft's operations. [11] Methods of teaching TEM stress replicability, or reliability of performance across recurring situations. [17] TEM aims to prepare crews with the "coordinative and cognitive ability to handle both routine and unforeseen surprises and anomalies." [17] The desired outcome of TEM training is the development of 'resilience'. Resilience, in this context, is the ability to recognize and act adaptively to disruptions which may be encountered during flight operations. [18] TEM training occurs in various forms, with varying levels of success. Some of these training methods include data collection using the line operations safety audit (LOSA), implementation of crew resource management (CRM), cockpit task management (CTM), and the integrated use of checklists in both commercial and general aviation. Other resources built into most modern aircraft that help minimize risk and manage threat and error are airborne collision avoidance systems (ACAS) and ground proximity warning systems (GPWS). [19] With the consolidation of onboard computer systems and the implementation of proper pilot training, airlines and crew members look to mitigate the inherent risks associated with human factors.

Line operations safety audit (LOSA)

LOSA is a structured observational program designed to collect data for the development and improvement of countermeasures to operational errors. [20] Through the audit process, trained observers are able to collect information regarding the normal procedures, protocols, and decision-making processes flight crews undertake when faced with threats and errors during normal operation. This data-driven analysis of threat and error management is useful for examining pilot behavior in relation to situational analysis. It provides a basis for further implementation of safety procedures or training to help mitigate errors and risks. [12] Observers on flights which are being audited typically observe the following: [20]

LOSA was developed to assist crew resource management practices in reducing human error in complex flight operations. [12] LOSA produces beneficial data that reveals how many errors or threats are encountered per flight, the number of errors which could have resulted in a serious threat to safety, and correctness of crew action or inaction. This data has proven to be useful in the development of CRM techniques and identification of what issues need to be addressed in training. [12]

Crew resource management (CRM)

CRM is the "effective use of all available resources by individuals and crews to safely and effectively accomplish a mission or task, as well as identifying and managing the conditions that lead to error." [21] CRM training has been integrated into, and made mandatory for, most pilot training programs, and it is the accepted standard for developing human factors skills for air crews and airlines. Although there is no universal CRM program, airlines usually customize their training to best suit the needs of the organization, and the principles of each program are usually closely aligned. According to the U.S. Navy, there are seven critical CRM skills: [21]

These seven skills comprise the critical foundation for effective aircrew coordination. With the development and use of these core skills, flight crews "highlight the importance of identifying human factors and team dynamics to reduce human errors that lead to aviation mishaps." [21]

Application and effectiveness of CRM

Since the implementation of CRM circa 1979, following NASA research highlighting the need for better resource management, the aviation industry has seen tremendous evolution in the application of CRM training procedures. [22] The application of CRM has developed through a series of generations:

  • First generation: emphasized individual psychology and testing, where corrections could be made to behavior.
  • Second generation: featured a shift in focus to cockpit group dynamics.
  • Third generation: diversification of scope and an emphasis on training crews in how they must function both in and out of the cockpit.
  • Fourth generation: CRM integrated procedure into training, allowing organizations to tailor training to their needs.
  • Fifth generation (current): acknowledges that human error is inevitable and provides information to improve safety standards. [23]

Today, CRM is implemented through pilot and crew training sessions, simulations, and interactions with senior-ranking personnel and flight instructors, such as flight briefings and debriefings. Although it is difficult to measure the success of CRM programs, studies have found a correlation between CRM programs and better risk management. [23]

Cockpit task management (CTM)

Multiple sources of information can be taken from one interface, known as the PFD, or primary flight display, from which pilots receive the most important data readings.

Cockpit task management (CTM) is the "management level activity pilots perform as they initiate, monitor, prioritize, and terminate cockpit tasks." [24] A 'task' is defined as a process performed to achieve a goal (i.e. fly to a waypoint, descend to a desired altitude). [24] CTM training focuses on teaching crew members how to handle concurrent tasks which compete for their attention. This includes the following processes:

The need for CTM training is a result of the limited capacity of human attention and working memory. Crew members may devote more mental or physical resources to a particular task which demands priority or affects the immediate safety of the aircraft. [24] CTM has been integrated into pilot training and goes hand in hand with CRM. Some aircraft operating systems have made progress in aiding CTM by combining instrument gauges into one screen. An example of this is a digital attitude indicator, which simultaneously shows the pilot the heading, airspeed, descent or ascent rate, and a plethora of other pertinent information. Implementations such as these allow crews to gather multiple sources of information quickly and accurately, freeing up mental capacity for other, more prominent tasks.

A military pilot reads the pre-flight checklist prior to a mission. Checklists ensure that pilots are able to follow operational procedures and aid memory recall.

Checklists

The use of checklists before, during, and after flights has established a strong presence in all types of aviation as a means of managing error and reducing the possibility of risk. Checklists are highly regulated and consist of protocols and procedures for the majority of the actions required during a flight. [25] The objectives of checklists include "memory recall, standardization and regulation of processes or methodologies." [25] The use of checklists in aviation has become an industry standard practice, and the completion of checklists from memory is considered a violation of protocol and pilot error. Studies have shown that impaired judgment, reduced cognitive function, and changes in memory function are among the effects of stress and fatigue, [26] both of which are inevitable human factors in the commercial aviation industry. The use of checklists in emergency situations also contributes to troubleshooting and reverse-examining the chain of events which may have led to a particular incident or crash. Apart from checklists issued by regulatory bodies such as the FAA or ICAO, or checklists made by aircraft manufacturers, pilots also have personal qualitative checklists aimed to ensure their fitness and ability to fly the aircraft. An example is the IM SAFE checklist (illness, medication, stress, alcohol, fatigue/food, emotion), one of a number of qualitative assessments which pilots may perform before or during a flight to ensure the safety of the aircraft and passengers. [25] These checklists, along with a number of other redundancies integrated into most modern aircraft operation systems, help the pilot remain vigilant and, in turn, aim to reduce the risk of pilot error.

Notable examples

One of the most famous examples of an aircraft disaster that was attributed to pilot error was the night-time crash of Eastern Air Lines Flight 401 near Miami, Florida on 29 December 1972. The captain, first officer, and flight engineer had become fixated on a faulty landing gear light and had failed to realize that one of the crew had accidentally bumped the flight controls, altering the autopilot settings from level flight to a slow descent. Told by ATC to hold over a sparsely populated area away from the airport while they dealt with the problem (with, as a result, very few lights visible on the ground to act as an external reference), the distracted flight crew did not notice the plane losing height and the aircraft eventually struck the ground in the Everglades, killing 101 of the 176 passengers and crew. The subsequent National Transportation Safety Board (NTSB) report on the incident blamed the flight crew for failing to monitor the aircraft's instruments properly. Details of the incident are now frequently used as a case study in training exercises by aircrews and air traffic controllers.

During 2004 in the United States, pilot error was listed as the primary cause of 78.6% of fatal general aviation accidents, and as the primary cause of 75.5% of general aviation accidents overall. [27] For scheduled air transport, pilot error typically accounts for just over half of worldwide accidents with a known cause. [8]

See also

Related Research Articles

Tenerife airport disaster (1977 runway collision)

The Tenerife airport disaster occurred on 27 March 1977, when two Boeing 747 passenger jets collided on the runway at Los Rodeos Airport on the Spanish island of Tenerife. The accident occurred when KLM Flight 4805 initiated its takeoff run in dense fog, colliding with the rear of Pan Am Flight 1736 still on the runway. The impact and the resulting fire killed all 248 people on board the KLM plane and 335 of the 396 people on board the Pan Am plane, with only 61 survivors in the front section of the latter aircraft. With a total of 583 fatalities, the disaster is the deadliest accident in aviation history.

Aviation safety (state in which risks associated with aviation are at an acceptable level)

Aviation safety is the study and practice of managing risks in aviation. This includes preventing aviation accidents and incidents through research, educating air travel personnel, passengers and the general public, as well as the design of aircraft and aviation infrastructure. The aviation industry is subject to significant regulation and oversight.

Saudia Flight 163 (August 1980 aircraft fire in Riyadh, Saudi Arabia)

Saudi Arabian Airlines Flight 163 was a scheduled Saudia passenger flight departing from Quaid-e-Azam Airport in Karachi, Pakistan, bound for Kandara Airport in Jeddah, Saudi Arabia, via Riyadh International Airport in Riyadh, Saudi Arabia, which caught fire after takeoff from Riyadh International Airport on 19 August 1980. Although the Lockheed L-1011-200 TriStar made a successful emergency landing at Riyadh, the flight crew failed to perform an emergency evacuation of the airplane, leading to the deaths of all 287 passengers and 14 crew on board the aircraft from smoke inhalation.

Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote The Human Factor in Aircraft Accidents (1969). Despite the considerable development of electronic aids since then, many principles he developed continue to prove effective.

Gulf Air Flight 072 (2000 aviation accident)

Gulf Air Flight 072 (GF072/GFA072) was a scheduled international passenger flight from Cairo International Airport in Egypt to Bahrain International Airport in Bahrain, operated by Gulf Air. On 23 August 2000 at 19:30 Arabia Standard Time (UTC+3), the Airbus A320 crashed minutes after executing a go-around following a failed attempt to land on Runway 12. The flight crew suffered from spatial disorientation during the go-around and crashed into the shallow waters of the Persian Gulf 2 km (1 nmi) from the airport. All 143 people on board the aircraft were killed.

United Airlines Flight 173 (1978 aviation accident in Portland, Oregon)

United Airlines Flight 173 was a scheduled flight from John F. Kennedy International Airport in New York City to Portland International Airport in Portland, Oregon, with a scheduled stop in Denver, Colorado. On December 28, 1978, the aircraft flying this route ran out of fuel while troubleshooting a landing gear problem and crashed in a suburban Portland neighborhood near NE 157th Avenue and East Burnside Street, killing 10 people on board.

Comair Flight 5191 (2006 passenger plane crash in Lexington, Kentucky, United States)

Comair Flight 5191 was a scheduled United States domestic passenger flight from Lexington, Kentucky, to Atlanta, Georgia. On the morning of August 27, 2006, at around 06:07 EDT, the Bombardier Canadair Regional Jet 100ER crashed while attempting to take off from Blue Grass Airport in Fayette County, Kentucky, 4 miles west of the central business district of the city of Lexington.

One-Two-Go Airlines Flight 269 (2007 plane crash in Phuket, Thailand)

One-Two-Go Airlines Flight 269 (OG269) was a scheduled domestic passenger flight from Bangkok to Phuket, Thailand. On 16 September 2007, at about 15:41 ICT, the McDonnell Douglas MD-82 operating the flight crashed into an embankment beside runway 27 at Phuket International Airport (HKT) during an attempted go-around after an aborted landing, bursting into flames upon impact and killing 90 of the 130 people on board. It is the third-deadliest aviation incident to occur in Thailand.

Aviastar-TU Flight 1906 (2010 aviation accident)

Aviastar-TU Flight 1906 was a Tupolev Tu-204 that crashed while attempting to land at Domodedovo International Airport, Moscow, Russia, in heavy fog on 22 March 2010. The aircraft was on a ferry flight from Hurghada International Airport, Egypt to Moscow, and had no passengers on board; all eight crew survived the accident, four with serious injuries requiring hospitalization and four with minor injuries. The accident was the first hull loss of a Tu-204 and the first hull loss for Aviastar-TU.

In aeronautics, loss of control (LOC) is the unintended departure of an aircraft from controlled flight and is a significant factor in several aviation accidents worldwide. In 2015 it was the leading cause of general aviation accidents. Loss of control may be the result of mechanical failure, external disturbances, aircraft upset conditions, or inappropriate crew actions or responses.

Pilot decision making, also known as aeronautical decision making (ADM), is a process that aviators perform to effectively handle troublesome situations that are encountered. Pilot decision-making is applied in almost every stage of a flight, as it considers weather, airspaces, airport conditions, estimated time of arrival, and so forth. During the flight, employers pressure pilots regarding time and fuel restrictions, since a pilot's performance directly affects the company's revenue and brand image. This pressure often hinders a pilot's decision-making process, leading to dangerous situations, as 50% to 90% of aviation accidents are the result of pilot error.

Impact of culture on aviation safety

Culture can affect aviation safety through its effect on how the flight crew deals with difficult situations; cultures with lower power distances and higher levels of individuality can result in better aviation safety outcomes. In higher power-distance cultures, subordinates are less likely to question their superiors. The crash of Korean Air Flight 801 in 1997 was attributed to the pilot's decision to land despite the junior officer's disagreement, while the crash of Avianca Flight 052 was caused by the failure to communicate critical low-fuel data between pilots and controllers, and by the failure of the controllers to ask the pilots whether they were declaring an emergency and to assist the pilots in landing the aircraft. The crashes have been blamed on aspects of the national cultures of the crews.

Stress in the aviation industry (pilots' wellbeing while working)

Stress in the aviation industry is a common phenomenon composed of three sources: physiological stressors, psychological stressors, and environmental stressors. Professional pilots can experience stress in flight, on the ground during work-related activities, and during personal time because of the influence of their occupation. Being an airline pilot can be extremely stressful due to the workload, the responsibilities, and the safety of the thousands of passengers transported around the world. Chronic levels of stress can negatively impact one's health, job performance, and cognitive functioning. Exposure to stress does not always negatively influence humans, because it can motivate people to improve and help them adapt to a new environment. Accidents start to occur, however, when a pilot is under excessive stress, as it dramatically affects his or her physical, emotional, and mental condition. Stress "jeopardizes decision-making relevance and cognitive functioning" and is a prominent cause of pilot error. Being a pilot is considered a unique job that requires managing high workloads and good psychological and physical health. Compared with other professions, pilots are considered to be highly affected by stress levels: one study found that 70% of surgeons agreed that stress and fatigue do not impact their performance level, while only 26% of pilots denied that stress influences their performance. Pilots themselves realize how powerful stress can be, and yet accidents and incidents continue to occur, as in the cases of Asiana Airlines Flight 214, American Airlines Flight 1420, and the Polish Air Force Tu-154 crash.

Tropical Airways Flight 1301 (domestic short-haul passenger flight crash)

Tropical Airways Flight 1301 (TBG1301/M71301) was a domestic short-haul passenger flight from Cap-Haïtien International Airport in Cap-Haïtien, Haiti, to the commune of Port-de-Paix, which crashed onto a sugarcane field less than 10 minutes after takeoff on the evening of 24 August 2003. The aircraft was a 19-seat Let L-410 Turbolet carrying 19 passengers and 2 crew. Witnesses stated that the aircraft caught fire during takeoff and exploded when it hit the ground. All on board were killed.

Caspian Airlines Flight 6936 (2020 aircraft accident)

On 27 January 2020, Caspian Airlines Flight 6936 overran the runway on landing at Mahshahr Airport, Iran, on a domestic flight from Tehran. All 144 people on board survived, with two injured.

References

  1. "Tenerife Disaster – 27 March 1977: The Utility of the Swiss Cheese Model & other Accident Causation Frameworks". Go Flight Medicine. Retrieved 13 October 2014.
  2. Pilot's Handbook of Aeronautical Knowledge (2016). U.S. Department of Transportation, Federal Aviation Administration, Flight Standards Service.
  3. "Error Management (OGHFA BN)". Operator's Guide to Human Factors in Aviation. SKYbrary.
  4. "How exactly should I understand the term 'accidental hull loss'?". Aviation Stack Exchange.
  5. "Risk Management Handbook" (PDF) (Change 1 ed.). Federal Aviation Administration. January 2016. Chapter 2: Human behavior. Retrieved 16 November 2018.
  6. Rural and Regional Affairs and Transport References Committee (May 2013). "Aviation Accident Investigations" (PDF). Government of Australia.
  7. Investigating Human Error: Incidents, Accidents, and Complex Systems. Ashgate Publishing. 2004. ISBN 0754641228.
  8. "Accident statistics". www.planecrashinfo.com. Retrieved 21 October 2015.
  9. Foyle, D.C., & Hooey, B.L. (Eds.) (2007). Human Performance Modeling in Aviation. CRC Press.
  10. Helmreich, Robert L. (18 March 2000). "On Error Management: Lessons From Aviation". BMJ. 320 (7237): 781–785. doi:10.1136/bmj.320.7237.781. PMC 1117774. PMID 10720367.
  11. Thomas, Matthew J.W. (2004). "Predictors of Threat and Error Management: Identification of Core Nontechnical Skills and Implications for Training Systems Design". The International Journal of Aviation Psychology. 14 (2): 207–231. doi:10.1207/s15327108ijap1402_6. S2CID 15271960.
  12. Earl, Laurie; Bates, Paul R.; Murray, Patrick S.; Glendon, A. Ian; Creed, Peter A. (January 2012). "Developing a Single-Pilot Line Operations Safety Audit". Aviation Psychology and Applied Human Factors. 2 (2): 49–61. doi:10.1027/2192-0923/a000027. hdl:10072/49214. ISSN 2192-0923.
  13. Li, Guohua; Baker, Susan P.; Grabowski, Jurek G.; Rebok, George W. (February 2001). "Factors Associated With Pilot Error in Aviation Crashes". Aviation, Space, and Environmental Medicine. 72 (1): 52–58. PMID 11194994.
  14. Stanhope, N.; Crowley-Murphy, M. (1999). "An evaluation of adverse incident reporting". Journal of Evaluation in Clinical Practice. 5 (1): 5–12. doi:10.1046/j.1365-2753.1999.00146.x. PMID 10468379.
  15. Wiegmann, D.A., & Shappell, S.A. (2001). "Human error perspectives in aviation". The International Journal of Aviation Psychology. 11 (4): 341–357.
  16. Stacey, Daniel (15 January 2015). "Indonesian Air-Traffic Control Is Unsophisticated, Pilots Say". The Wall Street Journal. Retrieved 26 January 2015.
  17. Dekker, Sidney; Lundström, Johan (May 2007). "From Threat and Error Management (TEM) to Resilience". Journal of Human Factors and Aerospace Safety. 260 (70): 1–10.
  18. Mizzi, Andrew; McCarthy, Pete (2023). "Resilience Engineering's synergy with Threat and Error Management – an operationalised model". doi:10.1007/978-3-031-35392-5_36.
  19. Maurino, Dan (April 2005). "Threat and Error Management (TEM)". Canadian Aviation Safety Seminar (CASS); Flight Safety and Human Factors Programme – ICAO.
  20. "Line Operations Safety Audit (LOSA)". SKYbrary. Retrieved 24 August 2016.
  21. Myers, Charles; Orndorff, Denise (2013). "Crew Resource Management: Not Just for Aviators Anymore". Journal of Applied Learning Technology. 3 (3): 44–48.
  22. Helmreich, Robert L.; Merritt, Ashleigh C.; Wilhelm, John A. (1999). "The Evolution of Crew Resource Management Training in Commercial Aviation". The International Journal of Aviation Psychology. 9 (1): 19–32. doi:10.1207/s15327108ijap0901_2. PMID 11541445.
  23. Salas, Eduardo; Burke, Shawn C.; Bowers, Clint A.; Wilson, Katherine A. (2001). "Team Training in the Skies: Does Crew Resource Management (CRM) Training Work?". Human Factors. 43 (4): 641–674. doi:10.1518/001872001775870386. ISSN 0018-7208. PMID 12002012. S2CID 23109802.
  24. Chou, Chung-Di; Madhavan, Das; Funk, Ken (1996). "Studies of Cockpit Task Management Errors". The International Journal of Aviation Psychology. 6 (4): 307–320. doi:10.1207/s15327108ijap0604_1.
  25. Hales, Brigette M.; Pronovost, Peter J. (2006). "The Checklist – A Tool for Error Management and Performance". Journal of Critical Care. 21 (3): 231–235. doi:10.1016/j.jcrc.2006.06.002. PMID 16990087.
  26. Cavanagh, James F.; Frank, Michael J.; Allen, John J.B. (April 2010). "Social Stress Reactivity Alters Reward and Punishment Learning". Social Cognitive and Affective Neuroscience. 6 (3): 311–320. doi:10.1093/scan/nsq041. PMC 3110431. PMID 20453038.
  27. "2005 Joseph T. Nall Report" (PDF). Archived from the original (PDF) on 2 February 2007. Retrieved 12 February 2007.
  28. "Aircraft Accident Investigation Report KNKT/07.01/08.01.36" (PDF). National Transportation Safety Committee, Indonesian Ministry of Transportation. 1 January 2007. Archived from the original (PDF) on 16 July 2011. Retrieved 8 June 2013.