Pilot decision making

Pilot decision making, [1] also known as aeronautical decision making (ADM), [2] is the process that aviators use to handle the challenging situations they encounter. Pilot decision-making is applied at almost every stage of flight, taking into account weather, airspace, airport conditions, estimated time of arrival, and so forth. During the flight, employers place pressure on pilots regarding time and fuel restrictions, since a pilot's performance directly affects the company's revenue and brand image. This pressure often hinders a pilot's decision-making process and can lead to dangerous situations: 50% to 90% of aviation accidents are the result of pilot error. [3] [4] [5]

Decision-making process

Since the 1980s, [6] the airline industry has identified the aeronautical decision-making (ADM) process as a critical factor in safe aeronautical operations. Airlines are motivated to create decision-making procedures, supplemented by crew resource management (CRM), to advance air safety.

A risk assessment checklist for pilots. This program includes a wide array of aviation-related activities specific to the pilot and assesses health, fatigue, weather, capabilities, etc.

The pilot decision-making process is a five-step management skill that a pilot can apply to maximize the chance of success when facing an unexpected or critical event. This cyclic model allows the pilot to make a critical decision and then follow up with a series of actions to produce the best possible resolution.

Mnemonics

Pilots use mnemonics to help them deal with emergencies and unexpected situations. One of the most famous is the phrase "Aviate, Navigate, Communicate", which reminds pilots what their priorities should be. The first priority is to keep the aircraft flying, avoiding undesired aircraft states and controlled flight into terrain. Next, the pilot(s) should verify their location and navigate towards a suitable destination. Communication with air traffic control, while important, is a lower priority. [9]

Mnemonics used to decide and carry out a course of action include T-DODAR (Time, Diagnose, Options, Decision, Assign, Review), [10] [11] [12] [13] FOR-DEC (Facts, Options, Risks and benefits, Decide, Execute, Check), [14] [15] [16] [17] DECIDE (Detect, Estimate, Choose, Identify, Do, Evaluate), [18] [14] DESIDE (Detect, Estimate, Set safety objectives, Identify, Do, Evaluate), [14] [10] [19] [15] [16] GRADE (Gather Information, Review Information, Analyse Alternatives, Decide, Evaluate), [10] [20] 3P (Perceive, Process, Perform), [21] and PIOSEE (Problem, Information, Options, Select, Execute, Evaluate). [22] [23] FOR-DEC was developed by Lufthansa and the German Aerospace Center, and is used by numerous European airlines, as well as in German nuclear power plants. [14] The hyphen in FOR-DEC is designed to make the pilots stop and think about whether they have considered all the options. [14] T-DODAR is used by British Airways, who added the initial T to remind pilots to consider time available before starting the decision-making process. [10] [14] [24]
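
Because these mnemonics are essentially ordered checklists, they can be represented as simple data structures. The sketch below models FOR-DEC as a sequence of prompts worked through in order; it is a minimal illustration only, and the prompt wording, the ForDecWorksheet class, and the example answers are hypothetical rather than part of any airline's actual procedure.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of FOR-DEC (Facts, Options, Risks and benefits,
# Decide, Execute, Check) as an ordered checklist; the prompt wording is
# illustrative only, not an airline procedure.
FOR_DEC_STEPS = [
    ("Facts", "What is the situation? What do we know for certain?"),
    ("Options", "Which courses of action are available?"),
    ("Risks and benefits", "What are the risks and benefits of each option?"),
    ("Decide", "Which option do we choose?"),
    ("Execute", "Carry out the chosen option and assign tasks."),
    ("Check", "Is the outcome as expected, or do we need to start over?"),
]


@dataclass
class ForDecWorksheet:
    """Records a crew's answers for one pass through FOR-DEC."""
    answers: dict = field(default_factory=dict)

    def work_through(self, answer_for_step):
        """Walk the steps in order; answer_for_step supplies each answer.

        The hyphen in FOR-DEC marks the pause between gathering
        information (FOR) and acting on it (DEC).
        """
        for name, prompt in FOR_DEC_STEPS:
            self.answers[name] = answer_for_step(name, prompt)
        return self.answers


# Example pass with canned answers for a hypothetical engine-fire scenario.
canned = {
    "Facts": "Engine 2 fire warning; fire persists after the drill.",
    "Options": "Divert to the nearest suitable airport, continue, or hold.",
    "Risks and benefits": "Diversion minimises exposure; continuing is unacceptable.",
    "Decide": "Divert to the nearest suitable airport.",
    "Execute": "PF flies and navigates; PM runs checklists and communicates.",
    "Check": "Fire extinguished and diversion under way; no need to loop again.",
}
worksheet = ForDecWorksheet()
for step, answer in worksheet.work_through(lambda name, prompt: canned[name]).items():
    print(f"{step}: {answer}")
```

T-DODAR, DECIDE, or GRADE could be modelled the same way by swapping in a different list of steps.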

Advantages of these techniques include that they force the crew to name the facts; they prevent jumping to conclusions; they give co-pilots a means to make their voice heard; they allow both pilots to participate in the decision-making process; and they enable the captain to withdraw an incorrect decision without losing leadership authority. Disadvantages include that they can be an obstacle to quick and obvious actions; [10] that they can be used as a tool for justification rather than decision making; and that they do not provide a way to communicate hard-to-articulate knowledge such as intuitions and "gut feelings". It is important that the technique used is standardised across an airline, so that everyone is speaking the same language, and that it does not become an obstacle to solving problems. [14]

SHOR (Stimuli, Hypotheses, Options, Response) can be used in time-pressured situations. [14]

NITS (Nature, Intentions, Time, Special Instructions) can be used to brief during an emergency, for example to brief the cabin crew. [25] [26] [13]

Difficulties

Fatigue

A study was conducted on how fatigue affects pilots' decision-making processes.

Fatigue poses a significant issue in the aviation industry as demand for long-haul operations increases. Fatigue is especially detrimental to decision-making tasks, awareness-related tasks, and planning, which are fundamental skills pilots need to operate their aircraft. The situation is especially dangerous because 26% of pilots deny that fatigue affects them. Official statistics attribute 4% to 8% of aviation accidents to fatigue. [27] However, because fatigue lowers pilot performance and impairs the decision-making process, it contributes to a much larger share of aviation accidents. The effects of fatigue are amplified by changes in time zones, as jet lag disrupts the body's circadian rhythm.

Pressure

During the flight, pilots are required to meet specific departure and arrival times, as the inability to do so results in increased fuel costs, gate-delay fees, and delayed flights for the company. These factors place pilots in a situation where their job performance directly correlates with the revenue of their employer. This leads to high levels of stress and pressure, which impair performance. [32]

Significant difficulties arise during the phases associated with take-off and landing. The maneuvering, approach, and landing phases combined account for only 17% of average flight time but are responsible for 70.2% of total aviation accidents. [33] The statistics show a significantly larger number of accidents during the phases in which pilots are stressed and under pressure, and it is in these phases that pilot decision-making can be critical. For example, the pilots of Asiana Airlines Flight 214 were under pressure and fatigued when they failed to initiate a timely go-around after the aircraft became low and slow on final approach.
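
As a rough illustration of what these figures imply, and under the simplifying assumption (not made in the cited source) that accident exposure is proportional to the share of flight time spent in each group of phases, the accident rate per unit of flight time in these phases is roughly an order of magnitude higher than in the rest of the flight:

```python
# Back-of-the-envelope comparison based on the figures cited above.
# Assumption (not from the source): accident exposure is proportional to
# the share of flight time spent in each group of phases.

time_share = 0.17        # maneuvering, approach and landing: share of flight time
accident_share = 0.702   # share of total aviation accidents in those phases

rate_in_phases = accident_share / time_share              # accidents per unit time
rate_elsewhere = (1 - accident_share) / (1 - time_share)

print(f"Relative accident rate: {rate_in_phases / rate_elsewhere:.1f}x")
# Prints roughly 11.5x, i.e. about an order of magnitude more accidents
# per unit of flight time during these high-workload phases.
```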

Automation bias

With advances in aviation technology, pilots can fall into automation bias.

Advances in technology have enabled tasks too complex for humans to perform unaided and have extended human capabilities. Automation such as GPS, traffic alerting, and autopilot has been incorporated into aviation and has become one of the prime resources for critical decision making. Because current technology is sophisticated and generally accurate, humans have come to rely on it excessively, which results in automation bias. In an experiment reported in the International Journal of Human-Computer Studies, the effects of automation bias on decision making were measured. Two groups were asked to monitor a specific task: the first group had access to a reliable automation aid, while the second group had no aid. The results showed that the second group, working in the non-automated setting, outperformed their counterparts. The first group made more errors when not explicitly prompted by the automation; moreover, they followed the automation's instructions even when these contradicted their own judgment. This experiment illustrates automation bias and the participants' high degree of deference to automation. [34] Automation bias is one of the difficulties of today's digital age and can lead to critical errors in pilot decision making.

Weather decision

VFR pilots flying into IFR conditions have a high accident rate.

For pilots flying under visual flight rules (VFR, in weather conditions clear enough to allow the pilot to see where the aircraft is going), correct weather-related decision making is essential, as they must stay within specific VFR weather requirements. The pilot must make a 'go' or 'no-go' decision about whether to embark on a flight, and whether to continue the flight if the weather deteriorates.

VFR pilots navigate primarily by using GPS, radio navigation systems, and, most importantly, pilotage. To perform pilotage, pilots must visually identify ground features and reference them to a map. Accidents become far more likely when weather conditions require pilots to fly primarily by reference to flight instruments without proper instrument flight rules (IFR) equipment. In fact, over 19% of general aviation crashes are caused by flying VFR into bad weather, and 72% of these crashes are fatal. [35]

Research conducted by David O'Hare and Tracy Smitheram on pilots' decision-making in deteriorating conditions demonstrates the application of behavioral decision theory to pilots. The experiment was conducted in a simulator in which VFR pilots were presented with scenarios of cross-country flights in marginal weather. Participants were assessed on how framing the situation in terms of anticipated gains or losses affected their decision-making process. The results showed that pilots who viewed the decision in a gains frame were significantly less likely to press on into deteriorating weather than those who viewed it in a losses frame. [36] This is consistent with the finding that people are risk-averse when situations are framed in terms of gains. To make the correct decision, it is important to compare the marginal benefit of pressing on into deteriorating weather against the risk associated with the flight.

Commercial pilots and their airlines also have to contend with company expectations in their weather-related decision making. Commercial aircraft are more capable in harsh weather, but the stakes are significantly higher because of passenger safety requirements and the sheer cost of the aircraft. Each airline has a different tolerance for weather, which poses problems for airlines with more lenient protocols. Pilots are under pressure when deciding whether to cancel a flight, since a cancellation can lead to a loss of reputation and revenue for the company. [37] [38] [39]

Emergencies

When pilots encounter an emergency, they reference a checklist to follow a specific procedure for overcoming the situation. However, not every part of an emergency checklist explicitly states the qualitative judgments a pilot needs to make. For example, in a forced landing the pilot is required to choose a field to commit to for landing, a decision that must take into account wind, field quality, obstacles, distance, proximity to civilization, and other associated factors. The decision-making process is important because pilots must weigh and compare the risks associated with each option. Four key conditions are required for an effective emergency decision.

If any of these conditions is absent, defensive avoidance or hypervigilance becomes prevalent and degrades the decision-making process. This theoretical model, developed from psychological research, provides a basis for pilots confronting an emergency situation. [40]
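
The kind of risk comparison described in the forced-landing example above can be illustrated with a simple weighted scoring of candidate landing sites. The sketch below is purely illustrative: the factors follow those named above, but the weights, scores, and candidate fields are hypothetical and are not drawn from any official procedure or the cited research.

```python
# Hypothetical sketch of comparing forced-landing options by weighting the
# factors named above (wind, field quality, obstacles, distance, proximity
# to civilization). Weights and scores are illustrative only.

FACTOR_WEIGHTS = {
    "wind": 0.25,           # headwind component and gusts on the chosen field
    "field_quality": 0.30,  # surface, length, slope
    "obstacles": 0.25,      # trees, wires, buildings on the approach path
    "distance": 0.10,       # gliding distance remaining to reach the field
    "civilization": 0.10,   # nearby help after landing vs. people on the ground
}


def score_option(scores: dict) -> float:
    """Combine per-factor scores (0 = worst, 1 = best) into a weighted total."""
    return sum(FACTOR_WEIGHTS[factor] * value for factor, value in scores.items())


candidate_fields = {
    "ploughed field ahead": {"wind": 0.9, "field_quality": 0.6, "obstacles": 0.8,
                             "distance": 1.0, "civilization": 0.4},
    "road to the left": {"wind": 0.5, "field_quality": 0.9, "obstacles": 0.3,
                         "distance": 0.7, "civilization": 0.9},
}

for name, scores in candidate_fields.items():
    print(f"{name}: {score_option(scores):.2f}")
best = max(candidate_fields, key=lambda name: score_option(candidate_fields[name]))
print(f"Highest-scoring option under these illustrative weights: {best}")
```

In reality this comparison is made mentally and under severe time pressure; the sketch only makes the trade-off explicit.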

Related Research Articles

United Airlines Flight 232: 1989 aviation accident

United Airlines Flight 232 was a regularly scheduled United Airlines flight from Stapleton International Airport in Denver to O'Hare International Airport in Chicago, continuing to Philadelphia International Airport. On July 19, 1989, the DC-10 serving the flight crash-landed at Sioux Gateway Airport in Sioux City, Iowa, after suffering a catastrophic failure of its tail-mounted engine due to an unnoticed manufacturing defect in the engine's fan disk, which resulted in the loss of many flight controls. Of the 296 passengers and crew on board, 112 died during the accident, while 184 people survived. 13 of the passengers were uninjured. It was the deadliest single-aircraft accident in the history of United Airlines.

Aloha Airlines Flight 243: 1988 Hawaii aviation incident

Aloha Airlines Flight 243 was a scheduled Aloha Airlines flight between Hilo and Honolulu in Hawaii. On April 28, 1988, a Boeing 737-297 serving the flight suffered extensive damage after an explosive decompression in flight, caused by part of the fuselage breaking due to poor maintenance and metal fatigue. The plane was able to land safely at Kahului Airport on Maui. The one fatality, flight attendant Clarabelle "C.B." Lansing, was ejected from the airplane. Another 65 passengers and crew were injured. The substantial damage inflicted by the decompression, the loss of one cabin crew member, and the safe landing of the aircraft established the incident as a significant event in the history of aviation, with far-reaching effects on aviation safety policies and procedures.

Aviation safety: State in which risks associated with aviation are at an acceptable level

Aviation safety is the study and practice of managing risks in aviation. This includes preventing aviation accidents and incidents through research, educating air travel personnel, passengers and the general public, as well as the design of aircraft and aviation infrastructure. The aviation industry is subject to significant regulation and oversight.

Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote "The Human Factor in Aircraft Accidents" (1969). Despite the considerable development of electronic aids since then, many of the principles he developed continue to prove effective.

Pilot error: Decision, action or inaction by a pilot of an aircraft

Pilot error generally refers to an accident in which an action or decision made by the pilot was the cause or a contributing factor that led to the accident, but also includes the pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines the term "accident" as "an occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashing.

Single-pilot resource management (SRM) is defined as the art and science of managing all the resources available to a single-pilot to ensure that the successful outcome of the flight is never in doubt. SRM includes the concepts of Aeronautical Decision Making (ADM), Risk Management (RM), Task Management (TM), Automation Management (AM), Controlled Flight Into Terrain (CFIT) Awareness, and Situational Awareness (SA). SRM training helps the pilot maintain situational awareness by managing the automation and associated aircraft control and navigation tasks. This enables the pilot to accurately assess and manage risk and make accurate and timely decisions.

Texas International Airlines Flight 655: 1973 plane crash in Arkansas, United States

Texas International Airlines Flight 655, registration N94230, was a Convair 600 turboprop aircraft en route from El Dorado to Texarkana, Arkansas, that crashed into Black Fork Mountain, Arkansas, on the night of September 27, 1973. The eight passengers and three crewmembers on board were killed.

Continental Charters Flight 44-2: 1951 aviation accident

Continental Charters Flight 44-2, a domestic non-scheduled passenger flight from Miami, Florida, to Buffalo, New York, crashed on December 29, 1951, near Napoli, New York. The twin-engine C-46 Commando, registration N3944C, crashed at approximately 10:25 pm in adverse weather conditions. Of the four crew and 36 passengers on board, three crew members and 23 passengers perished. The flight crew's poor judgment in attempting a flight by visual reference during instrument weather conditions was the cause of the accident.

In aeronautics, loss of control (LOC) is the unintended departure of an aircraft from controlled flight and is a significant factor in several aviation accidents worldwide. In 2015 it was the leading cause of general aviation accidents. Loss of control may be the result of mechanical failure, external disturbances, aircraft upset conditions, or inappropriate crew actions or responses.

Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information made without automation, even if it is correct. Automation bias stems from the social psychology literature that found a bias in human-human interaction that showed that people assign more positive evaluations to decisions made by humans than to a neutral object. The same type of positivity bias has been found for human-automation interaction, where the automated decisions are rated more positively than neutral. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids to mostly factor out possible human error. Errors of automation bias tend to occur when decision-making is dependent on computers or other automated aids and the human is in an observatory role but able to make decisions. Examples of automation bias range from urgent matters like flying a plane on automatic pilot to such mundane matters as the use of spell-checking programs.

TransAsia Airways Flight 222: 2014 passenger plane crash in Huxi, Penghu, Taiwan

TransAsia Airways Flight 222 was a scheduled domestic passenger flight operated by TransAsia Airways from Kaohsiung, Taiwan, to Magong, Penghu Island. On 23 July 2014, the ATR 72-500 twin turboprop operating the route crashed into buildings during approach to land in bad weather at Magong Airport. Among the 58 people on board, only 10 survived.

Impact of culture on aviation safety

Culture can affect aviation safety through its effect on how the flight crew deals with difficult situations; cultures with lower power distances and higher levels of individuality can result in better aviation safety outcomes. In higher power cultures subordinates are less likely to question their superiors. The crash of Korean Air Flight 801 in 1997 was attributed to the pilot's decision to land despite the junior officer's disagreement, while the crash of Avianca Flight 052 was caused by the failure to communicate critical low-fuel data between pilots and controllers, and by the failure of the controllers to ask the pilots if they were declaring an emergency and assist the pilots in landing the aircraft. The crashes have been blamed on aspects of the national cultures of the crews.

Pilot fatigue: Reduced pilot performance from inadequate energy

The International Civil Aviation Organization (ICAO) defines fatigue as "A physiological state of reduced mental or physical performance capability resulting from sleep loss or extended wakefulness, circadian phase, or workload." The phenomenon places great risk on the crew and passengers of an airplane because it significantly increases the chance of pilot error. Fatigue is particularly prevalent among pilots because of "unpredictable work hours, long duty periods, circadian disruption, and insufficient sleep". These factors can occur together to produce a combination of sleep deprivation, circadian rhythm effects, and 'time-on task' fatigue. Regulators attempt to mitigate fatigue by limiting the number of hours pilots are allowed to fly over varying periods of time.

Stress in the aviation industry: Pilots' wellbeing while working

Stress in the aviation industry is a common phenomenon composed of three sources: physiological stressors, psychological stressors, and environmental stressors. Professional pilots can experience stress in flight, on the ground during work-related activities, and during personal time because of the influence of their occupation. Being an airline pilot can be extremely stressful due to the workload, the responsibilities, and the safety of the thousands of passengers they transport around the world. Chronic levels of stress can negatively impact one's health, job performance and cognitive functioning. Being exposed to stress does not always negatively influence humans, because it can motivate people to improve and help them adapt to a new environment. Unfortunate accidents start to occur when a pilot is under excessive stress, as it dramatically affects his or her physical, emotional, and mental conditions. Stress "jeopardizes decision-making relevance and cognitive functioning" and it is a prominent cause of pilot error. Being a pilot is considered a unique job that requires managing high workloads and good psychological and physical health. Compared with other professions, pilots are considered to be highly affected by stress. One study states that 70% of surgeons agreed that stress and fatigue don't impact their performance level, while only 26% of pilots denied that stress influences their performance. Pilots themselves realize how powerful stress can be, and yet many accidents and incidents have occurred and continue to occur, such as Asiana Airlines Flight 214, American Airlines Flight 1420, and the Polish Air Force Tu-154 crash.

NOTECHS is a system used to assess the non-technical skills of crew members in the aviation industry. Introduced in the late 1990s, the system has been widely used by airlines during the crew selection process, identifying individuals who possess capable skills that are not directly related to aircraft controls or systems. In aviation, 70 percent of all accidents are attributed to pilot error, with lack of communication and poor decision making being two contributing factors. NOTECHS assesses and provides feedback on the performance of pilots' social and cognitive skills to help minimize pilot error and enhance safety in the future. The NOTECHS system also aims to improve the Crew Resource Management training system.

Scandinavian Airlines System Flight 901: 1984 aviation accident

Scandinavian Airlines System Flight 901 was a scheduled international flight operated by Scandinavian Airlines System that overran the runway at its destination, John F. Kennedy International Airport, on February 28, 1984. The flight, using a McDonnell Douglas DC-10, originated at Stockholm Arlanda Airport, Sweden, with a stopover at Oslo Airport, Gardermoen, Norway. All 177 passengers and crew members on board survived, although 12 were injured. The runway overrun was due to the crew's failure to monitor their airspeed and overreliance on the aircraft's autothrottle.

Continued VFR into IMC is when an aircraft operating under visual flight rules intentionally or unintentionally enters into instrument meteorological conditions. Flying an aircraft without visual reference to the ground can lead to a phenomenon known as spatial disorientation, which can cause the pilot to misperceive the angle, altitude, and speed they are traveling. This is considered a very serious safety hazard in general aviation. According to AOPA’s Nall Report, approximately 4% of general aviation accidents are weather related, yet these accidents account for more than 25% of all fatalities.

Chicago Helicopter Airways Flight 698

Chicago Helicopter Airways Flight 698 was a scheduled domestic helicopter service between Chicago Midway Airport and Chicago O'Hare Airport. On 27 July 1960 it was operated by a Sikorsky S-58C helicopter which departed Chicago Midway Airport with two pilots and 11 passengers. It crashed at Forest Home Cemetery, Forest Park, Illinois, killing all on board.

In-flight crew relief is a term used in commercial aviation when referring to the members of an aircrew intended to temporarily relieve active crew members of their duties during the course of a flight. The term and its role are almost exclusively applied to the secondary pilots of an aircrew, commonly referred to as relief pilots, who relieve the primary and active captain and/or first officer (co-pilot) in command of an aircraft to provide prolonged breaks for rest or sleep opportunities.

LATAM Airlines Perú Flight 2213: 2022 aviation accident

LATAM Airlines Perú Flight 2213 was a scheduled domestic passenger flight in Peru from Lima to Juliaca. On 18 November 2022, the Airbus A320neo was taking off from Jorge Chávez International Airport when it collided with a fire engine that was crossing the runway, killing two firefighters and injuring a third, who died of his injuries seven months later. 40 passengers were injured. The aircraft was damaged beyond repair and written off, making it the first hull loss of the Airbus A320neo family.

References

  1. "Pilot Decision Making — PDM - TP 13897". Transport Canada. Retrieved 23 June 2022.
  2. "Chapter 2: Aeronautical Decision-Making". Pilot's Handbook of Aeronautical Knowledge. Federal Aviation Administration. 2016. Retrieved 23 June 2022.
  3. Bowman, Terry (1994). "Aeronautical Decision-Making and University Aviation Association Certified Flight Instructors".
  4. Schriver, Angela T.; Morrow, Daniel G.; Wickens, Christopher D.; Talleur, Donald A. (2008-12-01). "Expertise Differences in Attentional Strategies Related to Pilot Decision Making". Human Factors: The Journal of the Human Factors and Ergonomics Society. 50 (6): 864–878. doi:10.1518/001872008X374974. ISSN   0018-7208. PMID   19292010. S2CID   6513349.
  5. "Ethical decision-making and the code of ethics of the Canadian Psychological Association". APA PsycNET. Retrieved 2015-10-31.
  6. Kaempf, George L.; Klein, Gary (2017). "Aeronautical Decision Making: The next generation". Aviation Psychology in Practice. pp. 223–254. doi:10.4324/9781351218825-11. ISBN   9781351218825 . Retrieved 24 June 2022.
  7. "Chapter 2: Aeronautical Decision-Making". Pilot's Handbook of Aeronautical Knowledge (FAA-H-8083-25C ed.). Federal Aviation Administration. 2023-07-17. pp. 7–8.
  8. Parry, David (2015). "Human Factor and Pilot Decision-making".
  9. "Setting Priorities - Aviate, Navigate, Communicate". iflyamerica.org. Retrieved 20 April 2023.
  10. "Flight-crew human factors handbook CAP 737" (PDF). UK CAA. Retrieved 23 June 2022.
  11. Faraz (11 December 2017). "DODAR - A breakdown for Aviators". @FlightCopilot. Retrieved 23 June 2022.
  12. "DODAR - IVAO - International Virtual Aviation Organisation". mediawiki.ivao.aero. Retrieved 23 June 2022.
  13. "CRM tools" (PDF). pmFlight Training. Retrieved 23 June 2022.
  14. Soll, Henning; Proske, Solveig; Hofinger, Gesine; Steinhardt, Gunnar (1 September 2016). "Decision-Making Tools for Aeronautical Teams: FOR-DEC and Beyond" (PDF). Aviation Psychology and Applied Human Factors. 6 (2): 101–112. doi:10.1027/2192-0923/a000099. Retrieved 23 June 2022.
  15. Li, Wen-Chin; Li, Lun-Wen; Harris, Don; Hsu, Yueh-Ling (1 June 2014). "The Application of Aeronautical Decision-making Support Systems for Improving Pilots' Performance in Flight Operations". Journal of Aeronautics, Astronautics and Aviation. 46 (2). doi:10.6125/14-0324-789. Retrieved 1 August 2022.
  16. Li, Wen-Chin; Harris, Don (December 2005). "Aeronautical decision making: instructor-pilot evaluation of five mnemonic methods". Aviation, Space, and Environmental Medicine. 76 (12): 1156–1161. PMID 16370266. Retrieved 1 August 2022.
  17. "FOR-DEC". SKYbrary Aviation Safety. 27 May 2021. Retrieved 23 June 2022.
  18. Martinussen, Monica; Hunter, David R. (12 July 2017). Aviation Psychology and Human Factors. CRC Press. p. 35. ISBN   978-1-351-64901-8 . Retrieved 23 June 2022.
  19. Murray, Stephen R. (1 January 1997). "Deliberate Decision Making by Aircraft Pilots: A Simple Reminder to Avoid Decision Making Under Panic". The International Journal of Aviation Psychology. 7 (1): 83–100. doi:10.1207/s15327108ijap0701_5. ISSN   1050-8414 . Retrieved 1 August 2022.
  20. MacLeod, Norman (5 May 2021). Crew Resource Management Training: A Competence-based Approach for Airline Pilots. CRC Press. ISBN   978-1-000-37668-5 . Retrieved 27 July 2022.
  21. "Chapter 2: Aeronautical Decision-Making". Pilot's Handbook of Aeronautical Knowledge (FAA-H-8083-25C ed.). Federal Aviation Administration. 2023-07-17. p. 15.
  22. Smejkal, Petr. The Command Handbook: A Practical Guide through Command Upgrade and Beyond. 737 Publishing s.r.o. p. 40. Retrieved 27 July 2022.
  23. "How do Pilots Make Decisions?". FlightDeckFriend.com. Retrieved 23 June 2022.
  24. Roth, Wolff-Michael (20 July 2017). Cognition, Assessment and Debriefing in Aviation. CRC Press. ISBN   978-1-351-80702-9 . Retrieved 1 August 2022.
  25. "Chirp Cabin Crew Feedback" (PDF). CHIRP Confidential Reporting Programme (48). March 2013. Retrieved 23 June 2022.
  26. "What is a NITS Briefing - What is the NITS Briefing? - BizJet | Business Jet | Flight Safety Equipment | Fire Fighting | Emergency Evacuation". bizjetsafetyequipment.com. Retrieved 23 June 2022.
  27. Caldwell, John (2004). Travel Medicine and Infectious Disease. Elsevier. pp. 85–96.
  28. National Transportation Safety Board. 2000. Controlled Flight Into Terrain, Korean Air Flight 801, Boeing 747-300, HL7468, Nimitz Hill, Guam, August 6, 1997. Aircraft Accident Report NTSB/AAR-00/01. Washington, DC.
  29. "Aviation accident report" (PDF). Collision with Trees and Crash Short of Runway, Corporate Airlines Flight 5966. 2006.
  30. National transportation safety board (February 12, 2009). "loss of control on Approach Colgan Air" (PDF). Accident Report.
  31. "Accident to Air India Express at Mangalore" (PDF). October 31, 2010. Archived from the original (PDF) on May 16, 2018.
  32. "Stress and job satisfaction among air force military pilots" (2007). Tehran, Iran.
  33. "Aeronautical Decision-Making" (PDF). FAA. 2013. Archived from the original (PDF) on 2015-12-08.
  34. Skitka, Linda (1999). "Does automation bias decision-making?". Elsevier.
  35. Sexton, J. Bryan; Thomas, Eric J.; Helmreich, Robert L. (2000-03-18). "Error, stress, and teamwork in medicine and aviation: cross sectional surveys". BMJ. 320 (7237): 745–749. doi:10.1136/bmj.320.7237.745. ISSN   0959-8138. PMC   27316 . PMID   10720356.
  36. O'Hare, David (13 Nov 2009). "'Pressing On' Into Deteriorating Conditions: An Application of Behavioral Decision Theory to Pilot Decision Making". Aviation Psychology. 5 (4): 351–370. doi:10.1207/s15327108ijap0504_2.
  37. CiteSeerX 10.1.1.434.3878.
  38. "Expertise in aeronautical weather-related decision making: A cross-sectional analysis of general aviation pilots". APA PsycNET. Retrieved 2015-10-31.
  39. Skitka, Linda J.; Mosier, Kathleen L.; Burdick, Mark (1999-11-01). "Does automation bias decision-making?". International Journal of Human-Computer Studies. 51 (5): 991–1006. doi:10.1006/ijhc.1999.0252. S2CID 1863226.
  40. Janis, Irving (1977). "Emergency decision making". Journal of Human Stress. 3 (2): 35–48. doi:10.1080/0097840X.1977.9936085. PMID 864252.