Aviation accident analysis

Aviation accident analysis is performed to determine the cause of errors once an accident has occurred. In the modern aviation industry, it is also used to analyze databases of past accidents in order to prevent future accidents. Many models are used not only for accident investigation but also for educational purposes. [1]

Per the Convention on International Civil Aviation, if an aircraft of a contracting State has an accident or incident in another contracting State, the State where the accident occurs will institute an inquiry. The Convention defines the rights and responsibilities of the states.

ICAO Annex 13—Aircraft Accident and Incident Investigation—defines which States may participate in an investigation, for example: the States of Occurrence, Registry, Operator, Design and Manufacture. [2]

Human factors

In the aviation industry, human error is the major cause of accidents. About 38% of 329 major airline crashes, 74% of 1,627 commuter/air taxi crashes, and 85% of 27,935 general aviation crashes were related to pilot error. [3] The Swiss cheese model is an accident causation model that analyzes accidents primarily from the human factors perspective. [4] [5]

Reason's model

Reason's Swiss cheese model of accident causation

Reason's model, commonly referred to as the Swiss cheese model, is based on Reason's view that all parts of an organization must work together to ensure a safe and efficient operation. [1] From the pilot's perspective, maintaining a safe flight operation requires all human and mechanical elements in the system to cooperate effectively. In Reason's model, the holes represent weaknesses or failures. A hole does not lead to an accident directly, because the other defense layers still block the hazard. However, once the holes in all the layers line up, an accident occurs. [6]

There are four layers in this model: organizational influences, unsafe supervision, preconditions for unsafe acts, and unsafe acts.
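This alignment idea can be sketched with a small toy simulation (hypothetical Python code; the layer names follow the list above, but the hole positions, widths, and counts are invented for illustration). Each layer is modeled as a slice with a few randomly placed holes, and a hazard becomes an accident only when it finds a hole in every slice.

```python
import random

# Toy sketch of the Swiss cheese model (illustrative only, not Reason's formal model).
# Each defense layer is a "slice" with a few randomly placed holes, represented as
# (start, width) intervals on the range [0, 1). A hazard at position x leads to an
# accident only if it finds a hole in every layer, i.e. the holes line up.

LAYER_NAMES = [
    "organizational influences",
    "unsafe supervision",
    "preconditions for unsafe acts",
    "unsafe acts",
]

def random_layer(n_holes: int = 3, max_width: float = 0.05) -> list[tuple[float, float]]:
    """One slice: n_holes holes with random positions and widths (the weaknesses)."""
    return [(random.random(), random.uniform(0.0, max_width)) for _ in range(n_holes)]

def finds_hole(layer: list[tuple[float, float]], x: float) -> bool:
    """True if a hazard at position x passes through a hole in this layer."""
    return any(start <= x < start + width for start, width in layer)

def accident(layers: list[list[tuple[float, float]]], x: float) -> bool:
    """An accident requires the hazard to pass through every defense layer."""
    return all(finds_hole(layer, x) for layer in layers)

random.seed(0)
layers = [random_layer() for _ in LAYER_NAMES]
hazards = [random.random() for _ in range(100_000)]
rate = sum(accident(layers, x) for x in hazards) / len(hazards)
print(f"Fraction of hazards penetrating all four layers: {rate:.5f}")
```

With only a few narrow holes per layer, almost every hazard is stopped by at least one layer; the rare accident corresponds to the unlikely case in which all four layers happen to be weak at the same point.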

Investigation using Reason's model

Based on Reason's model, accident investigators analyze the accident across all four layers to determine its causes. Investigators focus on two main types of failure: active failures and latent failures. [9]

In order to fully understand the cause of the accident, all four layers need to be examined. Because the investigation runs in the opposite direction to the causation of the accident, investigators work backward through Reason's model, starting from the unsafe acts and tracing them back to the organizational influences.
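A hypothetical sketch of this backward pass is shown below; the findings, their wording, and the data structure are invented for illustration and are not drawn from any official investigation procedure.

```python
from dataclasses import dataclass

# Hypothetical bookkeeping for an investigation structured around Reason's model.
# Each finding is tagged with its layer and with whether it is an active failure
# (an unsafe act with an immediate effect) or a latent failure (a dormant
# condition introduced further back in the organization).

LAYER_ORDER = [  # Reason's layers, ordered from the accident backwards
    "unsafe acts",
    "preconditions for unsafe acts",
    "unsafe supervision",
    "organizational influences",
]

@dataclass
class Finding:
    layer: str
    description: str
    failure_type: str  # "active" or "latent"

findings = [
    Finding("organizational influences", "training budget cut two years earlier", "latent"),
    Finding("unsafe acts", "descent continued below the decision height", "active"),
    Finding("unsafe supervision", "two low-experience pilots paired on the same crew", "latent"),
    Finding("preconditions for unsafe acts", "crew fatigue after an extended duty day", "latent"),
]

# Walk backward through the model, starting from the unsafe acts at the sharp end.
for layer in LAYER_ORDER:
    for finding in (f for f in findings if f.layer == layer):
        print(f"[{finding.failure_type:>6}] {layer}: {finding.description}")
```

Ordering the findings this way keeps the distinction visible: the active failure sits at the unsafe-acts layer closest to the accident, while the latent failures accumulate in the layers behind it.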

Related Research Articles

Aviation accidents and incidents: Aviation occurrence involving serious injury, death, or destruction of aircraft

An aviation accident is defined by the Convention on International Civil Aviation Annex 13 as an occurrence associated with the operation of an aircraft, which takes place from the time any person boards the aircraft with the intention of flight until all such persons have disembarked, and in which (a) a person is fatally or seriously injured, (b) the aircraft sustains significant damage or structural failure, or (c) the aircraft goes missing or becomes completely inaccessible. Annex 13 defines an aviation incident as an occurrence, other than an accident, associated with the operation of an aircraft that affects or could affect the safety of operation.

In aviation, a controlled flight into terrain is an accident in which an airworthy aircraft, fully under pilot control, is unintentionally flown into the ground, a mountain, a body of water or an obstacle. In a typical CFIT scenario, the crew is unaware of the impending disaster until it is too late. The term was coined by engineers at Boeing in the late 1970s.

Aviation safety: State in which risks associated with aviation are at an acceptable level

Aviation safety is the study and practice of managing risks in aviation. This includes preventing aviation accidents and incidents through research, educating air travel personnel, passengers and the general public, as well as the design of aircraft and aviation infrastructure. The aviation industry is subject to significant regulation and oversight.

Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote "The Human Factor in Aircraft Accidents" (1969). Despite the considerable development of electronic aids since then, many of the principles he developed continue to prove effective.

Human reliability is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors such as age, state of mind, physical health, attitude, emotions, propensity for certain common mistakes, errors and cognitive biases, etc.

Pilot error: Decision, action or inaction by a pilot of an aircraft

Pilot error generally refers to an accident in which an action or decision by the pilot was the cause or a contributing factor, including the pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines the term "accident" as "an occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashing.

Gulf Air Flight 072: 2000 aviation accident

Gulf Air Flight 072 (GF072/GFA072) was a scheduled international passenger flight from Cairo International Airport in Egypt to Bahrain International Airport in Bahrain, operated by Gulf Air. On 23 August 2000 at 19:30 Arabia Standard Time (UTC+3), the Airbus A320 crashed minutes after executing a go-around following a failed attempt to land on Runway 12. The flight crew suffered spatial disorientation during the go-around, and the aircraft crashed into the shallow waters of the Persian Gulf 2 km (1 nmi) from the airport. All 143 people on board were killed.

A near miss, near death, near hit or close call is an unplanned event that has the potential to cause, but does not actually result in human injury, environmental or equipment damage, or an interruption to normal operation.

A system accident is an "unanticipated interaction of multiple failures" in a complex system. This complexity can be either technological or organizational, and is frequently both. A system accident can be easy to see in hindsight but extremely difficult to foresee, because there are simply too many action pathways to seriously consider all of them. Charles Perrow first developed these ideas in the mid-1980s. Safety systems themselves are sometimes the added complexity that leads to this type of accident.

Swiss cheese model: Model used in risk analysis

The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, emergency service organizations, and as the principle behind layered security, as used in computer security and defense in depth. It likens human systems to multiple slices of Swiss cheese, each with randomly placed and sized holes, stacked side by side, in which the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses "layered" behind each other. In theory, therefore, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist to prevent a single point of failure. The model was originally formally propounded by James T. Reason of the University of Manchester and has since gained widespread acceptance. It is sometimes called the "cumulative act effect".

In accident analysis, a chain of events consists of the contributing factors leading to an undesired outcome.

Human Factors Analysis and Classification System: Method to identify causes of accidents and analysis to plan preventive training

The Human Factors Analysis and Classification System (HFACS) identifies the human causes of an accident and offers tools for analysis as a way to plan preventive training. It was developed by Dr Scott Shappell of the Civil Aviation Medical Institute and Dr Doug Wiegmann of the University of Illinois at Urbana-Champaign in response to a trend that showed some form of human error was a primary causal factor in 80% of all flight accidents in the Navy and Marine Corps.

Latent human error is a term used in safety work and accident prevention, especially in aviation, to describe human errors that are made more likely by systems or routines formed in such a way that humans are disposed to making these errors. Latent human errors are frequently components in the causes of accidents. The error is latent and may not materialize immediately; thus, latent human error does not cause immediate or obvious damage. Discovering latent errors is therefore difficult and requires a systematic approach. Latent human error is often discussed in aviation incident investigation, and contributes to over 70% of accidents.

The healthcare error proliferation model is an adaptation of James Reason’s Swiss Cheese Model designed to illustrate the complexity inherent in the contemporary healthcare delivery system and the attribution of human error within these systems. The healthcare error proliferation model explains the etiology of error and the sequence of events typically leading to adverse outcomes. This model emphasizes the role organizational and external cultures contribute to error identification, prevention, mitigation, and defense construction.

Aviastar-TU Flight 1906: 2010 aviation accident

In aeronautics, loss of control (LOC) is the unintended departure of an aircraft from controlled flight, and is a significant factor in several aviation accidents worldwide. In 2015 it was the leading cause of general aviation accidents. Loss of control may be the result of mechanical failure, external disturbances, aircraft upset conditions, or inappropriate crew actions or responses.

Human factors are the physical or cognitive properties of individuals, or social behavior specific to humans, which influence the functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results.

Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. - M.A. Blumenberg, 1996

Maritime resource management (MRM) or bridge resource management (BRM) is a set of human factors and soft skills training aimed at the maritime industry. The MRM training programme was launched in 1993 – at that time under the name bridge resource management – and aims at preventing accidents at sea caused by human error.

2013 CHC Helicopters Eurocopter AS332 crash: An air accident at Sumburgh, 23 August 2013

On 23 August 2013, a Eurocopter AS332 Super Puma helicopter belonging to CHC Helicopters crashed into the sea 2 nautical miles from Sumburgh in the Shetland Islands, Scotland, while en route from the Borgsten Dolphin drilling rig. The accident killed four passengers; twelve other passengers and two crew were rescued with injuries. A further passenger killed himself in 2017 as a result of PTSD caused by the crash. An investigation by the UK's Air Accidents Investigation Branch concluded in 2016 that the accident was primarily caused by the pilots' failure to monitor the flight instruments during the approach; a public inquiry reached the same conclusion in October 2020.

Pilot fatigue: Reduced pilot performance from inadequate energy

The International Civil Aviation Organization (ICAO) defines fatigue as "A physiological state of reduced mental or physical performance capability resulting from sleep loss or extended wakefulness, circadian phase, or workload." The phenomenon places great risk on the crew and passengers of an airplane because it significantly increases the chance of pilot error. Fatigue is particularly prevalent among pilots because of "unpredictable work hours, long duty periods, circadian disruption, and insufficient sleep". These factors can occur together to produce a combination of sleep deprivation, circadian rhythm effects, and 'time-on-task' fatigue. Regulators attempt to mitigate fatigue by limiting the number of hours pilots are allowed to fly over varying periods of time.

References

  1. Wiegmann, Douglas A. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System (PDF). Ashgate Publishing Limited. Retrieved October 26, 2015.
  2. ICAO fact sheet: Accident investigation (PDF). ICAO. 2016.
  3. Li, Guohua (Feb 2001). "Factors associated with pilot error in aviation crashes". Aviation, Space, and Environmental Medicine. 72 (1): 52–58. PMID 11194994. Retrieved October 26, 2015.
  4. Salmon, P. M. (2012). "Systems-based analysis methods: a comparison of AcciMap, HFACS, and STAMP". Safety Science. 50: 1158–1170. doi:10.1016/j.ssci.2011.11.009. S2CID 205235420.
  5. Underwood, Peter; Waterson, Patrick (2014-07-01). "Systems thinking, the Swiss Cheese Model and accident analysis: A comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models". Accident Analysis & Prevention. Systems thinking in workplace safety and health. 68: 75–94. doi:10.1016/j.aap.2013.07.027. PMID 23973170.
  6. Roelen, A. L. C.; Lin, P. H.; Hale, A. R. (2011-01-01). "Accident models and organisational factors in air transport: The need for multi-method models". Safety Science. The gift of failure: New approaches to analyzing and learning from events and near-misses – Honoring the contributions of Bernhard Wilpert. 49 (1): 5–10. doi:10.1016/j.ssci.2010.01.022.
  7. Qureshi, Zahid H (Jan 2008). "A Review of Accident Modelling Approaches for Complex Critical Sociotechnical Systems". Defence Science and Technology Organisation.
  8. Mearns, Kathryn J.; Flin, Rhona (March 1999). "Assessing the state of organizational safety—culture or climate?". Current Psychology. 18 (1): 5–17. doi:10.1007/s12144-999-1013-3. S2CID 144275343.
  9. Nielsen, K.J.; Rasmussen, K.; Glasscock, D.; Spangenberg, S. (March 2008). "Changes in safety climate and accidents at two identical manufacturing plants". Safety Science. 46 (3): 440–449. doi:10.1016/j.ssci.2007.05.009.
  10. Reason, James (2000-03-18). "Human error: models and management". BMJ: British Medical Journal. 320 (7237): 768–770. doi:10.1136/bmj.320.7237.768. ISSN 0959-8138. PMC 1117770. PMID 10720363.
  11. "Aviation Accident Report AAR-10-01". www.ntsb.gov. Retrieved October 26, 2015.
  12. "A double tragedy: Colgan Air Flight 3407 – Air Facts Journal". Air Facts Journal. 2014-03-28. Retrieved October 26, 2015.
  13. Caldwell, John A. (2012-04-01). "Crew Schedules, Sleep Deprivation, and Aviation Performance". Current Directions in Psychological Science. 21 (2): 85–89. doi:10.1177/0963721411435842. ISSN 0963-7214. S2CID 146585084.
  14. Landrigan, L.C.; Wade, J.P.; Milewski, A.; Reagor, B. (2013-11-01). "Lessons from the past: Inaccurate credibility assessments made during crisis situations". 2013 IEEE International Conference on Technologies for Homeland Security (HST). pp. 754–759. doi:10.1109/THS.2013.6699098. ISBN 978-1-4799-1535-4. S2CID 16676097.
  15. Haddad, Ziad S.; Park, Kyung-Won (23 June 2010). "Vertical profiling of tropical precipitation using passive microwave observations and its implications regarding the crash of Air France 447". Journal of Geophysical Research. 115 (D12): D12129. Bibcode:2010JGRD..11512129H. doi:10.1029/2009JD013380.
  16. Reason, James (August 1995). "A systems approach to organizational error". Ergonomics. 38 (8): 1708–1721. doi:10.1080/00140139508925221.