The Human Factors Analysis and Classification System (HFACS) identifies the human causes of an accident and offers tools for analysis as a way to plan preventive training. [1] It was developed by Dr. Scott Shappell of the Civil Aerospace Medical Institute and Dr. Doug Wiegmann of the University of Illinois at Urbana-Champaign in response to a trend that showed some form of human error was a primary causal factor in 80% of all flight accidents in the Navy and Marine Corps. [1]
HFACS is based on the "Swiss cheese model" of human error, [2] which looks at four levels of human failure: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences. [1] It is a comprehensive human error framework that folded James Reason's ideas into an applied setting, defining 19 causal categories within the four levels of human failure. [3]
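The framework's structure lends itself to a simple taxonomy. The sketch below encodes the four levels with one common enumeration of the 19 categories; exact labels vary slightly between HFACS versions, so treat them as illustrative, and the `level_of` helper is hypothetical:

```python
# One common enumeration of the HFACS taxonomy: four levels of human
# failure, 19 causal categories in total. Labels vary by HFACS version.
HFACS = {
    "Unsafe Acts": [
        "Skill-Based Errors", "Decision Errors", "Perceptual Errors",
        "Routine Violations", "Exceptional Violations",
    ],
    "Preconditions for Unsafe Acts": [
        "Physical Environment", "Technological Environment",
        "Adverse Mental States", "Adverse Physiological States",
        "Physical/Mental Limitations",
        "Crew Resource Management", "Personal Readiness",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision", "Planned Inappropriate Operations",
        "Failure to Correct a Known Problem", "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management", "Organizational Climate",
        "Organizational Process",
    ],
}

def level_of(category: str) -> str:
    """Return the HFACS level a causal category belongs to."""
    for level, categories in HFACS.items():
        if category in categories:
            return level
    raise KeyError(f"Not an HFACS category: {category}")

assert sum(len(c) for c in HFACS.values()) == 19
print(level_of("Decision Errors"))  # -> "Unsafe Acts"
```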
In the field of science and engineering, root cause analysis (RCA) is a method of problem solving used to identify the root causes of faults or problems. It is widely used in IT operations, manufacturing, telecommunications, industrial process control, accident analysis, medicine, and the healthcare industry. Root cause analysis is a form of inductive and deductive inference.
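One widely used RCA technique is the "5 Whys", which chains the deductive step by repeatedly asking why an effect occurred until a cause is reached whose removal would prevent recurrence. A minimal sketch; the cause chain below is invented purely for illustration:

```python
# "5 Whys" sketch: walk a recorded cause chain until no deeper cause
# is known. The chain below is invented, not a real investigation.
cause_of = {
    "engine shut down in flight": "fuel starvation",
    "fuel starvation": "fuel gauge over-read actual quantity",
    "fuel gauge over-read actual quantity": "gauge not recalibrated after repair",
    "gauge not recalibrated after repair": "maintenance procedure omits calibration step",
}

def five_whys(effect: str, max_depth: int = 5) -> list[str]:
    """Follow 'why?' links from an observed effect toward a root cause."""
    chain = [effect]
    while effect in cause_of and len(chain) <= max_depth:
        effect = cause_of[effect]
        chain.append(effect)
    return chain

for step in five_whys("engine shut down in flight"):
    print("why? ->", step)
```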
In the field of human factors and ergonomics, human reliability is the probability that a human performs a task to a sufficient standard. Human reliability can be affected by many factors, such as age, physical health, mental state, attitude, emotions, a personal propensity for certain mistakes, and cognitive biases.
Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability is defined as the probability that a product, system, or service will perform its intended function adequately for a specified period of time, or will operate in a defined environment without failure. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.
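Under the common simplifying assumption of a constant failure rate λ, reliability over a mission time t is R(t) = exp(-λt), and steady-state availability follows from mean time between failures (MTBF) and mean time to repair (MTTR). A minimal sketch with invented figures:

```python
import math

def reliability(failure_rate: float, mission_time: float) -> float:
    """R(t) = exp(-lambda * t), assuming a constant failure rate."""
    return math.exp(-failure_rate * mission_time)

def availability(mtbf: float, mttr: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

# Invented figures for illustration: 1 failure per 10,000 hours,
# a 100-hour mission, and a 4-hour mean repair time.
print(f"R(100 h)     = {reliability(1e-4, 100):.4f}")   # ~0.9900
print(f"Availability = {availability(10_000, 4):.5f}")  # ~0.99960
```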
In aviation, pilot error generally refers to an action or decision made by a pilot that is a substantial contributing factor leading to an aviation accident. It also includes a pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes: the action is deliberate, but the result is not. The Chicago Convention defines the term "accident" as "an occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashing.
Safety culture is the element of organizational culture concerned with the maintenance of safety and compliance with safety standards. It is informed by the organization's leadership and by the beliefs, perceptions and values that employees share in relation to risks within the organization, workplace or community. Safety culture has been described in a variety of ways: notably, the National Academies of Sciences and the Association of Public and Land-grant Universities published summaries on the topic in 2014 and 2016, respectively.
Human error is an action that has been done but that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause and contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.
Accident analysis is a process carried out in order to determine the cause or causes of an accident so as to prevent further accidents of a similar kind. It is part of accident investigation or incident investigation. These analyses may be performed by a range of experts, including forensic scientists, forensic engineers or health and safety advisers. Accident investigators, particularly those in the aircraft industry, are colloquially known as "tin-kickers". Health and safety and patient safety professionals prefer the term "incident" to the term "accident". Its retrospective nature means that accident analysis is primarily an exercise in directed explanation, conducted using the theories or methods the analyst has to hand, which direct the way in which the events, aspects, or features of accident phenomena are highlighted and explained. These analyses are also invaluable in preventing future incidents: by determining root causes, they provide insight into the failures that led to the incident.
The Swiss cheese model of accident causation is a model used in risk analysis and risk management. It likens human systems to multiple slices of Swiss cheese stacked side by side, each slice with randomly placed and sized holes, in which the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses "layered" behind each other. In theory, therefore, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses exist to prevent a single point of failure.
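The model is qualitative, but a toy calculation shows why layering helps: if each defensive layer independently fails to stop a threat with some small probability, the chance that every layer fails at once is the product of those probabilities. The independence assumption and the figures below are illustrative only:

```python
from math import prod

def breach_probability(hole_probs: list[float]) -> float:
    """Probability a threat passes every layer, assuming the 'holes'
    (per-layer failure probabilities) are independent."""
    return prod(hole_probs)

# Four imperfect defenses, each missing the threat 5% of the time.
layers = [0.05, 0.05, 0.05, 0.05]
print(breach_probability(layers))  # 6.25e-06: far safer than any single layer
```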
The system safety concept calls for a risk management strategy based on the identification and analysis of hazards and the application of remedial controls using a systems-based approach. This differs from traditional safety strategies, which rely on controlling the conditions and causes of an accident based either on epidemiological analysis or on the investigation of individual past accidents. The concept of system safety is useful in demonstrating the adequacy of technologies when difficulties are faced with probabilistic risk analysis. The underlying principle is one of synergy: a whole is more than the sum of its parts. A systems-based approach to safety requires the application of scientific, technical and managerial skills to hazard identification, hazard analysis, and the elimination, control, or management of hazards throughout the life-cycle of a system, program, project, activity or product. HAZOP is one of several techniques available for the identification of hazards.
The Fire Fighter Near Miss Reporting System was launched on August 12, 2005 by the International Association of Fire Chiefs. It was announced at a press conference in Denver, Colorado, after a pilot program involving 38 fire departments across the country had been completed. The Near Miss Reporting System aims to prevent injuries and save the lives of firefighters by collecting, sharing and analyzing near-miss experiences. Reports are submitted voluntarily by firefighters and are confidential, non-punitive, and secure. Once compiled, the reports are posted to the website, where firefighters can access them and learn from each other's real-life experiences. Overall, these reports help to formulate strategies, reduce firefighter injuries and fatalities, and enhance the safety culture of the fire service. The program is based on the Aviation Safety Reporting System (ASRS), which has been gathering reports of close calls from pilots, flight attendants, and air traffic controllers since 1976. The reporting system is funded by the International Association of Fire Chiefs.
Single-pilot resource management (SRM) is defined as the art and science of managing all the resources available to a single pilot to ensure that the successful outcome of the flight is never in doubt. SRM includes the concepts of Aeronautical Decision Making (ADM), Risk Management (RM), Task Management (TM), Automation Management (AM), Controlled Flight Into Terrain (CFIT) Awareness, and Situational Awareness (SA). SRM training helps the pilot maintain situational awareness by managing the automation and the associated aircraft control and navigation tasks. This enables the pilot to assess and manage risk accurately and to make timely decisions.
A Technique for Human Event Analysis (ATHEANA) is a technique used in the field of human reliability assessment (HRA). The purpose of ATHEANA is to evaluate the probability of human error while performing a specific task. From such analyses, preventative measures can then be taken to reduce human errors within a system and therefore lead to improvements in the overall level of safety.
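ATHEANA itself proceeds through structured analyst worksheets, but its quantification step can be caricatured as summing, over the error-forcing contexts the analysts identify, the chance of each context arising times the chance of the unsafe action given that context. The sketch below is a generic HRA-style calculation under that reading, with invented numbers; it is not ATHEANA's actual procedure:

```python
# Generic HRA-style quantification sketch (invented numbers):
# P(human failure event) ~= sum over error-forcing contexts of
#   P(context) * P(unsafe action | context).
contexts = [
    # (P(context arises), P(unsafe action | context))
    (0.010, 0.30),   # e.g. misleading indication during high workload
    (0.002, 0.50),   # e.g. ambiguous procedure step under time pressure
]

hep = sum(p_ctx * p_err for p_ctx, p_err in contexts)
print(f"Estimated human error probability: {hep:.4f}")  # 0.0040
```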
The healthcare error proliferation model is an adaptation of James Reason's Swiss cheese model designed to illustrate the complexity inherent in the contemporary healthcare delivery system and the attribution of human error within these systems. The model explains the etiology of error and the sequence of events that typically lead to adverse outcomes, and it emphasizes the role organizational and external cultures play in error identification, prevention, mitigation, and defense construction.
Accident classification is a standardized method in accident analysis by which the causes of an accident, including the root causes, are grouped into categories. Accident classification is mainly used in aviation but can be extended to other areas, such as railroads or health care. While accident reports are very detailed, the goal of accident classification is to look at the broader picture. By analysing a multitude of accidents under the same standardized classification scheme, patterns in how accidents develop can be detected and correlations can be identified. The advantage of a standardized accident classification is that statistical methods can be used to gain more insight into accident causation.
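In practice, applying statistical methods can be as simple as tallying classified records to surface the dominant categories. A minimal sketch over invented records, using hypothetical HFACS-style tags:

```python
from collections import Counter

# Invented accident records, each tagged with standardized categories.
records = [
    {"id": "A-001", "categories": ["Skill-Based Errors", "Inadequate Supervision"]},
    {"id": "A-002", "categories": ["Decision Errors"]},
    {"id": "A-003", "categories": ["Skill-Based Errors", "Adverse Mental States"]},
]

# Count how often each category appears across the accident set.
tally = Counter(cat for rec in records for cat in rec["categories"])
for category, count in tally.most_common():
    print(f"{category}: {count}/{len(records)} accidents")
```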
The term use error has recently been introduced to replace the commonly used terms human error and user error. The new term, which has already been adopted by international standards organizations for medical devices, suggests that accidents should be attributed to the circumstances, rather than to the human beings who happened to be there.
Human factors are the physical or cognitive properties of individuals, or social behavior which is specific to humans, and which influence functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results. Human factors include both the non-technical skills that enhance safety and the non-technical factors that contribute to undesirable incidents that put the diver at risk.
[Safety is] An active, adaptive process which involves making sense of the task in the context of the environment to successfully achieve explicit and implied goals, with the expectation that no harm or damage will occur. – G. Lock, 2022
Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. – M.A. Blumenberg, 1996
Selection, training, cohesion and psychosocial adaptation influence performance and, as such, are relevant factors to consider while preparing for costly, long-duration spaceflight missions in which the performance objectives will be demanding, endurance will be tested and success will be critical.
The AcciMap approach is a systems-based technique for accident analysis, specifically for analysing the causes of accidents and incidents that occur in complex sociotechnical systems.
Aviation accident analysis is performed to determine the causes of errors once an accident has happened. In the modern aviation industry, it is also used to analyze databases of past accidents in order to prevent accidents from happening. Many models are used not only for accident investigation but also for educational purposes.
In aviation, the SHELL model is a conceptual model of human factors that helps to clarify the location and cause of human error within an aviation environment.
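SHELL stands for Software, Hardware, Environment and Liveware, with the central Liveware (the human) examined through its interface with each of the other components. A minimal sketch of that structure; the mismatch notes are invented examples:

```python
# SHELL: the central Liveware (human) is analysed via four interfaces.
# The example mismatches are invented illustrations.
shell_interfaces = {
    ("Liveware", "Software"):    "e.g. a checklist step that invites misreading",
    ("Liveware", "Hardware"):    "e.g. a control lever placed outside easy reach",
    ("Liveware", "Environment"): "e.g. cockpit noise masking an aural warning",
    ("Liveware", "Liveware"):    "e.g. unclear handover between crew members",
}

for (centre, component), mismatch in shell_interfaces.items():
    print(f"{centre}-{component} interface: {mismatch}")
```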