David D. Woods is an American safety systems researcher who studies human coordination and automation issues in a wide range of safety-critical fields, such as nuclear power, aviation, space operations, critical care medicine, and software services. He is one of the founding researchers of the fields of cognitive systems engineering [1] and resilience engineering. [2]
In 1974, Woods received his BA in psychology from Canisius College. In 1977, he received his MS in cognitive psychology from Purdue University, and in 1979 he received his PhD in cognitive psychology, also from Purdue, where he studied human perception and attention. [3]
From 1979 to 1988, Woods was a senior engineer at the Westinghouse Research and Development Center, [3] where he worked on improving control room equipment interfaces for power plants. [4] [1]
From 1988 onwards, he served on the faculty of The Ohio State University in the Department of Integrated Systems Engineering, where he is currently a professor emeritus. [1]
In 2017, Woods co-founded a consulting company, Adaptive Capacity Labs, with Richard Cook and John Allspaw. [5]
Woods is a past president of the Resilience Engineering Association (2011-2013) and of the Human Factors and Ergonomics Society (1998-1999). [6] He is a fellow of the Human Factors and Ergonomics Society. [7]
Woods is one of the founders of the field of resilience engineering. [2] One of his significant contributions is the theory of graceful extensibility. [14]
In the wake of the Three Mile Island accident, Woods and Erik Hollnagel proposed a new approach to thinking about human-computer interaction (HCI) in the domain of supervisory control, Cognitive Systems Engineering (CSE), [15] which focuses on the interaction between people, technological artifacts, and work. In this approach, a set of interacting human and software agents is viewed as a joint cognitive system, where the overall system itself is seen as performing cognitive tasks.
Woods proposed the theory of graceful extensibility to explain how some systems are able to continually adapt over time to face new challenges (sustained adaptability) while other systems fail to do so. [16]
This theory asserts that any complex adaptive system can be modeled as a composition of individual units, each with some ability to adapt its behavior and to communicate with other units. It is expressed as ten statements that Woods calls 'proto-theorems'.
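A minimal sketch of this unit-based view, in hypothetical Python (the class, thresholds, and load-shifting rule here are invented for illustration and are not part of Woods's ten proto-theorems):

# A hypothetical sketch: each unit has finite capacity, adapts by
# absorbing demand, and communicates with neighboring units to shift
# excess load as it nears saturation. All names and numbers are invented.

class AdaptiveUnit:
    SATURATION = 0.8  # fraction of capacity at which a unit seeks help (invented)

    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity   # finite capacity to handle demands
        self.demand = 0.0
        self.neighbors = []        # units this unit can communicate with

    def headroom(self):
        # Remaining capacity before this unit reaches its saturation point.
        return self.SATURATION * self.capacity - self.demand

    def absorb(self, load):
        # Adapt locally first: take on the new demand.
        self.demand += load
        if self.headroom() < 0:
            self.request_help()

    def request_help(self):
        # Communicate with neighbors to shift excess demand, extending
        # the unit's capacity beyond its own local limit.
        for other in self.neighbors:
            shift = min(max(other.headroom(), 0.0), -self.headroom())
            if shift > 0:
                self.demand -= shift
                other.demand += shift
            if self.headroom() >= 0:
                break

# Two units that back each other up form a very small joint system.
ops = AdaptiveUnit("ops", capacity=10)
backup = AdaptiveUnit("backup", capacity=10)
ops.neighbors.append(backup)
ops.absorb(9)                     # pushes "ops" past saturation; load shifts
print(ops.demand, backup.demand)  # 8.0 1.0

In this toy, the pair can absorb more demand than either unit alone, which is the intuition behind extending capacity gracefully rather than failing abruptly at a fixed limit.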
Woods proposed visual momentum as a measure of how easily a person performing a task can navigate to a new screen and integrate the information they see there. [17] [18] This work was motivated by the study of event-driven tasks, in which operators (e.g., pilots, space flight controllers, nuclear plant operators, physicians) must respond to events as they occur.
Woods argued that it is easy to get lost in such user interfaces. Effective operator interfaces should help the user figure out where to look next, and navigating a virtual space of information can be improved by leveraging what the human perceptual system is already optimized to do, such as pattern recognition.
Woods proposed a number of concepts for improving the design of such interfaces by increasing their visual momentum; a toy illustration of the underlying idea follows.
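As a rough intuition only (this is an invented toy in Python, not Woods's formal definition), visual momentum can be thought of as rising when successive displays share visible elements, so that each screen transition carries context forward instead of discarding it:

# An invented toy measure of how much context a screen transition keeps.

def transition_overlap(prev_screen, next_screen):
    # Fraction of the new screen's elements already visible on the old one.
    prev, nxt = set(prev_screen), set(next_screen)
    return len(prev & nxt) / len(nxt) if nxt else 0.0

# Keeping landmarks ("alarms", "trend") in view across the transition
# scores higher than replacing every element at once.
print(transition_overlap(["alarms", "trend", "pumps"],
                         ["alarms", "trend", "valves"]))   # ~0.67
print(transition_overlap(["alarms", "trend", "pumps"],
                         ["logs", "config", "valves"]))    # 0.0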
Woods studied the nature of operations work involved in identifying and mitigating faults in a supervisory control context, such as controlling a power plant or operating a software service. [19] He found that this work is qualitatively different from the traditional offline troubleshooting that had previously been studied. [20] In particular, because the underlying process is dynamic, the nature and severity of the problem can change over time. In addition, because the process is safety-critical, the operator must work to limit possible harm while also addressing the underlying problem.
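The contrast can be sketched as a control loop in hypothetical Python (the class, thresholds, and steps are invented for illustration and are not taken from Woods's studies), in which the disturbance keeps evolving while the operator interleaves mitigation with diagnosis:

# A hypothetical sketch: unlike offline troubleshooting of a frozen
# fault, anomaly response handles a disturbance that keeps evolving,
# so harm mitigation and diagnosis are interleaved rather than sequential.

SAFE_LIMIT = 5.0  # invented harm threshold

class RunningProcess:
    def __init__(self):
        self.severity = 3.0
        self.diagnosed = False
        self.resolved = False

    def evolve(self):
        # The underlying process keeps running, so the disturbance can
        # grow in scope and severity while the operator works.
        if not self.resolved:
            self.severity += 1.0

def anomaly_response(process):
    while not process.resolved:
        process.evolve()
        if process.severity > SAFE_LIMIT:
            # Limit harm first, even before the problem is understood
            # (e.g., shed load, isolate the faulty component).
            process.severity = SAFE_LIMIT
        if not process.diagnosed:
            process.diagnosed = True   # form a hypothesis from the symptoms
        else:
            process.resolved = True    # apply the corrective action

anomaly_response(RunningProcess())

Offline troubleshooting, by contrast, could diagnose fully before acting; here the harm check must run on every cycle because the severity keeps changing.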
Woods's research found three recurring patterns in the failure modes of complex adaptive systems: decompensation, working at cross-purposes, and getting stuck in outdated behaviors. [21]
The adaptive universe is a model proposed by Woods of the constraints that bind all complex adaptive systems. The model rests on two assumptions: resources are always finite, and change never stops. [16]
Systems engineering is an interdisciplinary field of engineering and engineering management that focuses on how to design, integrate, and manage complex systems over their life cycles. At its core, systems engineering utilizes systems thinking principles to organize this body of knowledge. The individual outcome of such efforts, an engineered system, can be defined as a combination of components that work in synergy to collectively perform a useful function.
Traffic psychology is a discipline of psychology that studies the relationship between psychological processes and the behavior of road users. In general, traffic psychology aims to apply theoretical aspects of psychology in order to improve traffic mobility by helping to develop and apply crash countermeasures, as well as by guiding desired behaviors through education and the motivation of road users.
In the field of human factors and ergonomics, human reliability is the probability that a human performs a task to a sufficient standard. Reliability of humans can be affected by many factors such as age, physical health, mental state, attitude, emotions, personal propensity for certain mistakes, and cognitive biases.
Situational awareness or situation awareness (SA) is the understanding of an environment, its elements, and how it changes with respect to time or other factors. Situational awareness is important for effective decision making in many environments. It is formally defined as:
“the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”.
Cognitive ergonomics is a scientific discipline that studies, evaluates, and designs tasks, jobs, products, environments, and systems, and how they interact with humans and their cognitive abilities. It is defined by the International Ergonomics Association as "concerned with mental processes, such as perception, memory, reasoning, and motor response, as they affect interactions among humans and other elements of a system. The relevant topics include mental workload, decision-making, skilled performance, human-computer interaction, human reliability, work stress and training as these may relate to human-system design." Cognitive ergonomics is concerned with how work is done in the mind, meaning that the quality of work depends on the person's understanding of the situation, which may include the goals, means, and constraints of work. Cognitive ergonomics studies cognition in work and operational settings in order to optimize human well-being and system performance. It is a subset of the larger field of human factors and ergonomics.
Supervisory control is a general term for control of many individual controllers or control loops, such as within a distributed control system. It refers to a high level of overall monitoring of individual process controllers, which is not necessary for the operation of each controller, but gives the operator an overall plant process view, and allows integration of operation between controllers.
Human error is an action that has been done but that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.
Engineering psychology, also known as Human Factors Engineering, is the science of human behavior and capability, applied to the design and operation of systems and technology. As an applied field of psychology and an interdisciplinary part of ergonomics, it aims to improve the relationships between people and machines by redesigning equipment, interactions, or the environment in which they take place. The work of an engineering psychologist is often described as making the relationship more "user-friendly."
Nancy G. Leveson is an American specialist in system and software safety and a Professor of Aeronautics and Astronautics at MIT, United States.
Systems psychology is a branch of both theoretical psychology and applied psychology that studies human behaviour and experience as complex systems. It is inspired by systems theory and systems thinking, and based on the theoretical work of Roger Barker, Gregory Bateson, Humberto Maturana and others. Groups and individuals are considered as systems in homeostasis. Alternative terms here are "systemic psychology", "systems behavior", and "systems-based psychology".
Macrocognition indicates a descriptive level of cognition performed in natural instead of artificial (laboratory) environments. This term is reported to have been coined by Pietro Cacciabue and Erik Hollnagel in 1995. However, it is also reported that it was used in the 1980s in European Cognitive Systems Engineering research. Possibly the earliest reference is the following, although it does not use the exact term "macrocognition":
A macro-theory is a theory which is concerned with the obvious regularities of human experience, rather than with some theoretically defined unit. To refer to another psychological school, it would correspond to a theory at the level of Gestalten. It resembles Newell’s suggestion for a solution that would analyse more complex tasks. However, the idea of a macro-theory does not entail an analysis of the mechanistic materialistic kind which is predominant in cognitive psychology. Thus we should have a macro-theory of remembering rather than of memory, to say nothing of short-term memory, proactive inhibition release, or memory scanning. To take another example, we should have a macro-theory of attending, rather than a mini-theory of attention, or micro-theories of limited channel capacities or logarithmic dependencies in disjunctive reaction times. This would ease the dependence on the information processing analogy, but not necessarily lead to an abandonment of the information processing terminology, the Flowchart, or the concept of control structures. The meta-technical sciences can contribute to a psychology of cognition as well as to cognitive psychology. What should be abandoned is rather the tendency to think in elementaristic terms and to increase the plethora of mini-and micro-theories. ... To conclude, if the psychological study of cognition shall have a future that is not a continued description of human information processing, its theories must be at what we have called the macro-level. This means that they must correspond to the natural units of experience and consider these in relation to the regularities of human experience, rather than as manifestations of hypothetical information processing mechanisms in the brain. A psychology should start at the level of natural units in human experience and try to work upwards towards the level of functions and human action, rather than downwards towards the level of elementary information processes and the structure of the IPS.
Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.
Ergonomics, also known as human factors or human factors engineering (HFE), is the application of psychological and physiological principles to the engineering and design of products, processes, and systems. Primary goals of human factors engineering are to reduce human error, increase productivity and system availability, and enhance safety, health and comfort with a specific focus on the interaction between the human and equipment.
Mica Endsley is an American engineer and a former Chief Scientist of the United States Air Force.
Human performance modeling (HPM) is a method of quantifying human behavior, cognition, and processes. It is a tool used by human factors researchers and practitioners for both the analysis of human function and for the development of systems designed for optimal user experience and interaction. It is a complementary approach to other usability testing methods for evaluating the impact of interface features on operator performance.
Daniel Gopher is a professor (Emeritus) of Cognitive psychology and Human Factors Engineering at the Faculty of Industrial Engineering and Management, Technion - Israel Institute of Technology. He held the Yigal Alon Chair for the Study of Humans at Work at the Technion. Gopher is a fellow of the Human Factors and Ergonomics Society, the Psychonomic Society and the International Ergonomics Association.
Dr. Richard I. Cook was a system safety researcher, physician, anesthesiologist, university professor, and software engineer. Cook did research in safety, incident analysis, cognitive systems engineering, and resilience engineering across a number of fields, including critical care medicine, aviation, air traffic control, space operations, semiconductor manufacturing, and software services.
Resilience engineering is a subfield of safety science research that focuses on understanding how complex adaptive systems cope when encountering a surprise. The term resilience in this context refers to the capabilities that a system must possess in order to deal effectively with unanticipated events. Resilience engineering examines how systems build, sustain, degrade, and lose these capabilities.
Cognitive systems engineering (CSE) is a field of study that examines the intersection of people, work, and technology, with a focus on safety-critical systems. The central tenet of cognitive systems engineering is that it views a collection of people and technology as a single unit that is capable of cognitive work, which is called a joint cognitive system.
The out-of-the-loop performance problem arises when an operator suffers a performance decrement as a consequence of automation. The potential loss of skills and of situation awareness caused by vigilance and complacency problems may make operators of automated systems unable to operate manually in case of system failure. Highly automated systems reduce the operator to a monitoring role, which diminishes the operator's chances of understanding the system. It is related to mind wandering.