David Woods (safety researcher)

David D. Woods is an American safety systems researcher who studies human coordination and automation issues in a wide range of safety-critical fields, such as nuclear power, aviation, space operations, critical care medicine, and software services. He is one of the founding researchers of the fields of cognitive systems engineering [1] and resilience engineering. [2]

Biography

In 1974, Woods received his BA in psychology from Canisius College. In 1977, he received his MS in cognitive psychology from Purdue University. In 1979, he received his PhD in cognitive psychology from Purdue University, where he studied human perception and attention. [3]

From 1979 to 1988, Woods was a senior engineer at the Westinghouse Research and Development Center, [3] where he worked on improving control room equipment interfaces for power plants. [4] [1]

From 1988 onwards, he has served on the faculty of The Ohio State University in the Department of Integrated Systems Engineering, where he is currently a professor emeritus. [1]

In 2017, Woods co-founded a consulting company, Adaptive Capacity Labs, with Richard Cook and John Allspaw. [5]

Awards

Woods has served as president of the Resilience Engineering Association (2011–2013) and of the Human Factors and Ergonomics Society (1998–1999). [6] He is a fellow of the Human Factors and Ergonomics Society. [7]

National advisory committees and testimony

Woods has contributed to a number of national advisory bodies and hearings, including the Columbia Accident Investigation Board, [8] U.S. Senate testimony on the future of NASA, [9] a National Research Council study on dependable software, [10] a Defense Science Board task force on autonomy in DoD systems, [11] an FAA working group on flight deck automation, [12] and a National Research Council study on autonomy in civil aviation. [13]

Work

Resilience engineering

Woods is one of the founders of the field of resilience engineering. [2] One of his significant contributions is the theory of graceful extensibility. [14]

Cognitive systems engineering

In the wake of the Three Mile Island accident, Woods and Erik Hollnagel proposed Cognitive Systems Engineering (CSE), [15] a new approach to human-computer interaction (HCI) in the domain of supervisory control that focuses on the interaction between people, technological artifacts, and work. In this approach, a set of interacting human and software agents is viewed as a joint cognitive system, where the overall system itself is seen as performing cognitive tasks.

Theory of graceful extensibility

The theory of graceful extensibility is a theory proposed by Woods to explain how some systems are able to continually adapt over time in the face of new challenges (sustained adaptability) while other systems fail to do so. [16]

This theory asserts that all complex adaptive systems can be modeled as a composition of individual units, each of which has some ability to adapt its behavior and to communicate with other units. It is expressed as ten statements that Woods calls 'proto-theorems' (a toy sketch follows the list):

  1. Individual units have a limit in the degree to which they are able to adapt.
  2. Units will inevitably encounter events that they have difficulty dealing with.
  3. Because units have limits, they need to identify when they are near the limit, and need a mechanism to increase their limit when this happens.
  4. Individual units will never have a high enough limit to handle everything, so units have to work together.
  5. A nearby unit can affect the saturation limit of another unit.
  6. When the pressure that is applied to a unit changes, the trade-off space changes for that unit.
  7. Units perform differently as they approach saturation.
  8. Units only have a local perspective.
  9. The local perspective of any one unit is necessarily limited.
  10. Each unit has to continually do work to adjust its model of the adaptive capacity of itself and others to match the actual adaptive capacity.
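
The flavor of these statements can be made concrete with a minimal sketch (the unit names, numbers, and lending mechanism below are invented for illustration and are not part of Woods's formal model): each unit has a finite adaptive capacity (statements 1–2), must sense when it nears saturation (statement 3), and can stretch its effective range by drawing on a neighboring unit (statements 4–5), at the cost of that neighbor's own headroom.

    # Toy illustration of graceful extensibility, not Woods's formal model.
    from dataclasses import dataclass, field

    @dataclass
    class Unit:
        name: str
        capacity: float                # statement 1: adaptive capacity is finite
        load: float = 0.0
        neighbors: list = field(default_factory=list)

        def headroom(self) -> float:
            return max(self.capacity - self.load, 0.0)

        def near_saturation(self) -> bool:
            # statement 3: a unit must sense when it nears its limit
            return self.load > 0.8 * self.capacity

        def handle(self, demand: float) -> bool:
            """Absorb a demand, drawing on neighbors once saturated."""
            self.load += demand
            if self.load <= self.capacity:
                return True
            # statements 4-5: no unit is sufficient alone; a neighbor
            # can shift the point at which this unit saturates.
            shortfall = self.load - self.capacity
            for other in self.neighbors:
                lend = min(other.headroom(), shortfall)
                other.load += lend     # the helper spends its own headroom
                self.load -= lend
                shortfall -= lend
                if shortfall <= 0:
                    return True
            return False               # the whole network is saturated

    ops = Unit("ops", capacity=10.0)
    support = Unit("support", capacity=10.0)
    ops.neighbors.append(support)

    print(ops.handle(8.5))         # True: within ops's own range
    print(ops.near_saturation())   # True: time to recruit help
    print(ops.handle(4.0))         # True: support absorbs the overflow
    print(ops.handle(9.0))         # False: adaptation breaks down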

Visual momentum

Woods proposed visual momentum as a measure of how easy it is for a person, while performing a task, to navigate to a new screen and integrate the information they see there. [17] [18] This work was motivated by the study of event-driven tasks, in which operators must respond to events as they occur (e.g., pilots, space flight controllers, nuclear plant operators, physicians).

Woods argued that it is easy to get lost in such user interfaces, that effective operator interfaces should help the operator figure out where to look next, and that navigating a virtual space of information can be improved by leveraging what the human perceptual system is already optimized to do, such as pattern recognition.

Woods proposed a number of concepts for improving the design of such interfaces by increasing their visual momentum (the third concept is illustrated in the sketch after the list):

  1. Provide a long shot view that acts as a global map to assist an operator in stepping back from the specific details.
  2. Provide perceptual landmarks to help operators orient themselves within the virtual data space.
  3. Use display overlap when moving between data views: have some common subset of the data on both the current and the next view so that the transition between views is not jarring.
  4. Use spatial representation: encode information spatially to leverage the perceptual system.
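
As a rough illustration of the third concept, display overlap, consider the following sketch (the view contents and the 0.2 threshold are invented for the example): a navigation step is flagged when the next view shares too little content with the current one to anchor the transition.

    # Sketch of the "display overlap" idea: successive views should share
    # some data so that a transition is not visually jarring. The views,
    # items, and threshold here are invented for illustration.

    def overlap_ratio(current: set, nxt: set) -> float:
        """Fraction of the next view's items already visible in the current view."""
        if not nxt:
            return 1.0
        return len(current & nxt) / len(nxt)

    def check_transition(current: set, nxt: set, min_overlap: float = 0.2) -> bool:
        """Flag transitions whose views share too little common context."""
        return overlap_ratio(current, nxt) >= min_overlap

    reactor_overview = {"core_temp", "pressure", "coolant_flow", "rod_position"}
    coolant_detail   = {"coolant_flow", "pump_speed", "loop_temp"}
    alarm_history    = {"alarm_log"}

    print(check_transition(reactor_overview, coolant_detail))  # True: shares coolant_flow
    print(check_transition(reactor_overview, alarm_history))   # False: no shared anchor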

Dynamic fault management

Woods studied the nature of operations work involved in identifying and mitigating faults in a supervisory control context, such as controlling a power plant or operating a software service. [19] He found that this work was qualitatively different from the traditional offline troubleshooting that had previously been studied. [20] In particular, because the underlying process is dynamic, the nature and severity of a problem can change over time. In addition, because the process is safety-critical, the operator must work to limit possible harms while also addressing the underlying problem.
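
A toy loop can make the contrast with offline troubleshooting concrete (the severity dynamics, thresholds, and component names are invented for illustration): because the fault evolves while it is being diagnosed, each step forces a choice between narrowing down the cause and containing the harm.

    # Toy contrast with offline troubleshooting: the fault evolves while
    # it is being diagnosed, so each step trades diagnosis against
    # containment. All dynamics here are invented for illustration.
    import random

    random.seed(1)

    severity = 3.0      # how much harm the evolving fault is causing right now
    hypotheses = ["pump", "valve", "sensor", "controller"]

    for step in range(1, 9):
        severity *= random.uniform(0.9, 1.3)   # the process changes underneath the operator
        if severity > 5.0:
            severity *= 0.5                    # containment pre-empts root-cause work
            action = "mitigate: shed load, fall back to a safe mode"
        elif len(hypotheses) > 1:
            hypotheses.pop()                   # run one discriminating test
            action = f"diagnose (candidates left: {hypotheses})"
        else:
            action = f"repair the {hypotheses[0]}"
        print(f"step {step}: severity={severity:4.1f}  {action}")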

How complex, adaptive systems break down

Woods's research identified three recurring patterns in how complex adaptive systems fail (a toy model of the first pattern follows the list): [21]

  1. Decompensation: the system exhausts its capacity to adapt as disturbances and challenges cascade.
  2. Working at cross-purposes: units behave in ways that are locally adaptive but globally maladaptive.
  3. Getting stuck in outdated behaviors: the system continues to rely on strategies that worked in the past even as conditions change.
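
The first pattern, decompensation, can be illustrated with a minimal sketch (the rates and reserve size are invented for illustration): a compensating mechanism silently absorbs a growing disturbance, so the system looks healthy from the outside until the reserve is exhausted, at which point the error surfaces with little warning.

    # Toy model of decompensation: a controller masks a growing disturbance
    # until its finite reserve is exhausted, so the failure appears sudden
    # from the outside. Rates and reserve size are invented for illustration.

    reserve = 10.0          # finite compensating capacity (buffers, automation)
    disturbance = 0.0

    for t in range(12):
        disturbance += 1.5                        # the underlying problem keeps growing
        compensation = min(disturbance, reserve)  # compensation hides it from view
        visible_error = disturbance - compensation
        print(f"t={t:2d}  disturbance={disturbance:4.1f}  visible_error={visible_error:4.1f}")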

Adaptive universe

The adaptive universe is a model proposed by Woods of the constraints that bind all complex adaptive systems. It rests on two assumptions: [16]

  1. The resources available to a system are always finite.
  2. The environment that a system is embedded within is always dynamic: change never stops.

Selected publications

Books

References

  1. Smith, Philip J.; Hoffman, Robert R., eds. (2018). Cognitive Systems Engineering: The Future for a Changing World. Boca Raton. ISBN 978-1-315-57252-9. OCLC 1002192481.
  2. Dekker, Sidney (2019). Foundations of Safety Science: A Century of Understanding Accidents and Disasters. Boca Raton. ISBN 978-1-351-05977-0. OCLC 1091899791.
  3. Woods, David D. "Curriculum Vitae: David D. Woods" (PDF). Retrieved 2022-09-10.
  4. Woods, D. D.; Wise, J. A.; Hanes, L. F. (1982-02-01). "Evaluation of safety parameter display concepts. Final report". OSTI 5339665.
  5. "The Career, Accomplishments, and Impact of Richard I. Cook: A Life in Many Acts". Adaptive Capacity Labs. Retrieved 2022-09-29.
  6. "HFES Officers, Editors, and Committee Chairs". Retrieved 2022-09-17.
  7. "HFES Fellows Program: List of Fellows". Retrieved 2022-09-17.
  8. "NASA - Report of Columbia Accident Investigation Board, Volume I". www.nasa.gov. Retrieved 2022-09-18.
  9. "Future of NASA". U.S. Senate Committee on Commerce, Science, & Transportation. 2003-10-29. Retrieved 2022-09-18.
  10. National Research Council (2007-05-09). Software for Dependable Systems: Sufficient Evidence?. National Academies Press. ISBN 978-0-309-10394-7.
  11. "Defense Science Board Task Force Report: The Role of Autonomy in DoD Systems". 2012-07-01.
  12. Nakamura, Dave (2013-09-05). "Operational Use of Flight Path Management System. Final Report of the Performance-based operations Aviation Rulemaking Committee/Commercial Aviation Safety Team Flight Deck Automation Working Group" (PDF). Federal Aviation Administration. Retrieved 2022-09-17.
  13. National Research Council (2014-06-05). Autonomy Research for Civil Aviation: Toward a New Era of Flight. National Academies Press. ISBN 978-0-309-30614-0.
  14. Woods, David D. (December 2018). "The theory of graceful extensibility: basic rules that govern adaptive systems". Environment Systems and Decisions. 38 (4): 433–457. Bibcode:2018EnvSD..38..433W. doi:10.1007/s10669-018-9708-3. ISSN 2194-5403. S2CID 70052983.
  15. Hollnagel, Erik; Woods, David D. (August 1999). "Cognitive Systems Engineering: New wine in new bottles". International Journal of Human-Computer Studies. 51 (2): 339–356. doi:10.1006/ijhc.1982.0313. PMID 11543350.
  16. Woods, David D. (2018-12-01). "The theory of graceful extensibility: basic rules that govern adaptive systems". Environment Systems and Decisions. 38 (4): 433–457. Bibcode:2018EnvSD..38..433W. doi:10.1007/s10669-018-9708-3. ISSN 2194-5411. S2CID 70052983.
  17. Woods, David D. (September 1984). "Visual momentum: a concept to improve the cognitive coupling of person and computer". International Journal of Man-Machine Studies. 21 (3): 229–244. doi:10.1016/s0020-7373(84)80043-7. ISSN 0020-7373.
  18. Woods, David D.; Watts, Jennifer C. (1997). "How Not to Have to Navigate Through Too Many Displays". Handbook of Human-Computer Interaction. Elsevier. pp. 617–650. doi:10.1016/b978-044481862-1.50092-3. ISBN 9780444818621. Retrieved 2022-09-25.
  19. Woods, D. D. (1994-02-28). "Cognitive demands and activities in dynamic fault management: abductive reasoning and disturbance management". In Stanton, Neville A. (ed.). Human Factors in Alarm Design. CRC Press. doi:10.1201/9780203481714. ISBN 978-0-203-48171-4.
  20. Rasmussen, J.; Jensen, A. (May 1974). "Mental Procedures in Real-Life Tasks: A Case Study of Electronic Trouble Shooting". Ergonomics. 17 (3): 293–307. doi:10.1080/00140137408931355. ISSN 0014-0139. PMID 4442376.
  21. Woods, D. D.; Branlat, M. (2017-05-15). "Basic Patterns in How Adaptive Systems Fail". In Hollnagel, Erik; Pariès, Jean; Woods, David; Wreathall, John (eds.). Resilience Engineering in Practice: A Guidebook. CRC Press. doi:10.1201/9781317065265. ISBN 978-1-315-60569-2.