Sidney Dekker

Sidney W. A. Dekker is Professor in the School of Humanities, Languages and Social Science at Griffith University in Brisbane, Australia, where he founded the Safety Science Innovation Lab. [1] He is a trained mediator, volunteers as a crisis chaplain, and is an avid piano player.

Previously, Dekker was professor of human factors and system safety at Lund University in Sweden, [2] where he founded the Leonardo da Vinci Laboratory for Complexity and Systems Thinking and flew as a first officer on Boeing 737s for Sterling and later Cimber Airlines out of Copenhagen. Dekker is a high-profile scholar (h-index = 63) [3] and is known globally for his work in the fields of human factors and safety. He coined the terms Safety Differently [7] and Restorative Just Culture, which have since grown into global movements for change. They encourage organisations to declutter their bureaucracy and to enhance the capacities in people and processes that make things go well, and to offer compassion, restoration and learning when they do not. He was part of the group of founding scientists behind resilience engineering, and his work has inspired the birth of HOP (Human and Organizational Performance), New View Safety, Learning Teams, and more. [3] [4] [5] [6]

Related Research Articles

<span class="mw-page-title-main">Fritjof Capra</span> American physicist and author (born 1939)

Fritjof Capra is an Austrian-born American author, physicist, systems theorist and deep ecologist. In 1995, he became a founding director of the Center for Ecoliteracy in Berkeley, California. He is on the faculty of Schumacher College.

Transformative justice is a spectrum of social, economic, legal, and political practices and philosophies that focus on the structures and underlying conditions that perpetuate harm and injustice. Taking up and expanding on the goals of restorative justice, such as individual and community accountability, reparation, and non-retributive responses to harm, transformative justice imagines and puts into practice alternatives to the formal, state-based criminal justice system.

In the field of human factors and ergonomics, human reliability is the probability that a human performs a task to a sufficient standard. Reliability of humans can be affected by many factors such as age, physical health, mental state, attitude, emotions, personal propensity for certain mistakes, and cognitive biases.

<span class="mw-page-title-main">Redundancy (engineering)</span> Duplication of critical components to increase reliability of a system

In engineering and systems theory, redundancy is the intentional duplication of critical components or functions of a system with the goal of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing.

<span class="mw-page-title-main">Pilot error</span> Decision, action, or inaction by an aircraft pilot

In aviation, pilot error generally refers to an action or decision made by a pilot that is a substantial contributing factor leading to an aviation accident. It also includes a pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines the term "accident" as "an occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashing.

<span class="mw-page-title-main">Safety culture</span> Risk-averse attitudes

Safety culture is the element of organizational culture concerned with the maintenance of safety and compliance with safety standards. It is informed by the organization's leadership and by the beliefs, perceptions and values that employees share in relation to risks within the organization, workplace or community. Safety culture has been described in a variety of ways: notably, the National Academies of Sciences and the Association of Public and Land-grant Universities published summaries on the topic in 2014 and 2016, respectively.

A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity.

Human error is an action that has been done but that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause and contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.

Instruction creep or rule creep occurs when instructions or rules accumulate over time until they are unmanageable or inappropriate. It is a type of scope creep. The accumulation of bureaucratic requirements results in overly complex procedures that are often misunderstood, irritating, time-wasting, or ignored.

<span class="mw-page-title-main">Science and inventions of Leonardo da Vinci</span> Leonardo da Vincis inventions and his relationship to science

Leonardo da Vinci (1452–1519) was an Italian polymath, regarded as the epitome of the "Renaissance Man", displaying skills in numerous diverse areas of study. While most famous for his paintings such as the Mona Lisa and the Last Supper, Leonardo is also renowned in the fields of civil engineering, chemistry, geology, geometry, hydrodynamics, mathematics, mechanical engineering, optics, physics, pyrotechnics, and zoology.

A social-ecological system consists of a 'bio-geo-physical' unit and its associated social actors and institutions. Social-ecological systems are complex and adaptive, and are delimited by spatial or functional boundaries surrounding particular ecosystems and their context problems.

Blame in organizations may flow between management and staff, or laterally between professionals or partner organizations. In a blame culture, problem-solving is replaced by blame-avoidance. Blame shifting may exist between rival factions. Maintaining one's reputation may be a key factor explaining the relationship between accountability and blame avoidance. Blame culture is a serious issue in safety-critical sectors.

The term use error has recently been introduced to replace the commonly used terms human error and user error. The new term, which has already been adopted by international standards organizations for medical devices, suggests that accidents should be attributed to the circumstances, rather than to the human beings who happened to be there.

Human factors are the physical or cognitive properties of individuals, or social behavior specific to humans, that influence the functioning of technological systems as well as human-environment equilibria. The safety of underwater diving operations can be improved by reducing the frequency of human error and the consequences when it does occur. Human error can be defined as an individual's deviation from acceptable or desirable practice which culminates in undesirable or unexpected results. Human factors include both the non-technical skills that enhance safety and the non-technical factors that contribute to undesirable incidents that put the diver at risk.

[Safety is] an active, adaptive process which involves making sense of the task in the context of the environment to successfully achieve explicit and implied goals, with the expectation that no harm or damage will occur. – G. Lock, 2022

Dive safety is primarily a function of four factors: the environment, equipment, individual diver performance and dive team performance. The water is a harsh and alien environment which can impose severe physical and psychological stress on a diver. The remaining factors must be controlled and coordinated so the diver can overcome the stresses imposed by the underwater environment and work safely. Diving equipment is crucial because it provides life support to the diver, but the majority of dive accidents are caused by individual diver panic and an associated degradation of the individual diver's performance. – M.A. Blumenberg, 1996

Maritime resource management (MRM) or bridge resource management (BRM) is a set of human factors and soft skills training aimed at the maritime industry. The MRM training programme was launched in 1993 – at that time under the name bridge resource management – and aims at preventing accidents at sea caused by human error.

<span class="mw-page-title-main">Threat and error management</span> Safety management approach

In aviation safety, threat and error management (TEM) is an overarching safety management approach that assumes that pilots will naturally make mistakes and encounter risky situations during flight operations. Rather than try to avoid these threats and errors, its primary focus is on teaching pilots to manage these issues so they do not impair safety. Its goal is to maintain safety margins by training pilots and flight crews to detect and respond to events that are likely to cause damage (threats) as well as mistakes that are most likely to be made (errors) during flight operations.

Just culture is a concept related to systems thinking which emphasizes that mistakes are generally a product of faulty organizational cultures, rather than solely brought about by the person or persons directly involved. In a just culture, after an incident, the question asked is, "What went wrong?" rather than "Who caused the problem?". A just culture is the opposite of a blame culture. A just culture is not the same as a no-blame culture as individuals may still be held accountable for their misconduct or negligence.

David D. Woods is an American safety systems researcher who studies human coordination and automation issues in a wide range of safety-critical fields such as nuclear power, aviation, space operations, critical care medicine, and software services. He is one of the founding researchers of the fields of cognitive systems engineering and resilience engineering.

Richard I. Cook was a system safety researcher, physician, anesthesiologist, university professor, and software engineer. Cook did research in safety, incident analysis, cognitive systems engineering, and resilience engineering across a number of fields, including critical care medicine, aviation, air traffic control, space operations, semiconductor manufacturing, and software services.

Resilience engineering is a subfield of safety science research that focuses on understanding how complex adaptive systems cope when encountering a surprise. The term resilience in this context refers to the capabilities that a system must possess in order to deal effectively with unanticipated events. Resilience engineering examines how systems build, sustain, degrade, and lose these capabilities.

References

  1. "Safety Science Innovation Lab - Our Vision". Griffith University . Retrieved 1 July 2014.
  2. "Sidney Dekker". Lund University . Retrieved 1 July 2014.
  3. 1 2 "Sidney Dekker". Google Scholar. Retrieved 1 April 2017.
  4. "Brisbane 2014 - Program". devopsdays.org. Retrieved 1 July 2014.
  5. "Sidney Dekker - Role & Resume". SAFEmap International. Retrieved 1 July 2014.
  6. "UQCCR - Professor Sidney Dekker". The University of Queensland . Retrieved 1 July 2014.
  7. "Safety Differently".