Just culture is a concept related to systems thinking that emphasizes that mistakes are generally a product of faulty organizational cultures rather than solely the fault of the person or persons directly involved. In a just culture, the question asked after an incident is "What went wrong?" rather than "Who caused the problem?". [1] A just culture is the opposite of a blame culture. [1] It is not the same as a no-blame culture, as individuals may still be held accountable for misconduct or negligence. [2]
A just culture helps create an environment in which individuals feel free to report errors, helping the organization learn from its mistakes. This is in contrast to a "blame culture", [3] in which individuals are fired, fined, or otherwise punished for making mistakes, but the root causes leading to the error are not investigated and corrected. In a blame culture, mistakes may be hidden rather than reported, ultimately diminishing organizational outcomes.
In a system of just culture, discipline is linked to inappropriate behavior, rather than harm. [4] This allows for individual accountability and promotes a learning organization culture.
In this system, honest human mistakes are seen as a learning opportunity for the organization and its employees. The individual who made the mistake may be offered additional training and coaching. [5] However, willful misconduct may result in disciplinary action such as termination of employment—even if no harm was caused.
Work on just culture has been applied to industrial, [6] healthcare, [7] [8] aviation [9] [10] and other [11] settings.
The first fully developed theory of a just culture appeared in James Reason's 1997 book, Managing the Risks of Organizational Accidents. [2] In Reason's theory, a just culture is postulated to be one of the components of a safety culture. A just culture is required to build the trust on which a reporting culture depends: a reporting culture is one in which all safety incidents are reported so that learning can occur and safety improvements can be made. David Marx extended the concept of just culture to healthcare in his 2001 report, Patient Safety and the "Just Culture": A Primer for Health Care Executives. [12]
Blame is the act of censuring, holding responsible, or making negative statements about an individual or group asserting that their actions or inaction are socially or morally irresponsible; it is the opposite of praise. When someone is morally responsible for doing something wrong, their action is blameworthy. By contrast, when someone is morally responsible for doing something right, it may be said that their action is praiseworthy. There are other senses of praise and blame that are not ethically relevant. One may, for example, praise someone's good dress sense, or blame their poor taste in clothes, without making any moral judgement.
Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote The Human Factor in Aircraft Accidents (1969). Despite the considerable development of electronic aids since then, many of the principles he developed continue to prove effective.
Clinical governance is a systematic approach to maintaining and improving the quality of patient care within the National Health Service (NHS) and private sector health care. Clinical governance became important in health care after the Bristol heart scandal in 1995, during which an anaesthetist, Dr Stephen Bolsin, exposed the high mortality rate for paediatric cardiac surgery at the Bristol Royal Infirmary. It was originally elaborated within the UK NHS, and its most widely cited formal definition describes it as:
A framework through which NHS organisations are accountable for continually improving the quality of their services and safeguarding high standards of care by creating an environment in which excellence in clinical care will flourish.
Safety culture is the element of organizational culture which is concerned with the maintenance of safety and compliance with safety standards. It is informed by the organization's leadership and the beliefs, perceptions and values that employees share in relation to risks within the organization, workplace or community. Safety culture has been described in a variety of ways: notably, the National Academies of Science and the Association of Land Grant and Public Universities have published summaries on this topic in 2014 and 2016.
A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity.
Patient safety is a discipline that emphasizes safety in health care through the prevention, reduction, reporting and analysis of error and other types of unnecessary harm that often lead to adverse patient events. The magnitude of avoidable adverse events, often known as patient safety incidents, experienced by patients was not well known until the 1990s, when multiple countries reported significant numbers of patients harmed and killed by medical errors. Recognizing that healthcare errors impact 1 in every 10 patients around the world, the World Health Organization (WHO) calls patient safety an endemic concern. Indeed, patient safety has emerged as a distinct healthcare discipline supported by an immature yet developing scientific framework. There is a significant transdisciplinary body of theoretical and research literature that informs the science of patient safety with mobile health apps being a growing area of research.
A patient safety organization (PSO) is a group, institution, or association that improves medical care by reducing medical errors. Common functions of patient safety organizations are data collection, analysis, reporting, education, funding, and advocacy. A PSO differs from a federally designated Patient Safety Organization (PSO), which provides health care providers in the U.S. with privilege and confidentiality protections for efforts to improve patient safety and the quality of patient care delivery.
A near miss, near death, near hit, or close call is an unplanned event that has the potential to cause, but does not actually result in, human injury, environmental or equipment damage, or an interruption to normal operation.
The Health and Social Care Select Committee is a Departmental Select Committee of the British House of Commons, the lower house of the United Kingdom Parliament. Its remit is to examine the policy, administration and expenditure of the Department of Health and Social Care (DHSC) and its associated agencies and public bodies. The Clerks of the Committee are Previn Desai and Joanna Dodd.
The Swiss cheese model of accident causation is a model used in risk analysis and risk management. It likens human systems to multiple slices of Swiss cheese stacked side by side, each slice having randomly placed and sized holes; the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses "layered" behind one another. In theory, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist to prevent a single point of failure.
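The model's core intuition, that a hazard causes harm only when the "holes" in every layer line up, can be sketched numerically. The following is an illustrative example, not part of the source; the function name, the independence assumption, and the probabilities are all assumptions made for the sketch:

```python
def breach_probability(layer_failure_probs):
    """Probability that a hazard penetrates all defensive layers,
    assuming each layer fails to stop it independently of the others
    (i.e. the "holes" in the cheese slices are uncorrelated)."""
    p = 1.0
    for q in layer_failure_probs:
        p *= q  # hazard must slip through this layer too
    return p

# Three imperfect defenses, each missing 10% of hazards:
# the combined breach probability is roughly 0.001, far lower
# than any single layer's 0.1.
print(breach_probability([0.1, 0.1, 0.1]))
```

The sketch also shows why the model warns against correlated weaknesses: if an organizational factor degrades several layers at once, the independence assumption fails and the real breach probability is much higher than the product suggests.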
The safety of emergency medical services flights has become a topic of public interest in the United States, with the expansion of emergency medical services aviation operations, such as air ambulance and MEDEVAC, and the increasing frequency of related accidents.
The healthcare error proliferation model is an adaptation of James Reason's Swiss cheese model designed to illustrate the complexity inherent in the contemporary healthcare delivery system and the attribution of human error within these systems. The healthcare error proliferation model explains the etiology of error and the sequence of events typically leading to adverse outcomes. This model emphasizes the role that organizational and external cultures play in error identification, prevention, mitigation, and defense construction.
The Patient Safety and Quality Improvement Act of 2005 (PSQIA): Pub. L. 109–41 (text)(PDF), 42 U.S.C. ch. 6A subch. VII part C, established a system of patient safety organizations and a national patient safety database. To encourage reporting and broad discussion of adverse events, near misses, and dangerous conditions, it also established privilege and confidentiality protections for Patient Safety Work Product. The PSQIA was introduced by Sen. Jim Jeffords [I-VT]. It passed in the Senate July 21, 2005 by unanimous consent, and passed the House of Representatives on July 27, 2005, with 428 Ayes, 3 Nays, and 2 Present/Not Voting.
Clinical peer review, also known as medical peer review is the process by which health care professionals, including those in nursing and pharmacy, evaluate each other's clinical performance. A discipline-specific process may be referenced accordingly.
An accountable care organization (ACO) is a healthcare organization that ties provider reimbursements to quality metrics and reductions in the cost of care. ACOs in the United States are formed from a group of coordinated health-care practitioners. They use alternative payment models, normally capitation. The organization is accountable to patients and third-party payers for the quality, appropriateness, and efficiency of the health care provided. According to the Centers for Medicare and Medicaid Services, an ACO is "an organization of health care practitioners that agrees to be accountable for the quality, cost, and overall care of Medicare beneficiaries who are enrolled in the traditional fee-for-service program who are assigned to it".
Blame in organizations may flow between management and staff, or laterally between professionals or partner organizations. In a blame culture, problem-solving is replaced by blame-avoidance. Blame shifting may exist between rival factions. Maintaining one's reputation may be a key factor explaining the relationship between accountability and blame avoidance. The blame culture is a serious issue in certain sectors such as safety-critical domains.
The term use error has recently been introduced to replace the commonly used terms human error and user error. The new term, which has already been adopted by international standards organizations for medical devices, suggests that accidents should be attributed to the circumstances, rather than to the human beings who happened to be there.
Maritime resource management (MRM) or bridge resource management (BRM) is a set of human factors and soft skills training aimed at the maritime industry. The MRM training programme was launched in 1993 – at that time under the name bridge resource management – and aims at preventing accidents at sea caused by human error.
Closed-loop communication is a communication technique used to avoid misunderstandings.
The Health Services Safety Investigations Body (HSSIB) is a fully independent arm's length body of the Department of Health and Social Care. HSSIB came into operation on 1 October 2023. It investigates patient safety concerns across the NHS in England and in independent healthcare settings where safety learning could also help to improve NHS care.