Just culture

Just culture is a concept related to systems thinking that emphasizes that mistakes are generally a product of faulty organizational cultures rather than solely the fault of the person or persons directly involved. In a just culture, the question asked after an incident is "What went wrong?" rather than "Who caused the problem?".[1] A just culture is the opposite of a blame culture.[1] It is not the same as a no-blame culture, as individuals may still be held accountable for their misconduct or negligence.[2]

A just culture helps create an environment where individuals feel free to report errors, helping the organization learn from its mistakes. This contrasts with a "blame culture",[3] in which individuals are fired, fined, or otherwise punished for making mistakes, but the root causes leading to the error are not investigated and corrected. In a blame culture, mistakes may not be reported but rather hidden, ultimately diminishing organizational outcomes.

In a system of just culture, discipline is linked to inappropriate behavior rather than to harm.[4] This allows for individual accountability and promotes a culture of organizational learning.

In this system, honest human mistakes are seen as a learning opportunity for the organization and its employees. The individual who made the mistake may be offered additional training and coaching.[5] However, willful misconduct may result in disciplinary action such as termination of employment, even if no harm was caused.

Work on just culture has been applied to industrial,[6] healthcare,[7][8] aviation,[9][10] and other[11] settings.

The first fully developed theory of a just culture appeared in James Reason's 1997 book, Managing the Risks of Organizational Accidents.[2] In Reason's theory, a just culture is postulated to be one component of a safety culture. A just culture is required to build the trust that allows a reporting culture to emerge, in which all safety incidents are reported so that learning can occur and safety improvements can be made. David Marx extended the concept of just culture to healthcare in his 2001 report, Patient Safety and the "Just Culture": A Primer for Health Care Executives.[12]

Related Research Articles

Blame is the act of censuring, holding responsible, or making negative statements about an individual or group, asserting that their actions or inaction are socially or morally irresponsible; it is the opposite of praise. When someone is morally responsible for doing something wrong, their action is blameworthy. By contrast, when someone is morally responsible for doing something right, their action may be said to be praiseworthy. There are also senses of praise and blame that are not ethically relevant: one may, for example, praise someone's good dress sense without making any moral judgement.

A medical error is a preventable adverse effect of care ("iatrogenesis"), whether or not it is evident or harmful to the patient. This might include an inaccurate or incomplete diagnosis or treatment of a disease, injury, syndrome, behavior, infection, or other ailment.

Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. CRM is primarily used for improving aviation safety and focuses on interpersonal communication, leadership, and decision making in aircraft cockpits. Its founder is David Beaty, a former Royal Air Force and BOAC pilot who wrote "The Human Factor in Aircraft Accidents" (1969). Despite the considerable development of electronic aids since then, many of the principles he developed continue to prove effective.

Clinical governance is a systematic approach to maintaining and improving the quality of patient care within the National Health Service (NHS) and private sector health care. Clinical governance became important in health care after the Bristol heart scandal in 1995, during which an anaesthetist, Dr Stephen Bolsin, exposed the high mortality rate for paediatric cardiac surgery at the Bristol Royal Infirmary. It was originally elaborated within the NHS, and its most widely cited formal definition describes it as:

A framework through which NHS organisations are accountable for continually improving the quality of their services and safeguarding high standards of care by creating an environment in which excellence in clinical care will flourish.

<span class="mw-page-title-main">Safety culture</span> Attitude, beliefs, perceptions and values that employees share in relation to risks in the workplace

Safety culture is the collection of beliefs, perceptions, and values that employees share in relation to risks within an organization, such as a workplace or community. Safety culture is a part of organizational culture and has been described in a variety of ways; notably, the National Academies of Science and the Association of Land Grant and Public Universities published summaries on the topic in 2014 and 2016.

A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity.

Patient safety is a discipline that emphasizes safety in health care through the prevention, reduction, reporting and analysis of error and other types of unnecessary harm that often lead to adverse patient events. The frequency and magnitude of avoidable adverse events, often known as patient safety incidents, experienced by patients were not well known until the 1990s, when multiple countries reported significant numbers of patients harmed and killed by medical errors. Recognizing that healthcare errors impact 1 in every 10 patients around the world, the World Health Organization (WHO) calls patient safety an endemic concern. Indeed, patient safety has emerged as a distinct healthcare discipline supported by an immature yet developing scientific framework. There is a significant transdisciplinary body of theoretical and research literature that informs the science of patient safety, with mobile health apps being a growing area of research.

A Patient Safety Organization (PSO) is a group, institution, or association that improves medical care by reducing medical errors. Common functions of patient safety organizations are data collection, analysis, reporting, education, funding, and advocacy. A PSO differs from a federally designated Patient Safety Organization (PSO), which provides health care providers in the U.S. privilege and confidentiality protections for efforts to improve patient safety and the quality of patient care delivery.

The Health and Social Care Select Committee is a Departmental Select Committee of the British House of Commons, the lower house of the United Kingdom Parliament. Its remit is to examine the policy, administration and expenditure of the Department of Health and Social Care (DHSC) and its associated agencies and public bodies. The Clerks of the Committee are Previn Desai and Joanna Dodd.

<span class="mw-page-title-main">Swiss cheese model</span> Model used in risk analysis

The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, healthcare, and emergency service organizations, and as the principle behind layered security, as used in computer security and defense in depth. It likens human systems to multiple slices of Swiss cheese, stacked side by side, each with randomly placed and sized holes; the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses "layered" behind each other. In theory, therefore, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses also exist to prevent a single point of failure. The model was originally formally propounded by James T. Reason of the University of Manchester and has since gained widespread acceptance. It is sometimes called the "cumulative act effect".

<span class="mw-page-title-main">Safety of emergency medical services flights</span>

The safety of emergency medical services flights has become a topic of public interest in the United States, with the expansion of emergency medical services aviation operations, such as air ambulance and MEDEVAC, and the increasing frequency of related accidents.

The healthcare error proliferation model is an adaptation of James Reason's Swiss cheese model designed to illustrate the complexity inherent in the contemporary healthcare delivery system and the attribution of human error within these systems. The healthcare error proliferation model explains the etiology of error and the sequence of events typically leading to adverse outcomes. The model emphasizes the role organizational and external cultures play in error identification, prevention, mitigation, and defense construction.

<span class="mw-page-title-main">Patient Safety and Quality Improvement Act</span> US law

The Patient Safety and Quality Improvement Act of 2005 (PSQIA), Pub. L. 109–41, 42 U.S.C. ch. 6A subch. VII part C, established a system of patient safety organizations and a national patient safety database. To encourage reporting and broad discussion of adverse events, near misses, and dangerous conditions, it also established privilege and confidentiality protections for Patient Safety Work Product. The PSQIA was introduced by Sen. Jim Jeffords [I-VT]. It passed in the Senate on July 21, 2005, by unanimous consent, and passed the House of Representatives on July 27, 2005, with 428 Ayes, 3 Nays, and 2 Present/Not Voting.

Clinical peer review, also known as medical peer review, is the process by which health care professionals, including those in nursing and pharmacy, evaluate each other's clinical performance. A discipline-specific process may be referenced accordingly.

An accountable care organization (ACO) is a healthcare organization that ties provider reimbursements to quality metrics and reductions in the cost of care. ACOs in the United States are formed from a group of coordinated health-care practitioners. They use alternative payment models, normally capitation. The organization is accountable to patients and third-party payers for the quality, appropriateness and efficiency of the health care provided. According to the Centers for Medicare and Medicaid Services, an ACO is "an organization of health care practitioners that agrees to be accountable for the quality, cost, and overall care of Medicare beneficiaries who are enrolled in the traditional fee-for-service program who are assigned to it".

Blame in organizations may flow between management and staff, or laterally between professionals or partner organizations. In a blame culture, problem-solving is replaced by blame-avoidance. Blame shifting may exist between rival factions. Maintaining one's reputation may be a key factor explaining the relationship between accountability and blame avoidance. Blame culture is a serious issue in safety-critical sectors.

The term use error has recently been introduced to replace the commonly used terms human error and user error. The new term, which has already been adopted by international standards organizations for medical devices, suggests that accidents should be attributed to the circumstances, rather than to the human beings who happened to be there.

Maritime resource management (MRM) or bridge resource management (BRM) is a set of human factors and soft skills training aimed at the maritime industry. The MRM training programme was launched in 1993 – at that time under the name bridge resource management – and aims at preventing accidents at sea caused by human error.

The Health Services Safety Investigations Body (HSSIB), formerly the Healthcare Safety Investigation Branch (HSIB), is the independent national investigator for patient safety in England. HSIB was formed in April 2017 and investigates serious patient safety risks that span the healthcare system, operating independently of other regulatory agencies. It aims to produce rigorous, non-punitive, and systematic investigations, to develop system-wide recommendations for learning and improvement, and to remain separate from systems that seek to allocate blame, liability, or punishment.

A significant event audit (SEA), also known as significant event analysis, is a method of formally assessing significant events, particularly in primary care in the UK, with a view to improving patient care and services. To be effective, the SEA frequently seeks contributions from all members of the healthcare team and involves a subsequent discussion to answer why the occurrence happened and what lessons can be learned. Events triggering an SEA can be diverse, including both adverse and critical events as well as good practice. It is most frequently required for appraisal, revalidation and continuing professional development.

References

  1. Catino, Maurizio (March 2008). "A Review of Literature: Individual Blame vs. Organizational Function Logics in Accident Analysis". Journal of Contingencies and Crisis Management (Review). 16 (1): 53–62. doi:10.1111/j.1468-5973.2008.00533.x. S2CID 56379831.
  2. Reason, James (1997). Managing the Risks of Organizational Accidents. Ashgate Publishing. ISBN 9781840141054.
  3. Khatri, N. (October–December 2009). "From a Blame Culture to a Just Culture in Health Care". Health Care Management Review. 34 (4): 312–22. doi:10.1097/HMR.0b013e3181a3b709. PMID 19858916. S2CID 44623708.
  4. Behn, Brian (January 29, 2018). "Just Culture basics for EMS". National EMS Management Association.
  5. "Just Culture System and Behaviors Response Guide" (PDF). Los Angeles County Department of Mental Health. August 9, 2017. Archived from the original (PDF) on June 28, 2019. Retrieved June 28, 2019.
  6. Groeneweg, J. (2018). "The Long and Winding Road to a Just Culture". Society of Petroleum Engineers.
  7. Harvey, H. Benjamin (June 17, 2017). "The Just Culture Framework". Journal of the American College of Radiology.
  8. Boysen, Philip (Fall 2013). "Just Culture: A Foundation for Balanced Accountability and Patient Safety". The Ochsner Journal. 13 (3): 400–406. PMC 3776518. PMID 24052772.
  9. GAIN Working Group E, Flight Ops/ATC Ops Safety Information Sharing (September 2004). "A Roadmap to a Just Culture: Enhancing the Safety Environment" (PDF). Global Aviation Information Network.
  10. "Just Culture: Finding the right balance between the aviation, judicial and political authorities". Eurocontrol. Retrieved June 28, 2019.
  11. Dekker, Sidney (January 1, 2018). Just Culture: Balancing Safety and Accountability. Ashgate Publishing. ISBN 9780754672678.
  12. Marx, David (April 17, 2001). "Patient Safety and the "Just Culture": A Primer for Health Care Executives" (PDF).