Team error


Team error refers to errors that occur in settings where multiple people work together. Engaging multiple people to perform a task does not ensure that the task will be done correctly: dependency among team members increases the likelihood of human error by undermining defense mechanisms that appear independent of one another. Team error is one such dependency, an error by one or more members of a group that allows other members of the same group to make a mistake.



Examples

Pilot/copilot error

Air Florida Flight 90, departing Washington's National Airport in January 1982, [1] had not been properly de-iced, and snow accumulated on the leading edges of the wings as the flight crew prepared for takeoff. During the after-start checklist, the copilot called out "engine anti-ice," and the captain responded "off," failing to turn on a system that should have been on. Consequently, ice interfered with the engine pressure ratio (EPR) system, the primary indication of engine thrust. The copilot called the captain's attention to the anomalous engine indications at least five times in the final moments before the plane rotated off the runway, but he did not oppose the captain's decision to continue the takeoff. Given the engine indications, he should have insisted on aborting it. The plane crashed, killing 74 of the 79 people on board.
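
The EPR anomaly can be illustrated with a short worked ratio. EPR is the ratio of turbine discharge total pressure to compressor inlet total pressure (the probe designations below follow the convention for the 737's JT8D engines; the figures are those reported in the NTSB investigation):

\[ \mathrm{EPR} = \frac{P_{t7}}{P_{t2}} \]

Ice blocking the inlet probe traps a falsely low pressure in the denominator, so the gauge over-reads. The crew set an indicated takeoff EPR of 2.04, but with the probes iced the engines were actually delivering an EPR of only about 1.70, well below takeoff thrust.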

Diffusion of responsibility error

At a US Department of Energy production facility in the late 1980s, [1] the shift manager in the operating contractor organization, along with a small group of shift supervisors, planned and carried out the replacement of a faulty pump. Earlier attempts to accomplish the work through the formal work control system had failed, and the supervisory group reasoned that continued reliance on that system would fare no better. Schedule pressures and frustration led the men to take matters into their own hands and do the work themselves. The team violated procedures governing the work control system, quality inspections, worker certification, and union labor rules governing work assignments and responsibilities. No single salaried supervisor would otherwise have considered doing a union mechanic's job; in the group situation, the rules were discounted.

Halo effect

Enron was an American energy company based in Houston, Texas. In 2001, it was revealed that the company had been engaging in accounting fraud: Enron had hidden billions of dollars in debt through various accounting loopholes, and the company's shareholders filed a $40 billion lawsuit. The downfall of Enron was due to a lack of ethical leadership and coordination. Leaders promoted a win-at-any-cost culture through systems such as "rank and yank," encouraging employees to achieve short-term goals by any means. The resulting pressure to meet sales goals inevitably impacted employees' ethical decision-making. Enron shareholders lost $74 billion in the years leading up to its bankruptcy, and its employees lost their jobs and their pensions. [2]

Medical error

Medical error can occur when there is poor coordination and communication among doctors and nurses, organizational practices that are out of date, and a cultural norm of protecting oneself rather than protecting patients. Reducing it requires healthcare workers to communicate more effectively and to be trained toward a shared knowledge and understanding, so that errors are prevented before they occur. This applies to healthcare professionals of all kinds, including military medical personnel and providers of general and specialized care. [3]

Firefighting

Team error among firefighters is extremely dangerous: it puts the public at risk, and it puts the firefighters themselves at risk. A miscommunication or a failure to relay information can be detrimental to everyone involved. For example, failure to communicate where a fire is located could lead crews to walk straight into it, or into a place where they become trapped. If there were occupants in the burning building, rescuing them would then become harder, because firefighters would also have to focus on rescuing their trapped colleagues. After many accidents stemming from communication failures, fire departments have published guidance on how to avoid team error, how to recognize it, and how to swiftly put a stop to it. [4]

Strategies to reduce the occurrence of team errors

Strategies for reducing the occurrence of team errors are proposed in the DOE Human Performance Improvement Handbook. [1]


Related Research Articles

Flight engineer: Air crew member responsible for systems monitoring

A flight engineer (FE), also sometimes called an air engineer, is the member of an aircraft's flight crew who monitors and operates its complex aircraft systems. In the early era of aviation, the position was sometimes referred to as the "air mechanic". Flight engineers can still be found on some larger fixed-wing airplanes and helicopters, and a similar crew position exists on some spacecraft. In most modern aircraft, complex systems are monitored and adjusted by electronic microprocessors and computers, which has led to the elimination of the flight engineer's position.

Diffusion of responsibility: Sociopsychological phenomenon whereby a person is less likely to take responsibility for action or inaction when others are present

Diffusion of responsibility is a sociopsychological phenomenon whereby a person is less likely to take responsibility for action or inaction when other bystanders or witnesses are present. Considered a form of attribution, the individual assumes that others either are responsible for taking action or have already done so.

Scandinavian Airlines System Flight 751: 1991 aviation incident

Scandinavian Airlines System Flight 751 was a regularly scheduled Scandinavian Airlines passenger flight from Stockholm, Sweden, to Warsaw, Poland, via Copenhagen, Denmark. On 27 December 1991, a McDonnell Douglas MD-81 operating the flight, registration OY-KHO, piloted by Danish captain Stefan G. Rasmussen (44) and Swedish first officer Ulf Cedermark (34), both experienced pilots with 8,000 and 3,000 flight hours respectively, was forced to make an emergency landing in a field near Gottröra, Sweden. Ice that had collected on the wings' inner roots before takeoff broke off and was ingested into the engines as the aircraft became airborne, ultimately resulting in the failure of both engines. All 129 passengers and crew aboard survived.

Teamwork: Collaborative effort of a team to achieve a common goal

Teamwork is the collaborative effort of a group to achieve a common goal or to complete a task in the most effective and efficient way. This concept is seen within the greater framework of a team, which is a group of interdependent individuals who work together towards a common goal. Key characteristics of a team include a shared goal, interdependence, boundedness and stability, the ability to manage its own work and internal processes, and operation within a bigger social system. A basic requirement for effective teamwork is an adequate team size; the context matters, and team sizes can vary depending upon the objective. A team must include at least two members, and most teams range in size from two to 100. Sports teams generally have fixed sizes based upon set rules, while work teams may change in size depending upon the phase and complexity of the objective. Teams need to be able to leverage resources to be productive, and need clearly defined roles so that every member has a clear purpose. Teamwork is present in any context where a group of people work together to achieve a common goal, including industrial organizations, athletics, schools, and the healthcare system. In each of these settings, the level of teamwork and interdependence can vary from low to intermediate to high, depending on the amount of communication, interaction, and collaboration present between team members.

Crew resource management or cockpit resource management (CRM) is a set of training procedures for use in environments where human error can have devastating effects. Used primarily for improving aviation safety, CRM focuses on interpersonal communication, leadership, and decision making in the cockpit of an airliner. Its pioneer was David Beaty, a former Royal Air Force pilot and later a BOAC pilot, who wrote the seminal book The Human Factor in Aircraft Accidents, first published in 1969. Despite the considerable development of electronic aids since then, many of the principles he developed continue to prove effective today.

Human reliability is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine, and nuclear power. Human performance can be affected by many factors, such as age, state of mind, physical health, attitude, emotions, and a propensity for certain common mistakes, errors, and cognitive biases.

USAir Flight 405: March 1992 plane crash in New York, US

USAir Flight 405 was a regularly scheduled domestic passenger flight between LaGuardia Airport in Queens, New York City, New York, and Cleveland, Ohio. On March 22, 1992, a USAir Fokker F28, registration N485US, flying the route, crashed in poor weather in a partially inverted position in Flushing Bay, shortly after liftoff from LaGuardia. The undercarriage lifted off from the runway, but the airplane failed to gain lift, flying only several meters above the ground. The aircraft then veered off the runway and hit several obstructions before coming to rest in Flushing Bay, just beyond the end of the runway. Of the 51 people on board, 27 were killed, including the captain and a member of the cabin crew.

Pilot error: Decision, action or inaction by a pilot of an aircraft

Pilot error generally refers to an accident in which an action or decision made by the pilot was the cause or a contributing factor, but it also includes the pilot's failure to make a correct decision or take proper action. Errors are intentional actions that fail to achieve their intended outcomes. The Chicago Convention defines an accident as "An occurrence associated with the operation of an aircraft [...] in which [...] a person is fatally or seriously injured [...] except when the injuries are [...] inflicted by other persons." Hence the definition of "pilot error" does not include deliberate crashes.

Galaxy Airlines Flight 203: 1985 aviation accident

Galaxy Airlines Flight 203 was a Lockheed L-188 Electra 4-engine turboprop, registration N5532, operating as a non-scheduled charter flight from Reno, Nevada to Minneapolis, Minnesota, which crashed on January 21, 1985 shortly after takeoff. All but one of the 71 on board died.

Human error refers to something having been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause or contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems. Human error is one of the many contributing causes of risk events.

LAPA Flight 3142: 1999 aviation accident

LAPA Flight 3142 was a scheduled Buenos Aires–Córdoba flight operated by the Argentine airline Líneas Aéreas Privadas Argentinas. The flight was operated with a Boeing 737-204C, registration LV-WRZ, that crashed on 31 August 1999 at 20:54 local time while attempting to take off from Aeroparque Jorge Newbery and failing to get airborne. The crash resulted in 65 fatalities — 63 of the occupants of the aircraft and 2 on the ground — as well as injuries, some serious, to at least a further 34 people.

TAM Transportes Aéreos Regionais Flight 402: 1996 aviation accident

TAM Transportes Aéreos Regionais Flight 402 was a scheduled domestic flight from São Paulo–Congonhas International Airport in São Paulo, Brazil to Recife International Airport in Recife via Santos Dumont Airport in Rio de Janeiro. On 31 October 1996, at 8:27 (UTC-2), the starboard engine of the Fokker 100 operating the route reversed thrust while the aircraft was climbing away from the runway at Congonhas. The aircraft stalled and rolled beyond control to the right, then struck two buildings and crashed into several houses in a heavily populated area only 25 seconds after takeoff. All 95 people on board were killed, as well as another 4 on the ground. It is the fourth deadliest accident in Brazilian aviation history, and was the second deadliest at the time. It is also the deadliest aviation accident involving a Fokker 100.

Group decision-making is a situation faced when individuals collectively make a choice from the alternatives before them. The decision is then no longer attributable to any single individual who is a member of the group, because all the individuals and social group processes, such as social influence, contribute to the outcome. The decisions made by groups are often different from those made by individuals. In workplace settings, collaborative decision-making is one of the most successful models for generating buy-in from other stakeholders, building consensus, and encouraging creativity. According to the idea of synergy, decisions made collectively also tend to be more effective than decisions made by a single individual; certain collaborative arrangements have the potential to generate better net performance outcomes than individuals acting on their own. Under normal everyday conditions, collaborative or group decision-making is often preferred and generates more benefits than individual decision-making when there is time for proper deliberation, discussion, and dialogue. This can be achieved through the use of committees, teams, groups, partnerships, or other collaborative social processes.

Groupthink is a psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Cohesiveness, or the desire for cohesiveness, in a group may produce a tendency among its members to agree at all costs. This causes the group to minimize conflict and reach a consensus decision without critical evaluation.

Japan Air Lines Flight 446: 1972 aviation accident

Japan Air Lines Flight 446 was a Japan Air Lines flight from Sheremetyevo International Airport of Moscow, Russian SFSR, Soviet Union to Tokyo International Airport in Ōta, Tokyo, Japan.

Imperial Airlines Flight 201/8: 1961 aviation accident

Imperial Airlines Flight 201/8 was a charter flight by the United States Army to transport new recruits to Columbia, South Carolina for training. On November 8, 1961, the aircraft crashed as it attempted to land at Byrd Field, near Richmond, Virginia. This was the second deadliest accident in American history for a single civilian aircraft.

Team composition and cohesion in spaceflight missions

Selection, training, cohesion and psychosocial adaptation influence performance and, as such, are relevant factors to consider while preparing for costly, long-duration spaceflight missions in which the performance objectives will be demanding, endurance will be tested and success will be critical.

Aeroflot Flight 3932: 1973 plane crash in the Soviet Union

Aeroflot Flight 3932 was a flight operated by Aeroflot from Koltsovo Airport to Omsk Tsentralny Airport. On 30 September 1973 the Tupolev Tu-104 operating the route crashed shortly after takeoff from Sverdlovsk, killing all 108 passengers and crew on board.

Automation bias: Propensity for humans to favor suggestions from automated decision-making systems

Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information produced without automation, even when that information is correct. Automation bias stems from the social psychology literature, which found a bias in human-human interaction whereby people assign more positive evaluations to decisions made by humans than to a neutral object. The same type of positivity bias has been found for human-automation interaction, where automated decisions are rated more positively than neutral ones. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids, largely to factor out possible human error. Errors of automation bias tend to occur when decision-making is dependent on computers or other automated aids and the human is in an observational role but able to make decisions. Examples of automation bias range from urgent matters like flying a plane on automatic pilot to such mundane matters as the use of spell-checking programs.

Japan Air Lines Cargo Flight 1045: 1977 cargo flight that crashed in Anchorage, Alaska, U.S.

Japan Air Lines Cargo Flight 1045 was a charter flight on January 13, 1977, from Grant County, Washington, United States to Tokyo, Japan with a stopover in Anchorage, Alaska, United States. The flight crashed during the initial climb phase, shortly after takeoff from Anchorage due to pilot intoxication. All of those on board, including three flight crew members and two cattle handlers, were killed in the crash.

References

  1. DOE-HDBK-1028-2009, Human Performance Improvement Handbook. U.S. Department of Energy. https://www.standards.doe.gov/standards-documents/1000/1028-BHdbk-2009-v1
  2. "How Toxic Leadership Brought Down The Enron Empire | PSY 816: Dysfunctional Leadership (Mastroianni)". sites.psu.edu. Retrieved 2021-11-30.
  3. Alonso, Alexander; Baker, David P.; Holtzman, Amy; Day, Rachel; King, Heidi; Toomey, Lauren; Salas, Eduardo (2006-09-01). "Reducing medical error in the Military Health System: How can team training help?". Human Resource Management Review. Large Scale Human Resource Initiatives in the U.S. Federal Government. 16 (3): 396–415. doi:10.1016/j.hrmr.2006.05.006. ISSN 1053-4822.
  4. Fruhen, Laura Sophie; Keith, Nina (2014-06-01). "Team cohesion and error culture in risky work environments". Safety Science. 65: 20–27. doi:10.1016/j.ssci.2013.12.011. ISSN 0925-7535.