Normalcy bias

Normalcy bias, or normality bias, is a cognitive bias that leads people to disbelieve or minimize threat warnings.[1] Consequently, individuals underestimate the likelihood of a disaster, when it might affect them, and its potential adverse effects.[2] The normalcy bias causes many people to prepare inadequately for natural disasters, market crashes, and calamities caused by human error. About 80% of people reportedly display normalcy bias during a disaster.[3]

The normalcy bias can manifest in response to warnings about disasters and actual catastrophes. Such events can range in scale from incidents such as traffic collisions to global catastrophic risks. The event may involve socially constructed phenomena, such as loss of money in a market crash, or direct threats to the continuity of life, as in a natural disaster like a tsunami or violence in war.

Normalcy bias has also been called analysis paralysis, the ostrich effect,[4] and, by first responders, the negative panic.[5] The opposite of normalcy bias is overreaction, or worst-case scenario bias,[6][7] in which small deviations from normality are treated as signals of an impending catastrophe.

Phases

Amanda Ripley, author of The Unthinkable: Who Survives When Disaster Strikes – and Why, identifies common response patterns of people in disasters and explains that there are three phases of response: "denial, deliberation, and the decisive moment". With regard to the first phase, described as "denial", Ripley found that people were likely to deny that a disaster was happening. It takes time for the brain to process information and recognize that a disaster is a threat. In the "deliberation" phase, people have to decide what to do. If a person does not have a plan in place, this causes a serious problem, because the effects of life-threatening stress on the body (e.g., tunnel vision, auditory exclusion, time dilation, out-of-body experiences, or reduced motor skills) limit an individual's ability to perceive information and make plans. Ripley asserts that in the third and final phase, described as the "decisive moment", a person must act quickly and decisively. Failure to do so can result in injury or death. She explains that the faster someone can get through the denial and deliberation phases, the quicker they will reach the decisive moment and begin to take action.[8]

Examples

Normalcy bias can occur during car crashes.

Journalist David McRaney wrote that "Normalcy bias flows into the brain no matter the scale of the problem. It will appear whether you have days and plenty of warning or are blindsided with only seconds between life and death."[9] It can manifest itself in phenomena such as car crashes, which occur very frequently but which the average individual experiences only rarely, if ever. It also manifests itself in connection with events in world history. According to a 2001 study by sociologist Thomas Drabek, when people are asked to leave in anticipation of a disaster, most check with four or more sources of information before deciding what to do. This process of checking in with others, known as milling, is common in disasters.[10]

1827 illustration of the eruption of Vesuvius in 79 CE, from Pacini's opera L'ultimo giorno di Pompei

As for events in world history, the normalcy bias can explain why, when the volcano Vesuvius erupted in 79 CE, the residents of Pompeii watched for hours without evacuating.[11] It can explain why thousands of people refused to leave New Orleans as Hurricane Katrina approached,[12] and why at least 70% of 9/11 survivors spoke with others before evacuating.[10] Officials at the White Star Line made insufficient preparations to evacuate the passengers of the Titanic, and people refused evacuation orders, possibly because they underestimated the odds of a worst-case scenario and minimized its potential impact.[13] Similarly, experts connected with the Fukushima nuclear power plant were strongly convinced that a multiple-reactor meltdown could never occur.[14]

A website for police officers has noted that members of that profession have "all seen videos of officers who were injured or killed while dealing with an ambiguous situation, like the old one of a father with his young daughter on a traffic stop". In the video referred to, "the officer misses multiple threat cues...because the assailant talks lovingly about his daughter and jokes about how packed his minivan is. The officer only seems to react to the positive interactions, while seeming to ignore the negative signals. It's almost as if the officer is thinking, 'Well I've never been brutally assaulted before so it certainly won't happen now.' No one is surprised at the end of the video when the officer is violently attacked, unable to put up an effective defense." This professional failure, notes the website, is a consequence of normalcy bias.[15]

Normalcy bias, David McRaney has written, "is often factored into fatality predictions in everything from ship sinkings to stadium evacuations". Disaster movies, he adds, "get it all wrong. When you and others are warned of danger, you don't evacuate immediately while screaming and flailing your arms." McRaney notes that in the book Big Weather, tornado chaser Mark Svenvold discusses "how contagious normalcy bias can be. He recalled how people often tried to convince him to chill out while fleeing from impending doom. Even when tornado warnings were issued, people assumed it was someone else's problem. Stake-holding peers, he said, would try to shame him into denial so they could remain calm. They didn't want him deflating their attempts at feeling normal".[9]

Hypothesized cause

The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single, sometimes default, solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack: predators are less likely to see prey that is not moving.[10]

Effects

About 80% of people reportedly display normalcy bias in disasters.[3] Normalcy bias has been described as "one of the most dangerous biases we have". The lack of preparation for disasters often leads to inadequate shelter, supplies, and evacuation plans. Even when all these things are in place, individuals with a normalcy bias often refuse to leave their homes.[16][17]

Normalcy bias can cause people to drastically underestimate the effects of a disaster. As a result, they assume that they will be safe even though information from the radio, television, or neighbors gives them reason to believe there is a risk. The normalcy bias creates a cognitive dissonance that people then must work to eliminate. Some manage to eliminate it by refusing to believe new warnings coming in and refusing to evacuate (maintaining the normalcy bias), while others eliminate the dissonance by escaping the danger. The possibility that some people may refuse to evacuate causes significant problems in disaster planning.[18]

Prevention

The negative effects of normalcy bias can be combated through the four stages of disaster response: preparation, including publicly acknowledging the possibility of disaster and forming contingency plans; warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them; impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster-relief teams work in tandem; and aftermath, re-establishing equilibrium after the fact by providing supplies and aid to those in need.[19]

References

  1. Drabek, Thomas E. (1986). Human System Responses to Disaster: An Inventory of Sociological Findings. New York: Springer-Verlag. p. 72. ISBN 978-1-4612-4960-3. OCLC 852789578. "The initial response to a disaster warning is disbelief."
  2. Omer, Haim; Alon, Nahman (April 1994). "The continuity principle: A unified approach to disaster and trauma". American Journal of Community Psychology. 22 (2): 275–276. doi:10.1007/BF02506866. PMID 7977181. S2CID 21140114. "... normalcy bias consists in underestimating the probability of disaster, or the disruption involved in it ..."
  3. Inglis-Arkell, Esther (May 2, 2013). "The frozen calm of normalcy bias". Gizmodo. Retrieved May 23, 2017.
  4. Ince, Wyne (October 23, 2017). Thoughts of Life and Time. Wyne Ince. p. 122. ISBN 978-1-973727-15-6. Retrieved December 20, 2017.
  5. McRaney, David (2012). You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself. Gotham Books. p. 54. ISBN 978-1-59240-736-1. Retrieved December 20, 2017.
  6. Schneier, Bruce. "Worst-case thinking makes us nuts, not safe", CNN, May 12, 2010 (retrieved April 18, 2014); reprinted in Schneier on Security, May 13, 2010 (retrieved April 18, 2014).
  7. Evans, Dylan. "Nightmare Scenario: The Fallacy of Worst-Case Thinking", Risk Management, April 2, 2012 (retrieved April 18, 2014); from Risk Intelligence: How to Live with Uncertainty, by Dylan Evans, Free Press/Simon & Schuster, 2012. ISBN 978-1-4516-1090-1.
  8. Ripley, Amanda (June 10, 2008). The Unthinkable: Who Survives When Disaster Strikes – and Why. Potter/Ten Speed Press/Harmony. ISBN 978-0-307-44927-6.
  9. McRaney, David (2012). You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You're Deluding Yourself. Gotham Books. p. 55. ISBN 978-1-59240-736-1.
  10. Ripley, Amanda (April 25, 2005). "How to Get Out Alive". Time. 165 (18): 58–62. PMID 16128022. Retrieved November 11, 2013.
  11. Vaz, Estelita; Joanaz de Melo, Cristina; Costa Pinto, Lígia (2017). Environmental History in the Making. Springer Publishing. ISBN 978-3-319-41085-2.
  12. Strandberg, Todd. "The Normalcy Bias and Bible Prophecy". Prophezine. Retrieved December 20, 2017.
  13. Hoffman, Bryce (May 16, 2017). Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. Crown Publishing. p. 80. ISBN 978-1-101-90597-5.
  14. Saito, William (April 20, 2017). "What Fukushima Disaster Taught Me About Risk Management in Cybersecurity". Forbes. Retrieved December 20, 2017.
  15. Smith, Dave (August 20, 2015). "Normalcy Bias". Police: The Law Enforcement Magazine. Retrieved May 23, 2017.
  16. "Beware Your Dangerous Normalcy Bias". Gerold Blog. April 27, 2013. Retrieved May 24, 2017.
  17. "Disaster Prep for the Rest of Us: Normalcy bias". The Coos Bay World. Retrieved September 22, 2021.
  18. Oda, Katsuya. "Information Technology for Advancement of Evacuation" (PDF). National Institute for Land and Infrastructure Management.
  19. Valentine, Pamela V.; Smith, Thomas Edward (2002). "Finding Something to Do: The Disaster Continuity Care Model". Brief Treatment and Crisis Intervention. 2 (2): 183–196. doi:10.1093/brief-treatment/2.2.183.