Belief perseverance

Belief perseverance (also known as conceptual conservatism [1] ) is maintaining a belief despite new information that firmly contradicts it. [2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect). [3] For example, in a 2014 article in The Atlantic, journalist Cari Romm describes a study involving vaccination hesitancy. In the study, the subjects expressed concerns about the side effects of flu shots. After being told that the vaccination was completely safe, they became even less willing to accept it. The reassurance pushed them to distrust the vaccine even more, reinforcing the belief they already held. [4] [5]

There are three kinds of backfire effects: the Familiarity Backfire Effect (from making myths more familiar), the Overkill Backfire Effect (from providing too many arguments), and the Worldview Backfire Effect (from providing evidence that threatens someone's worldview). [6] According to Cook and Lewandowsky (2011), a number of techniques can help debunk misinformation, such as emphasizing the core facts rather than the myth, providing explicit warnings that upcoming information is false, and providing alternative explanations to fill the gaps left by debunking the misinformation. [7] However, more recent studies have provided evidence that backfire effects are not as likely as once thought. [8]

Since rationality involves conceptual flexibility, [9] [10] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature". [11]

Evidence from experimental psychology

According to Lee Ross and Craig A. Anderson, "beliefs are remarkably resilient in the face of empirical challenges that seem logically devastating". [12]

The first study of belief perseverance was carried out by Festinger, Riecken, and Schachter. [13] These social psychologists spent time with members of a doomsday cult who believed the world would end on December 21, 1954. [13] Despite the failure of the forecast, most believers continued to adhere to their faith. [13] [14] [15] In When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (1956) and A Theory of Cognitive Dissonance (1957), Festinger proposed that human beings strive for internal psychological consistency in order to function mentally in the real world. [13] A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance. [13] [14] [16] They tend to make changes to justify the stressful behavior, either by adding new parts to the cognition causing the psychological dissonance (rationalization) or by avoiding circumstances and contradictory information likely to increase the magnitude of the cognitive dissonance (confirmation bias). [13] [14] [16]

When asked to reappraise probability estimates in light of new information, subjects displayed a marked tendency to give insufficient weight to the new evidence. They refused to acknowledge the inaccurate prediction as a reflection of the overall validity of their faith. In some cases, subjects reported having a stronger faith in their religion than before. [17]

In a separate study, mathematically capable teenagers and adults were given seven arithmetic problems and first asked to estimate approximate solutions by hand. Then, using a calculator rigged to produce increasingly erroneous figures (e.g., yielding 252 × 1.2 = 452.4, when the correct answer is 302.4), they were asked for exact answers. About half of the participants went through all seven tasks, commenting on their estimating abilities or tactics, without ever letting go of the belief that calculators are infallible. They simply refused to admit that their prior assumptions about calculators could have been incorrect. [18]
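
A minimal sketch in Python may make the manipulation concrete. The additive error schedule below is an assumption chosen only because it reproduces the 252 × 1.2 example above; the study's actual error scheme is not specified here.

```python
# Hypothetical sketch of a calculator rigged to give increasingly
# erroneous figures. The +150-per-problem additive error is an assumed
# schedule, not the one used in the study; it is chosen to match the
# example above (252 x 1.2 displayed as 452.4 instead of 302.4).

def rigged_multiply(a: float, b: float, problem_number: int) -> float:
    """Return a * b plus an error that grows with each problem."""
    error = 150.0 * problem_number  # assumed: +150, +300, +450, ...
    return round(a * b + error, 1)

for n, (a, b) in enumerate([(252, 1.2), (17, 31), (88, 4.5)], start=1):
    print(f"{a} x {b}: true = {a * b:g}, displayed = {rigged_multiply(a, b, n)}")
```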

Lee Ross and Craig A. Anderson led some subjects to the false belief that there existed a positive correlation between a firefighter's stated preference for taking risks and their occupational performance. Other subjects were told that the correlation was negative. The participants were then thoroughly debriefed and informed that there was no link between risk taking and performance. These authors found that post-debriefing interviews pointed to significant levels of belief perseverance. [19]

In another study, subjects spent about four hours following the instructions of a hands-on instructional manual. At a certain point, the manual introduced a formula which led them to believe that sphere volumes were 50 percent larger than they actually are. Subjects were then given an actual sphere and asked to determine its volume: first by using the formula, and then by filling the sphere with water, transferring the water to a box, and directly measuring the volume of the water in the box. In the last experiment in this series, all 19 subjects held a Ph.D. degree in a natural science, were employed as researchers or professors at two major universities, and carried out the comparison between the two volume measurements a second time with a larger sphere. All but one of these scientists clung to the spurious formula despite their empirical observations. [20]
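
For concreteness, the size of the discrepancy the subjects faced can be illustrated as follows. The true volume of a sphere is V = (4/3)πr³; the formula below, V = 2πr³, is a hypothetical stand-in that overstates the volume by exactly 50 percent, the size of the error the manual induced (the study's actual formula is not reproduced here).

```python
# Illustrative only: 2*pi*r**3 is a hypothetical stand-in for the
# manual's spurious formula, chosen because it overstates the true
# sphere volume (4/3)*pi*r**3 by exactly 50 percent.
import math

def true_volume(radius: float) -> float:
    """Correct sphere volume."""
    return (4.0 / 3.0) * math.pi * radius**3

def spurious_volume(radius: float) -> float:
    """Hypothetical formula implying a 50 percent overestimate."""
    return 2.0 * math.pi * radius**3

r = 10.0  # radius in cm (arbitrary)
print(f"formula says:   {spurious_volume(r):.0f} cm^3")
print(f"water measures: {true_volume(r):.0f} cm^3")
print(f"ratio: {spurious_volume(r) / true_volume(r):.2f}")  # 1.50
```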

Even when we deal with ideologically neutral conceptions of reality, when these conceptions have been recently acquired, when they came to us from unfamiliar sources, when they were assimilated for spurious reasons, when their abandonment entails little tangible risks or costs, and when they are sharply contradicted by subsequent events, we are, at least for a time, disinclined to doubt such conceptions on the verbal level and unlikely to let go of them in practice.

–Moti Nissani [1]

In cultural innovations

Physicist Max Planck wrote that "the new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it". [21] For example, the heliocentric theory of the ancient Greek astronomer Aristarchus of Samos had to be rediscovered about 1,800 years later, and even then it underwent a major struggle before astronomers took its veracity for granted. [22]

Belief perseverance is frequently accompanied by intrapersonal cognitive processes. "When the decisive facts did at length obtrude themselves upon my notice," wrote the chemist Joseph Priestley, "it was very slowly, and with great hesitation, that I yielded to the evidence of my senses." [23]

In education

Students often "cling to ideas that form part of their world view even when confronted by information that does not coincide with this view." [24] For example, students may spend months studying the solar system and do well on related tests, yet still believe that the phases of the moon are produced by Earth's shadow. What they learned failed to displace the beliefs they held beforehand. [25]

Causes

The causes of belief perseverance remain unclear. Experiments in the 2010s suggest that neurochemical processes in the brain underlie the strong attentional bias of reward learning. Similar processes could underlie belief perseverance. [26]

Peter Marris suggests that the process of abandoning a conviction is similar to the working out of grief. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity". [27]

Philosopher of science Thomas Kuhn points to the resemblance between conceptual change and Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady). Hence, the difficulty of switching from one conviction to another could be traced to the difficulty of rearranging one's perceptual or cognitive field. [28]

Related Research Articles

Social psychology is the scientific study of how thoughts, feelings, and behaviors are influenced by the actual, imagined, or implied presence of others. Social psychologists typically explain human behavior as a result of the relationship between mental states and social situations, studying the social conditions under which thoughts, feelings, and behaviors occur, and how these variables influence social interactions.

Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignore contrary information, or interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, through education and training in critical thinking skills.

Leon Festinger

Leon Festinger was an American social psychologist who originated the theory of cognitive dissonance and social comparison theory. His theories and research are largely credited with the rejection of the previously dominant behaviorist view of social psychology, by demonstrating the inadequacy of stimulus-response conditioning accounts of human behavior. Festinger is also credited with advancing the use of laboratory experimentation in social psychology, although he simultaneously stressed the importance of studying real-life situations, a principle he practiced when personally infiltrating a doomsday cult. He is also known in social network theory for the proximity effect.

In the field of psychology, cognitive dissonance is the perception of contradictory information and the mental toll it takes. Relevant items of information include a person's actions, feelings, ideas, beliefs, values, and things in the environment. Cognitive dissonance is typically experienced as psychological stress when people participate in an action that goes against one or more of those things. According to this theory, when an action or idea is psychologically inconsistent with another, people do all in their power to change one or the other so that they become consistent. The discomfort is triggered by the person's belief clashing with newly perceived information, wherein the individual tries to find a way to resolve the contradiction to reduce their discomfort.

Self-perception theory (SPT) is an account of attitude formation developed by psychologist Daryl Bem. It asserts that people develop their attitudes by observing their own behavior and concluding what attitudes must have caused it. The theory is counterintuitive in nature, as the conventional wisdom is that attitudes determine behaviors. Furthermore, the theory suggests that people induce attitudes without accessing internal cognition and mood states. The person interprets their own overt behaviors rationally in the same way they attempt to explain others' behaviors.

In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

A debunker is a person or organization that exposes or discredits claims believed to be false, exaggerated, or pretentious. The term is often associated with skeptical investigation of controversial topics such as UFOs, claimed paranormal phenomena, cryptids, conspiracy theories, alternative medicine, religion, or exploratory or fringe areas of scientific or pseudoscientific research.

Disconfirmed expectancy is a psychological term for what is commonly known as a failed prophecy. According to the American social psychologist Leon Festinger's theory of cognitive dissonance, disconfirmed expectancies create a state of psychological discomfort because the outcome contradicts expectancy. Upon recognizing the falsification of an expected event, an individual will experience the competing cognitions "I believe [X]" and "I observed [Y]". The individual must either discard the now-disconfirmed belief or justify why it has not actually been disconfirmed. As such, disconfirmed expectancy and the factors surrounding the individual's consequent actions have been studied in various settings.

Selective exposure is a theory within the practice of psychology, often used in media and communication research, that historically refers to individuals' tendency to favor information which reinforces their pre-existing views while avoiding contradictory information. Selective exposure has also been known and defined as "congeniality bias" or "confirmation bias" in various texts throughout the years.

The Semmelweis reflex or "Semmelweis effect" is a metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or paradigms.

Effort justification is an idea and paradigm in social psychology stemming from Leon Festinger's theory of cognitive dissonance. Effort justification is a person's tendency to ascribe a greater value to an outcome they put effort into achieving than the objective value of that outcome.

Conceptual change is the process whereby concepts and relationships between them change over the course of an individual person's lifetime or over the course of history. Research in four different fields – cognitive psychology, cognitive developmental psychology, science education, and history and philosophy of science – has sought to understand this process. Indeed, the convergence of these four fields, in their effort to understand how concepts change in content and organization, has led to the emergence of an interdisciplinary sub-field in its own right. This sub-field is referred to as "conceptual change" research.

True-believer syndrome is an informal or rhetorical term coined by M. Lamar Keene in his 1976 book The Psychic Mafia. Keene used the term to refer to people who continued to believe in a paranormal event or phenomenon even after it had been proven to have been staged. He considered it to be a cognitive disorder and regarded it as a key factor in the success of many psychic mediums.

Motivated reasoning is a cognitive and social response in which individuals, consciously or unconsciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and to reject new information that contradicts those beliefs, despite contrary evidence.

In social psychology, the boomerang effect, also known as "reactance", refers to the unintended consequence of an attempt at persuasion resulting in the adoption of an opposing position instead. It is sometimes also referred to as "the theory of psychological reactance", which states that attempts to restrict a person's freedom often produce an "anticonformity boomerang effect". In other words, people tend to choose the opposite of what someone is urging on them because of how it is presented: typically, the more aggressively a message is delivered, the more likely people are to do the opposite. For example, a person confronted with a sign reading "KEEP OFF LAWN" may feel more inclined to walk on the lawn, whereas a sign reading "please stay off my grass" is more likely to be obeyed.

In cognitive psychology and decision science, conservatism or conservatism bias is the tendency to revise one's belief insufficiently when presented with new evidence. People exhibiting this bias over-weigh the prior distribution and under-weigh new sample evidence relative to ideal Bayesian belief revision.
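
As a rough illustration of the contrast with an ideal Bayesian update, the sketch below models conservatism by raising the likelihood ratio to a power w < 1; this is one common simplification, not the only formalization in the literature, and the particular numbers are arbitrary.

```python
# Illustrative model of conservatism bias: damping the likelihood
# ratio with an exponent w < 1 yields insufficient belief revision
# relative to the ideal Bayesian posterior (w = 1). The value of w
# below is arbitrary, chosen only for demonstration.

def updated_belief(prior: float, likelihood_ratio: float, w: float = 1.0) -> float:
    """Posterior P(H) from prior P(H) and LR = P(E|H) / P(E|not H)."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio**w
    return posterior_odds / (1.0 + posterior_odds)

prior = 0.5  # initially indifferent between the hypotheses
lr = 9.0     # evidence is nine times likelier under H than under not-H
print(updated_belief(prior, lr))         # ideal Bayesian: 0.90
print(updated_belief(prior, lr, w=0.5))  # conservative:   0.75
```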

Stephan Lewandowsky

Stephan Lewandowsky is an Australian psychologist. He has worked in both the United States and Australia, and is currently based at the University of Bristol, UK, where he is the chair of cognitive psychology at the School of Psychological Science. His research, which originally pertained to computer simulations of people's decision-making processes, recently has focused on the public's understanding of science and why people often embrace beliefs that are sharply at odds with scientific evidence.

The gateway belief model (GBM) suggests that public perception of the degree of expert or scientific consensus on an issue functions as a so-called "gateway" cognition. Perception of scientific agreement is suggested to be a key step towards acceptance of related beliefs. Increasing the perception that there is normative agreement within the scientific community can increase individual support for an issue. A perception of disagreement may decrease support for an issue.

Vicarious cognitive dissonance is the state of negative arousal in an individual from observing a member of their in-group behave in counterattitudinal ways. The phenomenon is distinguished from the type of cognitive dissonance proposed by Leon Festinger, which can be referred to as personal cognitive dissonance, because the discomfort is experienced vicariously by an observer rather than the actor engaging in inconsistent behavior. Like personal cognitive dissonance, vicarious cognitive dissonance can lead to changes in the observer’s attitudes and behavior to reduce psychological stress.

References

  1. Nissani, Moti (December 1990). "A Cognitive Reinterpretation of Stanley Milgram's Observations on Obedience to Authority". American Psychologist. 45 (12): 1384–1385. doi:10.1037/0003-066X.45.12.1384. Retrieved November 21, 2021.
  2. Baumeister, R. F.; et al., eds. (2007). Encyclopedia of Social Psychology. Thousand Oaks, CA: Sage. pp. 109–110. ISBN   9781412916707.
  3. Silverman, Craig (June 17, 2011). "The Backfire Effect: More on the press’s inability to debunk bad information". Columbia Journalism Review , Columbia University (New York City).
  4. Romm, Cari (December 12, 2014). "Vaccine Myth-Busting Can Backfire". The Atlantic .
  5. Nyhan, Brendan; Reifler, Jason (January 9, 2015). "Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information". Vaccine.
  6. Swire-Thompson, Briony; DeGutis, Joseph; Lazer, David (September 2020). "Searching for the backfire effect: Measurement and design considerations". Journal of Applied Research in Memory and Cognition. 9 (3): 286–299. doi:10.1016/j.jarmac.2020.06.006. ISSN   2211-369X. PMC   7462781 . PMID   32905023.
  7. Cook, J.; Lewandowsky, S. (November 5, 2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland. ISBN 978-0-646-56812-6.
  8. Lewandowsky, Stephan; Cook, John; Lombardi, Doug (2020), Debunking Handbook 2020, Databrary, pp. 9–11, doi:10.17910/b7.1182 , retrieved 2021-01-20
  9. Voss, J. F.; et al., eds. (1991). Informal Reasoning and Education. Hillsdale: Erlbaum. p. 172.
  10. West, L.H.T.; et al., eds. (1985). Cognitive Structure and Conceptual Change. Orlando, FL: Academic Press. p. 211.
  11. Beveridge, W. I. B. (1950). The Art of Scientific Investigation. New York: Norton. p. 106.
  12. Kahneman, Daniel, ed. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. p. 144.
  13. Dawson, Lorne L. (October 1999). "When Prophecy Fails and Faith Persists: A Theoretical Overview" (PDF). Nova Religio: The Journal of Alternative and Emergent Religions. Berkeley: University of California Press. 3 (1): 60–82. doi:10.1525/nr.1999.3.1.60. ISSN 1092-6690. LCCN 98656716. Retrieved 20 September 2021.
  14. Festinger, L. (1962). "Cognitive dissonance". Scientific American. 207 (4): 93–107. Bibcode:1962SciAm.207d..93F. doi:10.1038/scientificamerican1062-93. PMID 13892642. S2CID 56193073.
  15. Festinger, Leon; et al. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press.
  16. Festinger, L. (1957). A Theory of Cognitive Dissonance. California: Stanford University Press.
  17. Kleinmuntz, B., ed. (1968). Formal Representation of Human Judgment. New York: Wiley. pp. 17–52.
  18. Timnick, Lois (1982). "Electronic Bullies". Psychology Today. 16: 10–15.
  19. Anderson, C. A. (1983). "Abstract and Concrete Data in the Conservatism of Social Theories: When Weak Data Lead to Unshakeable Beliefs" (PDF). Journal of Experimental Social Psychology. 19 (2): 93–108. doi:10.1016/0022-1031(83)90031-8. Archived from the original (PDF) on 2016-10-05. Retrieved 2016-07-18.
  20. Nissani, M.; Hoefler-Nissani, D. M. (1992). "Experimental Studies of Belief-Dependence of Observations and of Resistance to Conceptual Change". Cognition and Instruction. 9 (2): 97–111. doi:10.1207/s1532690xci0902_1.
  21. Eysenck, Hans J. (1990). Rebel with a Cause. London: W. H. Allen. p. 67.
  22. Koestler, Arthur (1990). The Sleepwalkers: A History of Man's Changing Vision of the Universe . Penguin Books. ISBN   978-0140192469.
  23. Roberts, Royston M. (1989). Serendipity. New York: Wiley. p. 28.
  24. Burbules, N.C.; et al. (1988). "Response to contradiction: scientific reasoning during adolescence". Journal of Educational Psychology. 80: 67–75. doi:10.1037/0022-0663.80.1.67.
  25. Lightman, A.; et al. (1993). "Teacher predictions versus actual student gains". The Physics Teacher. 31 (3): 162–167. Bibcode:1993PhTea..31..162L. doi:10.1119/1.2343698.
  26. Anderson, Brian A.; et al. (2016). "The Role of Dopamine in Value-Based Attentional Orienting". Current Biology. 26 (4): 550–555. doi:10.1016/j.cub.2015.12.062. PMC   4767677 . PMID   26877079.
  27. Marris, Peter (1986). Loss and Change. London: Routledge. p. 2.
  28. Kuhn, Thomas (1962). The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
