Illusory truth effect

The illusory truth effect (also known as the illusion of truth effect, validity effect, truth effect, or the reiteration effect) is the tendency to believe false information to be correct after repeated exposure. [1] The phenomenon was first identified in a 1977 study at Villanova University and Temple University. [2] [3] When truth is assessed, people rely on whether the information is in line with their understanding or whether it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process than new, unrepeated statements, leading people to believe that the repeated conclusion is more truthful. The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth or falsity has been learned.

In a 2015 study, researchers discovered that familiarity can overpower rationality and that repeatedly hearing that a certain statement is wrong can paradoxically cause it to feel right. [4] The researchers attributed to "processing fluency" the illusory truth effect's influence even on participants who knew the correct answer to begin with but were persuaded to believe otherwise through the repetition of a falsehood.

The illusory truth effect plays a significant role in fields such as advertising, news media, and political propaganda.

Initial study

The effect was first named and defined following the results of a 1977 study at Villanova University and Temple University in which participants were asked to rate a series of trivia statements as true or false. [2] [5] On three occasions, Lynn Hasher, David Goldstein, and Thomas Toppino presented the same group of college students with lists of sixty plausible statements, some of them true and some of them false. The second list was distributed two weeks after the first, and the third two weeks after that. Twenty statements appeared on all three lists; the other forty items on each list were unique to that list. Participants were asked how confident they were of the truth or falsity of the statements, which concerned matters about which they were unlikely to know anything. (For example, "The first air force base was launched in New Mexico." Or "Basketball became an Olympic discipline in 1925.") Specifically, the participants were asked to grade their belief in the truth of each statement on a scale of one to seven. While the participants' confidence in the truth of the non-repeated statements remained steady, their confidence in the truth of the repeated statements increased from the first to the second and from the second to the third sessions, with the average score for those items rising from 4.2 to 4.6 to 4.7. The researchers concluded that repeating a statement makes it more likely to appear factual. [1] [2]

In 1989, Hal R. Arkes, Catherine Hackett, and Larry Boehm replicated the original study, with similar results showing that exposure to false information changes the perceived truthfulness and plausibility of that information. [6]

The effect works because when people assess truth, they rely on whether the information agrees with their understanding or whether it feels familiar. The first condition is logical as people compare new information with what they already know to be true and consider the credibility of both sources. However, researchers discovered that familiarity can overpower rationality—so much so that repetitively hearing that a certain fact is wrong can paradoxically cause it to feel right. [4]

Relation to other phenomena

Processing fluency

At first, the truth effect was believed to occur only when individuals are highly uncertain about a given statement. [1] Psychologists also assumed that "outlandish" headlines would not produce the effect; however, recent research shows that the illusory truth effect is indeed at play with false news. [5] The first assumption was challenged by the results of a 2015 study by Lisa K. Fazio, Nadia M. Brashier, B. Keith Payne, and Elizabeth J. Marsh. Published in the Journal of Experimental Psychology: General, the study suggested that the truth effect can influence participants who actually knew the correct answer to begin with, but who were swayed to believe otherwise through the repetition of a falsehood. For example, when participants encountered on multiple occasions the statement "A sari is the name of the short pleated skirt worn by Scots," some of them were likely to come to believe it was true, even though these same people were able to correctly answer the question "What is the name of the short pleated skirt worn by Scots?"

After replicating these results in another experiment, Fazio and her team attributed this curious phenomenon to processing fluency, the facility with which people comprehend statements. "Repetition," the researchers explained, "makes statements easier to process (i.e., fluent) relative to new statements, leading people to the (sometimes) false conclusion that they are more truthful." [7] [8] When an individual hears something for a second or third time, their brain responds faster to it and misattributes that fluency as a signal for truth. [9]

Hindsight bias

In a 1997 study, Ralph Hertwig, Gerd Gigerenzer, and Ulrich Hoffrage linked the truth effect to the phenomenon known as "hindsight bias", described as a situation in which the recollection of confidence is skewed after the truth or falsity has been learned. They described the truth effect (which they call "the reiteration effect") as a subset of hindsight bias. [10]

Other studies

In a 1979 study, participants were told that repeated statements were no more likely to be true than unrepeated ones. Despite this warning, the participants perceived repeated statements as being more true than unrepeated ones. [6]

Studies in 1981 and 1983 showed that information deriving from recent experience tends to be viewed as "more fluent and familiar" than new experience. A 2011 study by Jason D. Ozubko and Jonathan Fugelsang built on this finding by demonstrating that, generally speaking, information retrieved from memory is "more fluent or familiar than when it was first learned" and thus produces an illusion of truth. The effect grew even more pronounced when statements were repeated twice and yet more pronounced when they were repeated four times. The researchers thus concluded that memory retrieval is a powerful method for increasing the so-called validity of statements and that the illusion of truth is an effect that can be observed without directly polling the factual statements in question. [11]

A 1992 study by Ian Maynard Begg, Ann Anas, and Suzanne Farinacci suggested that a statement will seem true if the information seems familiar. [6]

A 2012 experiment by Danielle C. Polage showed that some participants exposed to false news stories would go on to have false memories. The conclusion was that repetitive false claims increase believability and may also result in errors. [6] [5]

In a 2014 study, Eryn J. Newman, Mevagh Sanson, Emily K. Miller, Adele Quigley-McBride, Jeffrey L. Foster, Daniel M. Bernstein, and Maryanne Garry asked participants to judge the truth of statements attributed to various people, some of whose names were easier to pronounce than others. Consistently, statements by persons with easily pronounced names were viewed as being more truthful than those with names that were harder to pronounce. The researchers' conclusion was that subjective, tangential properties such as ease of processing can matter when people evaluate sourced information. [3]

References

  1. "The Truth Effect and Other Processing Fluency Miracles". Science Blogs. Archived from the original on 6 May 2021. Retrieved 30 December 2016.
  2. Hasher, Lynn; Goldstein, David; Toppino, Thomas (1977). "Frequency and the conference of referential validity" (PDF). Journal of Verbal Learning and Verbal Behavior. 16 (1): 107–112. doi:10.1016/S0022-5371(77)80012-1. Archived from the original on 2016-05-15.
  3. Newman, Eryn J.; Sanson, Mevagh; Miller, Emily K.; Quigley-McBride, Adele; Foster, Jeffrey L.; Bernstein, Daniel M.; Garry, Maryanne (2014). "People with Easier to Pronounce Names Promote Truthiness of Claims". PLOS ONE. 9 (2): e88671. Bibcode:2014PLoSO...988671N. doi:10.1371/journal.pone.0088671. PMC 3935838. PMID 24586368.
  4. Dreyfuss, Emily (February 11, 2017). "Want to Make a Lie Seem True? Say It Again. And Again. And Again". Wired. Archived from the original on 6 May 2021. Retrieved 31 October 2017.
  5. Resnick, Brian (June 17, 2017). "Alex Jones and the illusory truth effect, explained". Vox. Archived from the original on 5 May 2021. Retrieved 31 October 2017.
  6. Polage, Danielle (2012). "Making up History: False Memories of Fake News Stories". Europe's Journal of Psychology. 8 (2): 245–250. doi:10.5964/ejop.v8i2.456. Archived from the original on 2016-12-31.
  7. Fazio, Lisa K.; Brashier, Nadia M.; Payne, B. Keith; Marsh, Elizabeth J. (2015). "Knowledge does not protect against illusory truth" (PDF). Journal of Experimental Psychology: General. 144 (5): 993–1002. doi:10.1037/xge0000098. PMID 26301795. Archived from the original (PDF) on 2016-05-14.
  8. Nason, Brian (December 8, 2015). "The Illusory Truth Effect". Vox Populi News. Archived from the original on 14 December 2015. Retrieved 29 December 2016.
  9. Resnick, Brian (October 5, 2017). "The science behind why fake news is so hard to wipe out". Vox. Archived from the original on 20 April 2021. Retrieved 31 October 2017.
  10. Hertwig, Ralph; Gigerenzer, Gerd; Hoffrage, Ulrich (1997). "The reiteration effect in hindsight bias". Psychological Review. 104 (1): 194–202. doi:10.1037/0033-295X.104.1.194. hdl:11858/00-001M-0000-0025-A38B-2.
  11. Ozubko, Jason D.; Fugelsang, Jonathan (2011). "Remembering makes evidence compelling: retrieval from memory can give rise to the illusion of truth". Journal of Experimental Psychology: Learning, Memory, and Cognition. 37 (1): 270–276. doi:10.1037/a0021323. PMID 21058878.

Further reading