Underlying theories of misinformation

People believe and spread misinformation (incorrect or misleading information) for many reasons. Although often attributed to ignorance, belief in misinformation can also be explained by other factors such as moral values and motivated reasoning. [1] [2] This is because decision-making depends both on the cognitive architecture of the individual and on their social context. [3]

There are many ways to explain the phenomenon of misinformation, ranging from traditional science communication theories to various psychological and social theories. These theories attempt to explain why individuals believe and share misinformation, and they inform the rationale behind the misinformation interventions that seek to prevent the spread of false information.

Science communication theories

Information deficit model

The information deficit model attributes false beliefs to a lack of understanding or a lack of information, and assumes that misinformation can be corrected by providing individuals with further credible information. Critics argue that the model fails to address other reasons why individuals believe false information, such as the illusory truth effect (repeated statements receive higher truth ratings than new statements). [4] In one study, for example, participants failed to rely on their stored knowledge and instead relied on repeated false statements. [4] This helps explain why people deny facts such as climate change despite having access to evidence suggesting otherwise. [5] The model has therefore largely been discredited as a reliable explanation for why individuals believe misinformation.

Misinformation interventions such as fact-checking and debunking stem from the information deficit model, as they seek to correct false information with true information. While they may be useful for non-controversial or technical and quantitative issues, they tend to be less useful for highly salient or controversial issues, such as those involving race, ethnicity, and culture. [2]

Psychological theories

Inoculation theory

Inoculation theory is a psychological theory positing that preemptive exposure to misinformation techniques strengthens an individual's resilience to misinformation encountered later, because it becomes easier to spot and refute. [6] The crucial difference between this theory and the information deficit model is that the former highlights the importance of knowing the forms, techniques, and characteristics of misinformation, rather than knowing the veracity of particular claims.

The most common forms of misinformation intervention rooted in inoculation theory are prebunking and gamified interventions that seek to inform participants about the various ways misinformation appears online. Examples of gamified interventions include Bad News, Harmony Square, and Go Viral!, among others. [7]

Inattentional blindness

Inattentional blindness is a theory suggesting that individuals fail to perceive information due to a lack of attention. Research exploring attention and the sharing of misinformation found that participants shared misinformation because their attention was focused on factors other than accuracy. [8]

The inattentional blindness theory, then, suggests that shifting attention to accuracy and veracity will increase the quality of news that people subsequently share, offering a useful framework for countering misinformation. [8] The most prominent type of misinformation intervention relying on inattentional blindness is nudging, which attempts to shape the decision-making and behavior of users online in a way that prevents the spread of misinformation.

Social theories

Although useful, psychological theories do not adequately capture the social nature of holding and sharing beliefs, especially online. Social theories offer an alternative to psychological theories by addressing this context.

Affect control theory (ACT)

Affect control theory (ACT) is a social theory proposing that individuals "perceive events and construct lines of social action that maintain preexisting sentiments for themselves". [9] According to ACT, socialization imbues concepts with shared connotative meanings, known as sentiments, which humans use to make sense of experiences. [10]

Research suggests that the "interpretation, encoding, and response to false information" is a process driven by affect, including the affect of credibility. [10] One study, for example, suggests that when people encounter misinformation that challenges their beliefs and perceptions, they either reinterpret the information (deflect) or adjust their beliefs based on the credibility of the information's source. The researchers found that demonstrating that a source spreads falsehoods deliberately (disinformation) is more effective at discrediting opponents than claiming they spread falsehoods unintentionally (misinformation). [10] This is one example of how ACT may be useful for developing strategies to discredit sources of falsehoods. [9]
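
ACT's core quantities can be made concrete. In the theory's standard formalization, sentiments are expressed as three-number profiles on the evaluation, potency, and activity (EPA) dimensions, and the "deflection" an event produces is the sum of squared differences between culturally given fundamental sentiments and the transient impressions the event creates. The sketch below illustrates this in Python; the EPA values are invented for illustration and are not taken from measured ACT dictionaries.

```python
# A minimal sketch of affect control theory's deflection measure.
# Fundamental sentiments and transient impressions are 3-vectors on the
# evaluation, potency, and activity (EPA) dimensions; deflection is the
# sum of squared differences between them. The EPA values below are
# hypothetical, chosen only to illustrate the computation.

def deflection(fundamental, transient):
    """Sum of squared EPA differences between the culture-given
    sentiment and the impression produced by an observed event."""
    return sum((f - t) ** 2 for f, t in zip(fundamental, transient))

# Hypothetical profile: a "trusted news source" normally feels good,
# fairly powerful, and moderately active (positive EPA)...
trusted_source = (2.1, 1.5, 0.8)
# ...but an event in which that source "spreads falsehoods" creates a
# much more negative transient impression.
after_falsehood = (-1.4, 0.9, 0.6)

print(deflection(trusted_source, after_falsehood))  # large deflection
# ACT predicts people resolve such deflection either by reinterpreting
# the event (deflecting) or by revising their sentiment toward the
# source, mirroring the two responses described above.
```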

Social network theory

Social network theory describes the structure of relationships and interactions between social actors. A fundamental concept in social network theory is a network, which consists of nodes or actors with a set of ties or connections between them. Nodes may be people, organizations, or other types of social entities, and ties may be communications, alliances, friendships, and more. [11]

Such a representation of social actors maps naturally onto online environments such as social media, where users (nodes) interact with other users by following, sharing, liking, and re-posting (ties). Applying social network theory to social media provides useful insights into the spread of misinformation; for example, tightly connected clusters of users may represent echo chambers.

This theory is useful for devising countermeasures to misinformation at the platform level, such as down-ranking or removing posts and imposing forwarding restrictions on suspicious users. It is also useful for evaluating such countermeasures using social network metrics such as centrality, dispersibility, and influenceability. [12]
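
As a concrete illustration, the sketch below models a re-share network with the networkx library and uses centrality to flag influential spreaders; the users, ties, and the choice of out-degree centrality are all assumptions made for the example, not details from the cited studies.

```python
# A minimal sketch of social network theory applied to misinformation
# spread, using the networkx library. Users are nodes; a directed edge
# u -> v means v re-shared a post from u. All data here is invented.
import networkx as nx

shares = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("carol", "bob"),   # a tightly knit cluster
    ("dave", "erin"), ("erin", "frank"),
]
network = nx.DiGraph(shares)

# Centrality identifies the actors best placed to spread (or stop)
# misinformation; out-degree centrality flags prolific sharers.
centrality = nx.out_degree_centrality(network)
spreaders = sorted(centrality, key=centrality.get, reverse=True)
print("Most central spreaders:", spreaders[:3])

# A platform-level countermeasure can then be targeted, e.g. restricting
# forwarding by the most central account, and the network re-measured
# to evaluate the intervention's effect.
network.remove_edges_from(list(network.out_edges(spreaders[0])))
print("After restriction:", nx.out_degree_centrality(network))
```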

Related Research Articles

Conspiracy theory – Attributing events to less-probable plots

A conspiracy theory is an explanation for an event or situation that asserts the existence of a conspiracy, when other explanations are more probable. The term generally has a negative connotation, implying that the appeal of a conspiracy theory is based in prejudice, emotional conviction, or insufficient evidence. A conspiracy theory is distinct from a conspiracy; it refers to a hypothesized conspiracy with specific characteristics, including but not limited to opposition to the mainstream consensus among those who are qualified to evaluate its accuracy, such as scientists or historians.

Attitude (psychology) – Concept in psychology and communication studies

An attitude "is a summary evaluation of an object of thought. An attitude object can be anything a person discriminates or holds in mind". Attitudes include beliefs (cognition), emotional responses (affect) and behavioral tendencies. In the classical definition an attitude is persistent, while in more contemporary conceptualizations, attitudes may vary depending upon situations, context, or moods.

In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

Fact-checking is the process of verifying the factual accuracy of questioned reporting and statements. Fact-checking can be conducted before or after the text or content is published or otherwise disseminated. Internal fact-checking is such checking done in-house by the publisher to prevent inaccurate content from being published; when the text is analyzed by a third party, the process is called external fact-checking.

Misinformation – Incorrect or misleading information

Misinformation is incorrect or misleading information. Misinformation and disinformation are not interchangeable terms: Misinformation can exist with or without specific malicious intent whereas disinformation is distinct in that the information is deliberately deceptive and propagated. Misinformation can include inaccurate, incomplete, misleading, or false information as well as selective or half-truths. In January 2024, the World Economic Forum identified misinformation and disinformation, propagated by both internal and external interests, to "widen societal and political divides" as the most severe global risks within the next two years.

Echo chamber (media) – Situation that reinforces beliefs by repetition inside a closed system

In news media and social media, an echo chamber is an environment or ecosystem in which participants encounter beliefs that amplify or reinforce their preexisting beliefs by communication and repetition inside a closed system and insulated from rebuttal. An echo chamber circulates existing views without encountering opposing views, potentially resulting in confirmation bias. Echo chambers may increase social and political polarization and extremism. On social media, it is thought that echo chambers limit exposure to diverse perspectives, and favor and reinforce presupposed narratives and ideologies.

Inoculation theory is a social psychological/communication theory that explains how an attitude or belief can be made resistant to persuasion or influence, in analogy to how a body gains resistance to disease. The theory uses medical inoculation as its explanatory analogy but instead of applying it to disease, it is used to discuss attitudes and other positions, like opinions, values, and beliefs. It has applicability to public campaigns targeting misinformation and fake news, but it is not limited to misinformation and fake news.

Memory conformity, also known as social contagion of memory, is the phenomenon where memories or information reported by others influences an individual and is incorporated into the individual's memory. Memory conformity is a memory error due to both social influences and cognitive mechanisms. Social contamination of false memory can be exemplified in prominent situations involving social interactions, such as eyewitness testimony. Research on memory conformity has revealed that such suggestibility and errors with source monitoring have far-reaching consequences, with important legal and social implications. It is one of many social influences on memory.

Motivated reasoning is a cognitive and social response in which individuals, consciously or sub-consciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and reject new information that contradicts them, despite contrary evidence.

The gateway belief model (GBM) suggests that public perception of the degree of expert or scientific consensus on an issue functions as a so-called "gateway" cognition. Perception of scientific agreement is suggested to be a key step towards acceptance of related beliefs. Increasing the perception that there is normative agreement within the scientific community can increase individual support for an issue. A perception of disagreement may decrease support for an issue.

Brandolini's law, also known as the bullshit asymmetry principle, is an internet adage coined in 2013 by Alberto Brandolini, an Italian programmer, that emphasizes the effort of debunking misinformation, in comparison to the relative ease of creating it in the first place. The law states:

The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.

Intellectual humility is a metacognitive process characterized by recognizing the limits of one's knowledge and acknowledging one's fallibility. It involves several components, including not thinking too highly of oneself, refraining from believing one's own views are superior to others', lacking intellectual vanity, being open to new ideas, and acknowledging mistakes and shortcomings. It is positively associated with openness to new ideas, empathy, prosocial values, tolerance for diverse perspectives, and scrutiny of misinformation. Individuals with higher levels of intellectual humility experience benefits such as improved decision-making, positive social interactions, and the moderation of conflicts. There is a long history of philosophers considering the importance of intellectual humility as a 'virtue'. The modern study of this phenomenon began in the mid-2000s.

Fake news – False or misleading information presented as real

Fake news or information disorder is false or misleading information claiming the aesthetics and legitimacy of news. Fake news often has the aim of damaging the reputation of a person or entity, or making money through advertising revenue. Although false news has always been spread throughout history, the term fake news was first used in the 1890s when sensational reports in newspapers were common. Nevertheless, the term does not have a fixed definition and has been applied broadly to any type of false information presented as news. It has also been used by high-profile people to apply to any news unfavorable to them. Further, disinformation involves spreading false information with harmful intent and is sometimes generated and propagated by hostile foreign actors, particularly during elections. In some definitions, fake news includes satirical articles misinterpreted as genuine, and articles that employ sensationalist or clickbait headlines that are not supported in the text. Because of this diversity of types of false news, researchers are beginning to favour information disorder as a more neutral and informative term.

Internet manipulation is the co-optation of online digital technologies, including algorithms, social bots, and automated scripts, for commercial, social, military, or political purposes. Internet and social media manipulation are the prime vehicles for spreading disinformation due to the importance of digital platforms for media consumption and everyday communication. When employed for political purposes, internet manipulation may be used to steer public opinion, polarise citizens, circulate conspiracy theories, and silence political dissidents. Internet manipulation can also be done for profit, for instance, to harm corporate or political adversaries and improve brand reputation. Internet manipulation is sometimes also used to describe the selective enforcement of Internet censorship or selective violations of net neutrality.

Sander van der Linden – Social psychologist

Sander L. van der Linden is a Dutch social psychologist and author who is Professor of Social Psychology at the University of Cambridge. He studies the psychology of social influence, risk, human judgment, and decision-making. He is particularly known for his research on the psychology of social issues, such as fake news, COVID-19, and climate change.

Bad News is a free browser game in which players take the perspective of a fake news tycoon. It was released on February 19, 2018. The game is classified as a serious game and a newsgame aimed at improving media literacy and social impact. The game was produced by the Dutch media platform "DROG" in collaboration with University of Cambridge scientists. The game has been described by the media as a "fake news vaccine".

Disinformation attacks are strategic deception campaigns involving media manipulation and internet manipulation, to disseminate misleading information, aiming to confuse, paralyze, and polarize an audience. Disinformation can be considered an attack when it occurs as an adversarial narrative campaign that weaponizes multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value-laden judgements—to exploit and amplify identity-driven controversies. Disinformation attacks use media manipulation to target broadcast media like state-sponsored TV channels and radios. Due to the increasing use of internet manipulation on social media, they can be considered a cyber threat. Digital tools such as bots, algorithms, and AI technology, along with human agents including influencers, spread and amplify disinformation to micro-target populations on online platforms like Instagram, Twitter, Google, Facebook, and YouTube.

Misinformation related to immunization and the use of vaccines circulates in mass media and social media in spite of the fact that there is no serious hesitancy or debate within mainstream medical and scientific circles about the benefits of vaccination. Unsubstantiated safety concerns related to vaccines are often presented on the internet as being scientific information. A large proportion of internet sources on the topic are mostly inaccurate, which can lead people searching for information to form misconceptions about vaccines.

Algorithmic radicalization is the concept that recommender algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to develop radicalized extremist political views. Algorithms record user interactions, from likes/dislikes to amount of time spent on posts, to generate endless media aimed to keep users engaged. Through echo chamber channels, the consumer is driven to be more polarized through preferences in media and self-confirmation.

This timeline includes entries on the spread of COVID-19 misinformation and conspiracy theories related to the COVID-19 pandemic in Canada. This includes investigations into the origin of COVID-19, and the prevention and treatment of COVID-19, which is caused by the virus SARS-CoV-2. Social media apps and platforms, including Facebook, TikTok, Telegram, and YouTube, have contributed to the spread of misinformation. The Canadian Anti-Hate Network (CAHN) reported that conspiracy theories related to COVID-19 began on "day one". CAHN reported on March 16, 2020, that far-right groups in Canada were taking advantage of the climate of anxiety and fear surrounding COVID to recycle variations of conspiracies from the 1990s that people had shared over shortwave radio. COVID-19 disinformation is intentional and seeks to create uncertainty and confusion. But most of the misinformation is shared online unintentionally by enthusiastic participants who are politically active.

References

  1. Amin, Avnika B.; Bednarczyk, Robert A.; Ray, Cara E.; Melchiori, Kala J.; Graham, Jesse; Huntsinger, Jeffrey R.; Omer, Saad B. (December 2017). "Association of moral values with vaccine hesitancy". Nature Human Behaviour. 1 (12): 873–880. doi:10.1038/s41562-017-0256-5. ISSN 2397-3374. PMID 31024188.
  2. Nyhan, Brendan; Reifler, Jason. "Misinformation and Fact-checking: Research Findings from Social Science" (PDF). New America Foundation.
  3. Ecker, Ullrich K. H.; Lewandowsky, Stephan; Cook, John; Schmid, Philipp; Fazio, Lisa K.; Brashier, Nadia; Kendeou, Panayiota; Vraga, Emily K.; Amazeen, Michelle A. (January 2022). "The psychological drivers of misinformation belief and its resistance to correction". Nature Reviews Psychology. 1 (1): 13–29. doi:10.1038/s44159-021-00006-y. ISSN 2731-0574.
  4. Fazio, Lisa K.; Brashier, Nadia M.; Payne, B. Keith; Marsh, Elizabeth J. (October 2015). "Knowledge does not protect against illusory truth". Journal of Experimental Psychology: General. 144 (5): 993–1002. doi:10.1037/xge0000098. ISSN 1939-2222.
  5. Hansson, Sven Ove (2017-06-01). "Science denial as a form of pseudoscience". Studies in History and Philosophy of Science Part A. 63: 39–47. Bibcode:2017SHPSA..63...39H. doi:10.1016/j.shpsa.2017.05.002. ISSN 0039-3681. PMID 28629651.
  6. Traberg, Cecilie S.; Roozenbeek, Jon; van der Linden, Sander (2022-03-01). "Psychological Inoculation against Misinformation: Current Evidence and Future Directions". The Annals of the American Academy of Political and Social Science. 700 (1): 136–151. doi:10.1177/00027162221087936. ISSN 0002-7162.
  7. Kiili, Kristian; Siuko, Juho; Ninaus, Manuel (3 January 2024). "Tackling misinformation with games: a systematic literature review". Interactive Learning Environments: 1–16. doi:10.1080/10494820.2023.2299999. ISSN 1049-4820.
  8. Pennycook, Gordon; Epstein, Ziv; Mosleh, Mohsen; Arechar, Antonio A.; Eckles, Dean; Rand, David G. (April 2021). "Shifting attention to accuracy can reduce misinformation online". Nature. 592 (7855): 590–595. Bibcode:2021Natur.592..590P. doi:10.1038/s41586-021-03344-2. ISSN 1476-4687.
  9. Kroska, Amy; Powell, Brian; Rogers, Kimberly B.; Smith-Lovin, Lynn (2023-01-01). "Affect Control Theories: A Double Special Issue in Honor of David R. Heise". American Behavioral Scientist. 67 (1): 3–11. doi:10.1177/00027642211066044. ISSN 0002-7642.
  10. Campos-Castillo, Celeste; Shuster, Stef M. (2023-02-01). "So What if They're Lying to Us? Comparing Rhetorical Strategies for Discrediting Sources of Disinformation and Misinformation Using an Affect-Based Credibility Rating". American Behavioral Scientist. 67 (2): 201–223. doi:10.1177/00027642211066058. ISSN 0002-7642.
  11. Daly, Alan J.; Borgatti, Stephen; Ofem, Brandon (2010). "Social Network Theory and Analysis". Social Network Theory and Educational Change. Cambridge, MA: Harvard Education Press. ISBN 978-1-934742-80-8.
  12. Ng, Ka Chung; Tang, Jie; Lee, Dongwon (2021-10-02). "The Effect of Platform Intervention Policies on Fake News Dissemination and Survival: An Empirical Examination". Journal of Management Information Systems. 38 (4): 898–930. doi:10.1080/07421222.2021.1990612. hdl:10397/93630. ISSN 0742-1222.