Motivated reasoning is a cognitive and social response, in which individuals, consciously or unconsciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favor arguments that support their current beliefs and reject new information that contradicts these beliefs. [1]
Motivated reasoning, confirmation bias and cognitive dissonance are closely related. [2] Both motivated reasoning and confirmation bias favor evidence supporting one's beliefs, at the same time dismissing contradictory evidence. Motivated reasoning (motivational bias) is an unconscious or conscious process by which personal emotions control the evidence that is supported or dismissed. However, confirmation bias is mainly an unconscious (innate, implicit) cognitive bias, and the evidence or arguments utilised can be logical as well as emotional. More broadly, it is feasible that motivated reasoning can moderate cognitive biases generally, including confirmation bias. [2]
Individual differences such as political beliefs can moderate the emotional and motivational effect. In addition, social context (groupthink, peer pressure) partly determines which evidence is used in motivated reasoning, particularly in dysfunctional groups: social context moderates emotions, which in turn moderate beliefs.
Motivated reasoning differs from critical thinking, in which beliefs are assessed with a skeptical but open-minded attitude.
Individuals are compelled to initiate motivated reasoning to lessen the cognitive dissonance they feel. Cognitive dissonance is the psychological and physiological stress and unease felt between two conflicting cognitive or emotional elements (such as the desire to smoke despite knowing it is unhealthy). According to Leon Festinger, there are two paths by which individuals can reduce this distress: the first is altering behavior or cognition; the second, more common path is avoiding or discrediting information or situations that would create dissonance. [2]
Research suggests that reasoning away contradictions is psychologically easier than revising feelings. Emotions tend to color how "facts" are perceived. Feelings come first, and evidence is used in service of those feelings. Evidence that supports what is already believed is accepted; evidence which contradicts those beliefs is not. [3]
The notion that motives or goals affect reasoning has a long and controversial history in social psychology. This is because supportive research could be reinterpreted in entirely cognitive non-motivational terms (the hot versus cold cognition controversy). This controversy existed because of a failure to explore mechanisms underlying motivated reasoning. [1]
Early research on how humans evaluated and integrated information supported a cognitive approach consistent with Bayesian probability, in which individuals weighted new information using rational calculations ("cold cognition"). [4] More recent theories endorse these cognitive processes as only partial explanations of motivated reasoning, but have also introduced motivational [1] or affective (emotional) processes ("hot cognition"). [5]
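The "cold cognition" baseline can be made concrete with a short sketch. The numbers below are invented for illustration; the point is only that a rational Bayesian updater weights new evidence by its likelihood under each hypothesis, regardless of what it would prefer to believe.

```python
# Minimal sketch of rational ("cold") Bayesian belief updating.
# All numbers are illustrative, not drawn from any study.

def bayes_update(prior: float, p_if_true: float, p_if_false: float) -> float:
    """Posterior probability of a hypothesis after one piece of evidence.

    p_if_true / p_if_false are the probabilities of observing the
    evidence when the hypothesis is true / false.
    """
    numerator = prior * p_if_true
    return numerator / (numerator + (1 - prior) * p_if_false)

# Start neutral (prior = 0.5) and observe evidence three times as likely
# if the hypothesis is true as if it is false.
belief = bayes_update(0.5, p_if_true=0.6, p_if_false=0.2)
print(round(belief, 2))  # 0.75
```

On this account, the direction and size of the belief change are fixed entirely by the prior and the likelihoods; "hot cognition" theories add motivational or affective terms that this rational baseline lacks.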
Ziva Kunda reviewed research and developed a theoretical model to explain the mechanism by which motivated reasoning results in bias. [1] Motivation to arrive at a desired conclusion provides a level of arousal, which acts as an initial trigger for the operation of cognitive processes. To participate in motivated reasoning, either consciously or subconsciously, an individual first needs to be motivated. Motivation then affects reasoning by influencing the knowledge structures (beliefs, memories, information) that are accessed and the cognitive processes used.
Milton Lodge and Charles Taber introduced an empirically supported model in which affect is intricately tied to cognition, and information processing is biased toward positions that the individual already holds. Their model has three components. [6]
This theory is developed and evaluated in their book The Rationalizing Voter (2013). [8] David Redlawsk (2002) found that the timing of when disconfirming information was introduced played a role in determining bias. When subjects encounter incongruity during an information search, the automatic assimilation and update process is interrupted. This results in one of two outcomes: subjects may enhance attitude strength in a desire to support existing affect (resulting in degradation in decision quality and potential bias), or subjects may counter-argue existing beliefs in an attempt to integrate the new data. [6] This second outcome is consistent with research on how processing occurs when one is tasked with accuracy goals.
To summarize, the two models differ in that Kunda identifies a primary role for cognitive strategies such as memory processes, and the use of rules in determining biased information selection, whereas Lodge and Taber identify a primary role for affect in guiding cognitive processes and maintaining bias.
A neuroimaging study by Drew Westen and colleagues does not support the use of cognitive processes in motivated reasoning, lending greater support to affective processing as a key mechanism in supporting bias. This study, designed to test the neural circuitry of individuals engaged in motivated reasoning, found that motivated reasoning "was not associated with neural activity in regions previously linked with cold reasoning tasks [Bayesian reasoning] nor conscious (explicit) emotion regulation". [9]
This neuroscience data suggests that "motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached." [9] However, if a strong emotion was attached during a previous round of motivated reasoning and that emotion is present again when the individual reaches a conclusion, a strong emotional stake becomes attached to the conclusion. Any new information regarding that conclusion then triggers motivated reasoning again. Repetition can create pathways in the neural network that further ingrain the individual's reasoned beliefs alongside the networks where logical reasoning occurs, so that the strong emotion recurs each time contradictory information is confronted. Lodge and Taber refer to this as affective contagion: [8] instead of "infecting" other individuals, the emotion "infects" the individual's own reasoning pathways and conclusions.
Motivated reasoning can be classified into two categories: 1) Accuracy-oriented (non-directional), in which the motive is to arrive at an accurate conclusion, irrespective of the individual's beliefs, and 2) Goal-oriented (directional), in which the motive is to arrive at a particular conclusion. Politically motivated reasoning, in particular, is strongly directional. [1] [10]
Despite their differences in information processing, an accuracy-motivated and a goal-motivated individual can reach the same conclusion: both accuracy-oriented and directional messages can move beliefs in the desired direction. [10] The distinction matters for crafting effective communication: those who are accuracy-motivated respond better to credible evidence tailored to their community, while those who are goal-oriented feel less threatened when the issue is framed to fit their identity or values. [11]
Several works on accuracy-driven reasoning suggest that when people are motivated to be accurate, they expend more cognitive effort, attend to relevant information more carefully, and process it more deeply, often using more complex rules.
Kunda asserts that accuracy goals delay premature conclusions: they increase both the quantity and quality of processing, particularly by leading to more complex inferential cognitive procedures. When researchers manipulated test subjects' motivation to be accurate by informing them that the target task was highly important or that they would be expected to defend their judgments, subjects used deeper processing and information was less biased. This was true when accuracy motives were present at the initial processing and encoding of information. [12] [13] Reviewing a line of research on accuracy goals and bias, Kunda concludes that "several different kinds of biases have been shown to weaken in the presence of accuracy goals". [1] However, accuracy goals do not always eliminate biases and improve reasoning: some biases (e.g. those resulting from the availability heuristic) may be resistant to accuracy manipulations, and accuracy reduces bias only when certain conditions are present.
Some of these conditions imply that accuracy goals include a conscious process of utilizing cognitive strategies in motivated reasoning. This construct is called into question by neuroscience research concluding that motivated reasoning is qualitatively distinct from reasoning in which there is no strong emotional stake in the outcome. [9] Accuracy-oriented individuals who are thought to use "objective" processing can still vary in how they update information, depending on how much faith they place in a given piece of evidence and on whether they can detect misinformation; failure to do so can lead to beliefs that diverge from scientific consensus. [11]
Directional goals enhance the accessibility of knowledge structures (memories, beliefs, information) that are consistent with desired conclusions. According to Kunda, such goals can lead to biased memory search and belief construction mechanisms. [1] Several studies support the effect of directional goals on the selection and construction of beliefs about oneself, other people and the world.
Cognitive dissonance research provides extensive evidence that people may bias their self-characterizations when motivated to do so. Other biases such as confirmation bias, prior attitude effect and disconfirmation bias could contribute to goal-oriented motivated reasoning. [11] For example, in one study, subjects altered their self-view by viewing themselves as more extroverted when induced to believe that extroversion was beneficial.
Michael Thaler of Princeton University conducted a study finding that men are more likely than women to demonstrate performance-motivated reasoning, owing to a gender gap in beliefs about personal performance. [14] A second study concluded that both men and women are susceptible to motivated reasoning, but that certain motivated beliefs differ by gender. [14]
The motivation to achieve directional goals could also influence which rules (procedural structures, such as inferential rules) are accessed to guide the search for information. Studies also suggest that evaluation of scientific evidence may be biased by whether the conclusions are in line with the reader's beliefs.
Even with goal-oriented motivated reasoning, people are not at liberty to conclude whatever they want merely because they want to. [1] People tend to draw a conclusion only if they can muster supporting evidence: they search memory for beliefs and rules that could support the desired conclusion, or they create new beliefs to logically support their desired goals.
When an individual is trying to quit smoking, they might engage in motivated reasoning to convince themselves to keep smoking. They might focus on information that makes smoking seem less harmful while discrediting any evidence which emphasizes any dangers associated with the behavior. Individuals in situations like this are driven to initiate motivated reasoning to lessen the amount of cognitive dissonance they feel. This can make it harder for individuals to quit and lead to continued smoking, even though they know it is not good for their health. [15]
Peter Ditto and his students conducted a meta-analysis in 2018 of studies relating to political bias. [16] Their aim was to assess which U.S. political orientation (left/liberal or right/conservative) was more biased and initiated more motivated reasoning. They found that both political orientations are susceptible to bias to the same extent. [16] The analysis was disputed by Jonathan Baron and John Jost, [17] to whom Ditto and colleagues responded. [18] Reviewing the debate, Stuart Vyse concluded that the answer to the question of whether U.S. liberals or conservatives are more biased is: "We don't know." [19]
On April 22, 2011, The New York Times published a series of articles attempting to explain the Barack Obama citizenship conspiracy theories. One of these articles, by political scientist David Redlawsk, explained these "birther" conspiracies as an example of politically motivated reasoning. [20] U.S. presidential candidates are required to be born in the U.S. Despite ample evidence that President Barack Obama was born in the U.S. state of Hawaii, many people continue to believe that he was not born in the U.S., and therefore that he was an illegitimate president. [20] Similarly, many people believe he is a Muslim (as was his father), despite ample lifetime evidence of his Christian beliefs and practice (as was true of his mother). [20] Subsequent research by others suggested that political partisan identity was more important for motivating "birther" beliefs than for some other conspiracy beliefs, such as 9/11 conspiracy theories. [21]
Despite a scientific consensus on climate change, citizens are divided on the topic, particularly along political lines. [22] A significant segment of the American public has fixed beliefs, either because they are not politically engaged or because they hold strong beliefs that are unlikely to change. Liberals and progressives generally believe, based on extensive evidence, that human activity is the main driver of climate change. By contrast, conservatives are generally much less likely to hold this belief, and a subset believes that there is no human involvement and that the reported evidence is faulty (or even fraudulent). A prominent explanation is political directional motivated reasoning, in that conservatives are more likely to reject new evidence that contradicts their long-established beliefs. In addition, some highly directional climate deniers not only discredit scientific information on human-induced climate change but also seek contrary evidence that leads to a posterior belief of greater denial. [23] [11]
A study by Robin Bayes and colleagues of the human-induced climate change views of 1,960 Republicans found that both accuracy and directional motives move in the desired direction, but only in the presence of politically motivated messages congruent with the induced beliefs. [10]
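The contrast between unbiased updating and the "greater denial" pattern can be caricatured in code by pitting a rational updater against a directional one that discounts the diagnostic value of unwelcome evidence. This is a toy sketch, not a model from the cited studies; the `discount` parameter and all numbers are invented for illustration.

```python
# Toy contrast between unbiased and directionally biased updating.
# The `discount` parameter is hypothetical: it shrinks the likelihood
# ratio of unwelcome evidence toward 1 (i.e., toward "uninformative").

def bayes_update(prior, p_if_true, p_if_false):
    num = prior * p_if_true
    return num / (num + (1 - prior) * p_if_false)

def biased_update(prior, p_if_true, p_if_false, discount=0.2):
    ratio = p_if_true / p_if_false          # how diagnostic the evidence is
    weakened = 1 + discount * (ratio - 1)   # directional reasoner discounts it
    num = prior * weakened
    return num / (num + (1 - prior))

# Five pieces of evidence, each moderately favoring the hypothesis
# (likelihood ratio 3:1). Both reasoners start out skeptical.
rational = biased = 0.2
for _ in range(5):
    rational = bayes_update(rational, 0.6, 0.2)
    biased = biased_update(biased, 0.6, 0.2)

print(round(rational, 3))  # 0.984 -- converges toward acceptance
print(round(biased, 3))    # 0.573 -- barely moves past the fence
```

In this caricature, the same stream of evidence leaves the directional reasoner far from the consensus conclusion; a negative `discount` would even produce the "greater denial" reversal described above, where contrary evidence strengthens the prior belief.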
Social media is used for many purposes, including the spread of opinion, and it is one of the main places people go to get information, much of which is opinionated or biased. Motivated reasoning shapes how such information spreads: "motivated reasoning suggests that informational uses of social media are conditioned by various social and cultural ways of thinking". [24] Because all manner of ideas and opinions are shared, it is easy for motivated reasoning and bias to intrude when someone searches the internet or any news source for answers or facts.
In the context of the COVID-19 pandemic, people who refuse to wear masks or get vaccinated may engage in motivated reasoning to justify their beliefs and actions. They may reject scientific evidence that supports mask-wearing and vaccination and instead seek out information that supports their pre-existing beliefs, such as conspiracy theories or misinformation. This can lead to behaviors that are harmful to both themselves and others. [25]
In a 2020 study, Van Bavel and colleagues explored motivated reasoning as a contributor to the spread of misinformation and resistance to public health measures during the COVID-19 pandemic. Their results indicated that people often engage in motivated reasoning when processing information about the pandemic, interpreting it to confirm their pre-existing beliefs and values. [26] The authors argue that addressing motivated reasoning is critical to promoting effective public health messaging and reducing the spread of misinformation. They suggested several strategies, such as reframing public health messages to align with individuals' values and beliefs, using trusted sources to convey information, and creating social norms that support public health behaviors. [26]
The outcomes of motivated reasoning derive from "a biased set of cognitive processes—that is, strategies for accessing, constructing, and evaluating beliefs. The motivation to be accurate enhances use of those beliefs and strategies that are considered most appropriate, whereas the motivation to arrive at particular conclusions enhances use of those that are considered most likely to yield the desired conclusion." [1] Careful or "reflective" reasoning has been linked both to overcoming and to reinforcing motivated reasoning, suggesting that reflection is not a panacea but a tool that can serve rational or irrational purposes depending on other factors. [27] For example, when people are forced to think analytically about something complex that they lack adequate knowledge of (e.g., a new study on meteorology, for a reader with no background in the subject), there is no directional shift in thinking, and their existing conclusions are more likely to be supported with motivated reasoning. Conversely, when they are presented with a simpler test of analytical thinking that confronts their beliefs (e.g., judging implausible headlines to be false), motivated reasoning is less likely to occur and a directional shift in thinking may result. [28]
Research on motivated reasoning has tested accuracy goals (i.e., reaching correct conclusions) and directional goals (i.e., reaching preferred conclusions). Such factors affect perceptions, and results confirm that motivated reasoning affects decision-making and estimates. [29] These results have far-reaching consequences: when confronted with even a small amount of information contrary to an established belief, an individual is motivated to reason away the new information, contributing to a hostile media effect. [30] If this pattern continues over an extended period, the individual becomes more entrenched in their beliefs.
However, recent studies have shown that motivated reasoning can be overcome. "When the amount of incongruency is relatively small, the heightened negative affect does not necessarily override the motivation to maintain [belief]." There is, however, evidence of a theoretical "tipping point" at which the amount of incongruent information received by the motivated reasoner can turn certainty into anxiety. This anxiety about being incorrect may lead to a change of opinion. [3]
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.
In the field of psychology, cognitive dissonance is the mental discomfort people feel when they realize their cognitions and actions are inconsistent or contradictory. Relevant items of information include people's actions, feelings, ideas, beliefs, values, and things in the environment; dissonance is typically experienced as psychological stress when a person acts against one or more of them. According to the theory, people then change their cognitions or actions to bring them into greater alignment and so reduce the dissonance: the discomfort is triggered when a person's belief clashes with newly perceived information, and the individual tries to find a way to resolve the contradiction.
An attitude "is a summary evaluation of an object of thought. An attitude object can be anything a person discriminates or holds in mind." Attitudes include beliefs (cognition), emotional responses (affect) and behavioral tendencies. In the classical definition an attitude is persistent, while in more contemporary conceptualizations, attitudes may vary depending upon situations, context, or moods.
Wishful thinking is the formation of beliefs based on what might be pleasing to imagine, rather than on evidence, rationality, or reality. It is a product of resolving conflicts between belief and desire. Methodologies to examine wishful thinking are diverse. Various disciplines and schools of thought examine related mechanisms such as neural circuitry, human cognition and emotion, types of bias, procrastination, motivation, optimism, attention and environment. This concept has been examined as a fallacy. It is related to the concept of wishful seeing.
Moral reasoning is the study of how people think about right and wrong and how they acquire and apply moral rules. It is a subdiscipline of moral psychology that overlaps with moral philosophy, and is the foundation of descriptive ethics.
The hostile media effect, originally deemed the hostile media phenomenon and sometimes called hostile media perception, is a perceptual theory of mass communication that refers to the tendency for individuals with a strong preexisting attitude on an issue to perceive media coverage as biased against their side and in favor of their antagonists' point of view. Partisans from opposite sides of an issue will tend to find the same coverage to be biased against them. The phenomenon was first proposed and studied experimentally by Robert Vallone, Lee Ross and Mark Lepper.
Ziva Kunda was an Israeli social psychologist and professor at the University of Waterloo known for her work in social cognition and motivated reasoning. Her seminal paper "The Case for Motivated Reasoning", published in Psychological Bulletin in 1990, posthumously received the Scientific Impact Award from the Society of Experimental Social Psychology. Kunda authored the book Social Cognition: Making Sense of People.
Attitudes are associated beliefs and behaviors towards some object. They are not stable, and because of the communication and behavior of other people, are subject to change by social influences, as well as by the individual's motivation to maintain cognitive consistency when cognitive dissonance occurs—when two attitudes or attitude and behavior conflict. Attitudes and attitude objects are functions of affective and cognitive components. It has been suggested that the inter-structural composition of an associative network can be altered by the activation of a single node. Thus, by activating an affective or emotional node, attitude change may be possible, though affective and cognitive components tend to be intertwined.
Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they justify that conclusion. A person is more likely to accept an argument that supports a conclusion aligned with their values, beliefs and prior knowledge, while rejecting counterarguments to the conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relational reasoning and transitive reasoning.
Choice-supportive bias or post-purchase rationalization is the tendency to retroactively ascribe positive attributes to an option one has selected and/or to demote the forgone options. It is part of cognitive science, and is a distinct cognitive bias that occurs once a decision is made. For example, if a person chooses option A instead of option B, they are likely to ignore or downplay the faults of option A while amplifying or ascribing new negative faults to option B. Conversely, they are also likely to notice and amplify the advantages of option A and not notice or de-emphasize those of option B.
Affect, in psychology, is the underlying experience of feeling, emotion, attachment, or mood. It encompasses a wide range of emotional states and can be positive or negative. Affect is a fundamental aspect of human experience and plays a central role in many psychological theories and studies. It can be understood as a combination of three components: emotion, mood, and affectivity. In psychology, the term affect is often used interchangeably with several related terms and concepts, though each term may have slightly different nuances. These terms encompass: emotion, feeling, mood, emotional state, sentiment, affective state, emotional response, affective reactivity, disposition. Researchers and psychologists may employ specific terms based on their focus and the context of their work.
In psychology, a dual process theory provides an account of how thought can arise in two different ways, or as a result of two different processes. Often, the two processes consist of an implicit (automatic), unconscious process and an explicit (controlled), conscious process. Verbalized explicit processes or attitudes and actions may change with persuasion or education, though implicit processes or attitudes usually take a long time to change through the forming of new habits. Dual process theories can be found in social, personality, cognitive, and clinical psychology. They have also been linked with economics via prospect theory and behavioral economics, and increasingly in sociology through cultural analysis.
Hot cognition is a hypothesis on motivated reasoning in which a person's thinking is influenced by their emotional state. Put simply, hot cognition is cognition coloured by emotion. Hot cognition contrasts with cold cognition, which implies cognitive processing of information that is independent of emotional involvement. Hot cognition is proposed to be associated with cognitive and physiological arousal, in which a person is more responsive to environmental factors. As it is automatic, rapid and led by emotion, hot cognition may consequently cause biased decision making. Hot cognition may arise, with varying degrees of strength, in politics, religion, and other sociopolitical contexts because of moral issues, which are inevitably tied to emotion. Hot cognition was initially proposed in 1963 by Robert P. Abelson. The idea became popular in the 1960s and the 1970s.
Selective exposure is a theory within the practice of psychology, often used in media and communication research, that historically refers to individuals' tendency to favor information which reinforces their pre-existing views while avoiding contradictory information. Selective exposure has also been known and defined as "congeniality bias" or "confirmation bias" in various texts throughout the years.
The ostrich effect, also known as the ostrich problem, was originally coined by Galai & Sade (2003). The name comes from the common legend that ostriches bury their heads in the sand to avoid danger. This effect is a cognitive bias where people tend to “bury their head in the sand” and avoid potentially negative but useful information, such as feedback on progress, to avoid psychological discomfort.
In social psychology, a motivated tactician is someone who shifts between quick-and-dirty cognitively economical tactics and more thoughtful, thorough strategies when processing information, depending on the type and degree of motivation. Such behavior is a type of motivated reasoning. The idea has been used to explain why people use stereotyping, biases and categorization in some situations, and more analytical thinking in others.
Cognitive inertia is the tendency for a particular orientation in how an individual thinks about an issue, belief, or strategy to resist change. Clinical and neuroscientific literature often defines it as a lack of motivation to generate distinct cognitive processes needed to attend to a problem or issue. The physics term inertia emphasizes the rigidity and resistance to change in the method of cognitive processing that has been used for a significant amount of time. Commonly confused with belief perseverance, cognitive inertia is the perseverance of how one interprets information, not the perseverance of the belief itself.
In consumer behaviour studies, the blissful ignorance effect is the finding that people who have detailed information about a product tend to be less happy with it than people who have less information. A purchaser wants to feel that they have bought the right thing, but someone who already knows how the product works has a harder time justifying the purchase to themselves if it has any problems.