Motivated reasoning is a cognitive and social response in which individuals subconsciously allow emotional and personal biases to affect how new information is perceived. Individuals tend to favor arguments that coincide with their current beliefs and to reject new information that contradicts them, regardless of the evidence. Individuals may draw self-serving conclusions when those conclusions match prior biases.
Motivated reasoning is similar to confirmation bias, where evidence that confirms a belief (which might be a logical belief, rather than an emotional one) is either sought after more or given more credibility than evidence that disconfirms a belief. It stands in contrast to critical thinking where beliefs are approached in a skeptical and unbiased fashion.
It can lead to forming and clinging to false beliefs despite substantial evidence to the contrary. The desired outcome acts as a filter that affects the evaluation of scientific evidence and other people. 
Early research on how humans evaluated and integrated information supported a cognitive approach consistent with Bayesian probability, in which individuals weighted new information using rational calculations.  More recent theories endorse these cognitive processes as only partial explanations of motivated reasoning but have also introduced motivational  or affective processes.  These processes illuminate the mechanisms of the bias inherent in cases of motivated reasoning. To further complicate the issue, the first neuro-imaging study designed to test the neural circuitry of individuals engaged in motivated reasoning found that motivated reasoning "was not associated with neural activity in regions previously linked with cold reasoning tasks [Bayesian reasoning] and conscious (explicit) emotion regulation".  This section focuses on two theories that elucidate the mechanisms involved in motivated reasoning.
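The contrast between the Bayesian account of belief updating and motivated reasoning can be illustrated with a minimal sketch. This is an illustrative toy model only, not drawn from the studies cited here; the function names and the `discount` parameter are invented for illustration. A rational updater applies Bayes' rule to every piece of evidence, while the "motivated" updater discounts evidence that disconfirms the favored hypothesis:

```python
def bayes_update(prior, lr):
    """Bayes' rule in odds form: posterior odds = prior odds x likelihood ratio."""
    odds = (prior / (1 - prior)) * lr
    return odds / (1 + odds)

def motivated_update(prior, lr, discount=0.25):
    """Toy motivated reasoner: disconfirming evidence (likelihood ratio < 1)
    is treated as less diagnostic by raising it to a power < 1, pulling it
    toward 1; confirming evidence is accepted at face value."""
    if lr < 1:
        lr = lr ** discount
    return bayes_update(prior, lr)

# Start with a strong belief (P = 0.9), then observe five pieces of
# disconfirming evidence, each with likelihood ratio 0.5.
belief, motivated = 0.9, 0.9
for _ in range(5):
    belief = bayes_update(belief, 0.5)
    motivated = motivated_update(motivated, 0.5)
print(round(belief, 3), round(motivated, 3))  # prints: 0.22 0.791
```

The rational updater's confidence falls sharply, while the motivated updater's belief barely moves — a stylized version of the asymmetric evidence-weighting described above.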
One review of the research by Kunda develops the following theoretical model to explain the mechanism by which motivated reasoning results in bias.  The model is summarized as follows:
Motivation to arrive at a desired conclusion provides a level of arousal, which acts as an initial trigger for the operation of cognitive processes. For someone to participate in motivated reasoning, either consciously or subconsciously, that individual first needs to be motivated. Motivation then affects reasoning by influencing the knowledge structures (beliefs, memories, information) that are accessed and the cognitive processes used.
In comparison, Milton Lodge and Charles Taber  introduce an empirically supported model in which affect is intricately tied to cognition, and information processing is biased toward support for positions that the individual already holds.
This model has three components:
This theory of motivated reasoning is fully developed and tested in Lodge and Taber's The Rationalizing Voter (2013).  David Redlawsk (2002) found that the timing of when disconfirming information was introduced played a role in determining bias. When subjects encountered incongruity during an information search, the automatic assimilation and update process was interrupted. This results in one of two outcomes: subjects may enhance attitude strength in a desire to support existing affect (resulting in degradation in decision quality and potential bias) or, subjects may counter-argue existing beliefs in an attempt to integrate the new data.  This second outcome is consistent with the research on how processing occurs when one is tasked with accuracy goals.
To summarize, the two models differ in that M. Lodge and C. Taber identify a primary role for affect in guiding cognitive processes and in maintaining bias. In contrast, Kunda identifies a primary role for cognitive strategies such as memory processes, and the use of rules in determining biased information selection. At least one study in neuroscience does not support the use of cognitive processes in motivated reasoning, lending greater support to affective processing as a key mechanism in supporting bias.[citation needed]
Motivated reasoning can be classified into two categories: (1) those in which the motive is to arrive at an accurate conclusion, irrespective of the individual's beliefs, and (2) those in which the motive is to arrive at a particular, directional conclusion. The mechanisms at play differ between these two scenarios.
Several works on accuracy-driven reasoning suggest that when people are motivated to be accurate, they expend more cognitive effort, attend to relevant information more carefully, and process it more deeply, often using more complex rules.
Kunda asserts that accuracy goals delay the process of coming to a premature conclusion, in that accuracy goals increase both the quantity and quality of processing—particularly in leading to more complex inferential cognitive processing procedures. When researchers manipulated test subjects' motivation to be accurate by informing them that the target task was highly important or that they would be expected to defend their judgments, it was found that subjects utilized deeper processing and that there was less biasing of information. This was true when accuracy motives were present at the initial processing and encoding of information. In reviewing a line of research on accuracy goals and bias, Kunda concludes, "several different kinds of biases have been shown to weaken in the presence of accuracy goals". However, accuracy goals do not always eliminate biases and improve reasoning. Some biases (such as those resulting from use of the availability heuristic) may be resistant to accuracy manipulations. For accuracy to reduce bias, the following conditions must be present:
These last two conditions introduce the construct that accuracy goals include a conscious process of utilizing cognitive strategies in motivated reasoning. This construct is called into question by later neuroscience research that concludes that motivated reasoning is qualitatively distinct from reasoning in instances when there is no strong emotional stake in the outcomes. Accuracy-oriented individuals who are thought to use "objective" processing can still vary in how they update information, depending on how much faith they place in a given piece of evidence; an inability to detect misinformation can lead to beliefs that diverge from scientific consensus.
Directional goals enhance the accessibility of knowledge structures (memories, beliefs, information) that are consistent with desired conclusions. According to Kunda, such goals can lead to biased memory search and belief construction mechanisms. Several studies serve as evidence for the effect of directional goals on the selection and construction of beliefs about oneself, other people, and the world. Cognitive dissonance research provides extensive evidence that people may bias their self-characterizations when motivated to do so. Other biases, such as confirmation bias, the prior attitude effect, and disconfirmation bias, could contribute to goal-oriented motivated reasoning. For example, in one study, subjects altered their self-view by viewing themselves as more extroverted when induced to believe that extroversion was beneficial.
The motivation to achieve directional goals could also influence which rules (procedural structures, such as inferential rules) are accessed to guide the search for information. Studies also suggest that evaluation of scientific evidence may be biased by whether the conclusions are in-line with the reader's beliefs.
Even in goal-oriented motivated reasoning, people are not at liberty to conclude whatever they want merely because they want to. People tend to draw conclusions only if they can muster the evidence necessary to support them: they search memory for beliefs and rules that could support the desired conclusion, or they may create new beliefs to logically support their desired goals.
Despite the differences in information processing, an accuracy-motivated individual and a goal-motivated individual can reach the same conclusion. The distinction matters for crafting effective communication: those who are accuracy-motivated respond better to credible evidence tailored to their community, while those who are goal-oriented feel less threatened when the issue is framed to fit their identity or values.
The topic of climate change is a prime example of motivated reasoning about whether to believe in a phenomenon. Climate change is an increasingly prominent issue in the US specifically. Despite extensive facts and evidence, many people continue to debate whether the issue is what it seems to be: some deny climate change outright, call it a hoax, or attribute it to government conspiracies. "A significant segment of the American public has fixed beliefs, either because they are not politically engaged, or because they hold strong beliefs that are unlikely to change". Even with abundant evidence available, from emissions data to melting ice caps, the phenomenon of motivated reasoning keeps some people entrenched in the belief that climate change is not real. In addition, the phenomenon could influence goal-oriented climate skeptics not only to discredit scientific information on human-induced climate change but also to seek contrary evidence that leads to a posterior belief of even greater skepticism.
Social media is used for many purposes and is a primary source of information for many people, much of which is opinionated or biased. Motivated reasoning shapes how that information spreads: "However, motivated reasoning suggests that informational uses of social media are conditioned by various social and cultural ways of thinking". Because all manner of ideas and opinions are shared, motivated reasoning and bias can easily intrude when one searches for answers or facts on the internet or in any news source.
As stated above, neuroscience research suggests that "motivated reasoning is qualitatively distinct from reasoning when people do not have a strong emotional stake in the conclusions reached." However, if there is a strong emotion attached during a previous round of motivated reasoning and that emotion is again present when the individual's conclusion is reached, a strong emotional stake becomes attached to the conclusion. Any new information regarding that conclusion will cause motivated reasoning to recur. This can create pathways within the neural network that further ingrain the reasoned beliefs of that individual along neural networks similar to those where logical reasoning occurs, causing the strong emotion to recur when the individual is confronted with contradictory information, time and time again. This is what Lodge and Taber refer to as affective contagion. But instead of "infecting" other individuals, the emotion "infects" the individual's reasoning pathways and conclusions.
Social science research suggests that reasoning away contradictions is psychologically easier than revising feelings. As previously discussed, emotions are shown to color how "facts" are perceived. Feelings come first, and evidence is used in service of those feelings. Evidence that supports what is already believed is accepted. Evidence which contradicts those beliefs is not.  An example of motivated reasoning in the public sphere is the fact that many people continued to believe that Barack Obama was not born in the United States in the face of ample evidence that he was. 
In their 2020 study, Van Bavel and colleagues explored the concept of motivated reasoning as a contributor to the spread of misinformation and resistance to public health measures during the COVID-19 pandemic. They observed that people often engage in motivated reasoning when processing information about the pandemic, interpreting it in a way that confirms their pre-existing beliefs and values. For example, people who strongly value individual freedom may be more likely to resist public health measures such as wearing masks or getting vaccinated, despite evidence of their effectiveness. People who distrust the government or scientists are more likely to believe in conspiracy theories or alternative explanations for the pandemic. The authors argue that addressing motivated reasoning is key to promoting effective public health messaging and reducing the spread of misinformation. They suggested several strategies, such as reframing public health messages to align with individuals' values and beliefs, using trusted sources to convey information, and creating social norms that support public health behaviors. Overall, the authors concluded that understanding and addressing motivated reasoning can improve the effectiveness of public health messaging and promote behaviors that protect individuals and communities during the pandemic.
The outcomes of motivated reasoning derive from "a biased set of cognitive processes—that is, strategies for accessing, constructing, and evaluating beliefs. The motivation to be accurate enhances use of those beliefs and strategies that are considered most appropriate, whereas the motivation to arrive at particular conclusions enhances use of those that are considered most likely to yield the desired conclusion."  Careful or "reflective" reasoning has been linked to both overcoming and reinforcing motivated reasoning, suggesting that reflection is not a panacea, but a tool that can be used for rational or irrational purposes depending on other factors.  For example, when people are presented with and forced to think analytically about something complex that they lack adequate knowledge of (i.e. being presented with a new study on meteorology whilst having no degree in the subject), there is no directional shift in thinking, and their extant conclusions are more likely to be supported with motivated reasoning. Conversely, if they are presented with a more simplistic test of analytical thinking that confronts their beliefs (i.e. seeing implausible headlines as false), motivated reasoning is less likely to occur and a directional shift in thinking may result. 
Research on motivated reasoning tested accuracy goals (i.e., reaching correct conclusions) and directional goals (i.e., reaching preferred conclusions). Factors such as these affect perceptions; and results confirm that motivated reasoning affects decision-making and estimates. These results have far-reaching consequences because, when confronted with a small amount of information contrary to an established belief, an individual is motivated to reason away the new information, contributing to the hostile media effect. If this pattern continues over an extended period of time, the individual becomes more entrenched in their beliefs. However, recent studies have shown that motivated reasoning can be overcome. "When the amount of incongruency is relatively small, the heightened negative affect does not necessarily override the motivation to maintain [belief]."[citation needed] However, there is evidence of a theoretical "tipping point" where the amount of incongruent information received by the motivated reasoner can turn certainty into anxiety. This anxiety of being incorrect may lead to a change of opinion.
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in critical thinking skills.
In the field of psychology, cognitive dissonance is the perception of contradictory information and the mental discomfort resulting from it. Relevant items of information include a person's actions, feelings, ideas, beliefs, values, and things in the environment. Cognitive dissonance is typically experienced as psychological stress when a person participates in an action that goes against one or more of those things. According to this theory, when two actions or ideas are not psychologically consistent with each other, people do all in their power to change them until they become consistent. The discomfort is triggered by the person's belief clashing with newly perceived information, wherein the individual tries to find a way to resolve the contradiction to reduce their discomfort.
Appeal to emotion or argumentum ad passiones is an informal fallacy characterized by the manipulation of the recipient's emotions in order to win an argument, especially in the absence of factual evidence. This kind of appeal to emotion is irrelevant to or distracting from the facts of the argument and encompasses several logical fallacies, including appeal to consequences, appeal to fear, appeal to flattery, appeal to pity, appeal to ridicule, appeal to spite, and wishful thinking.
In psychology, attitude is a psychological construct, a mental and emotional entity that inheres in or characterizes a person. An attitude is a person's approach to something, or their personal view of it. Attitude involves mindset, outlook, and feelings. Attitudes are complex and are acquired through life experience. Attitude is an individual's predisposed state of mind regarding a value, precipitated through a responsive expression toward oneself, a person, place, thing, or event, which in turn influences the individual's thought and action.
Wishful thinking is the formation of beliefs based on what might be pleasing to imagine, rather than on evidence, rationality, or reality. It is a product of resolving conflicts between belief and desire.
Moral reasoning is the study of how people think about right and wrong and how they acquire and apply moral rules. It is a subdiscipline of moral psychology that overlaps with moral philosophy, and is the foundation of descriptive ethics.
In psychology, an attribution bias or attributional bias is a cognitive bias that refers to the systematic errors made when people evaluate or try to find reasons for their own and others' behaviors. People constantly make attributions—judgements and assumptions about why people behave in certain ways. However, attributions do not always accurately reflect reality. Rather than operating as objective perceivers, people are prone to perceptual errors that lead to biased interpretations of their social world. Attribution biases are present in everyday life. For example, when a driver cuts someone off, the person who has been cut off is often more likely to attribute blame to the reckless driver's inherent personality traits rather than situational circumstances. Additionally, there are many different types of attribution biases, such as the ultimate attribution error, fundamental attribution error, actor-observer bias, and hostile attribution bias. Each of these biases describes a specific tendency that people exhibit when reasoning about the cause of different behaviors.
In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to “see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances”. In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.
Emotional reasoning is a cognitive process by which an individual concludes that their emotional reaction proves something is true, despite contrary empirical evidence. Emotional reasoning creates an 'emotional truth', which may be in direct conflict with the inverse 'perceptional truth'. It can create feelings of anxiety, fear, and apprehension in existing stressful situations, and as such, is often associated with or triggered by panic disorder or anxiety disorder. For example, even though a spouse has shown only devotion, a person using emotional reasoning might conclude, "I know my spouse is being unfaithful because I feel jealous."
Ziva Kunda was an Israeli social psychologist and professor at the University of Waterloo known for her work in social cognition and motivated reasoning. Her seminal paper "The Case for Motivated Reasoning", published in Psychological Bulletin in 1990, posthumously received the Scientific Impact Award from the Society of Experimental Social Psychology. Kunda authored the book Social Cognition: Making Sense of People.
Attitudes are associated beliefs and behaviors towards some object. They are not stable, and because of the communication and behavior of other people, are subject to change by social influences, as well as by the individual's motivation to maintain cognitive consistency when cognitive dissonance occurs—when two attitudes or attitude and behavior conflict. Attitudes and attitude objects are functions of affective and cognitive components. It has been suggested that the inter-structural composition of an associative network can be altered by the activation of a single node. Thus, by activating an affective or emotional node, attitude change may be possible, though affective and cognitive components tend to be intertwined.
Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion. A person is more likely to accept an argument that supports a conclusion that aligns with their values, beliefs and prior knowledge, while rejecting counter arguments to the conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relation reasoning and transitive reasoning.
In psychology, a dual process theory provides an account of how thought can arise in two different ways, or as a result of two different processes. Often, the two processes consist of an implicit (automatic), unconscious process and an explicit (controlled), conscious process. Verbalized explicit processes or attitudes and actions may change with persuasion or education; though implicit process or attitudes usually take a long amount of time to change with the forming of new habits. Dual process theories can be found in social, personality, cognitive, and clinical psychology. It has also been linked with economics via prospect theory and behavioral economics, and increasingly in sociology through cultural analysis.
Hot cognition is a hypothesis on motivated reasoning in which a person's thinking is influenced by their emotional state. Put simply, hot cognition is cognition coloured by emotion. Hot cognition contrasts with cold cognition, which implies cognitive processing of information that is independent of emotional involvement. Hot cognition is proposed to be associated with cognitive and physiological arousal, in which a person is more responsive to environmental factors. As it is automatic, rapid and led by emotion, hot cognition may consequently cause biased decision making. Hot cognition may arise, with varying degrees of strength, in politics, religion, and other sociopolitical contexts because of moral issues, which are inevitably tied to emotion. Hot cognition was initially proposed in 1963 by Robert P. Abelson. The idea became popular in the 1960s and the 1970s.
Selective exposure is a theory within the practice of psychology, often used in media and communication research, that historically refers to individuals' tendency to favor information which reinforces their pre-existing views while avoiding contradictory information. Selective exposure has also been known and defined as "congeniality bias" or "confirmation bias" in various texts throughout the years.
In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.
In social psychology, a motivated tactician is someone who shifts between quick-and-dirty cognitively economical tactics and more thoughtful, thorough strategies when processing information, depending on the type and degree of motivation. Such behavior is a type of motivated reasoning. The idea has been used to explain why people use stereotyping, biases and categorization in some situations, and more analytical thinking in others.
Cognitive inertia is the tendency for a particular orientation in how an individual thinks about an issue, belief, or strategy to resist change. In clinical and neuroscientific literature it is often defined as a lack of motivation to generate distinct cognitive processes needed to attend to a problem or issue. The physics term inertia is used to emphasize the rigidity and resistance to change in a method of cognitive processing that has been in use for a significant amount of time. Commonly confused with belief perseverance, cognitive inertia is the perseverance of how one interprets information, not the perseverance of the belief itself.
In consumer behaviour studies, the blissful ignorance effect is the finding that people who have good information about a product are often not as happy with it as people who have less information. This occurs because purchasers want to feel they have bought the right thing; a person who already knows how the product works has a harder time justifying the purchase to themselves if it has any problems.