Belief bias

Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than on how strongly they support that conclusion. [1] A person is more likely to accept an argument that supports a conclusion aligned with their values, beliefs and prior knowledge, and to reject counterarguments to that conclusion. [2] Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, [3] transitive reasoning [4] and relational reasoning. [5]

Syllogisms

A syllogism is a kind of logical argument in which one proposition (the conclusion) is inferred from two or more others (the premises) of a specific form. The classical example of a valid syllogism is:

All humans are mortal. (major premise)
Socrates is human. (minor premise)
Therefore, Socrates is mortal. (conclusion)

An example of an invalid syllogism is:

All teenage girls are ambitious.
Teenage girls study hard.
Therefore, girls study hard because they are ambitious.

Typically, a majority of test subjects in studies incorrectly identify this syllogism as one in which the conclusion follows from the premises. [1] It might be true in the real world that (a) girls study hard and (b) they do so because they are ambitious. However, the argument is a fallacy, because the conclusion is not supported by its premises. The validity of an argument is independent of the truth of its conclusion: there are valid arguments for false conclusions and invalid arguments for true conclusions. Hence, it is an error to judge the validity of an argument by the plausibility of its conclusion. This is the reasoning error known as belief bias. [1]
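
The distinction between validity and truth can be made mechanical. The sketch below (an illustration added for concreteness, not drawn from the cited studies; the predicate names and the reading of the causal "because" are assumptions) brute-forces small interpretations in search of a countermodel: an interpretation in which every premise is true but the conclusion is false. Finding one proves invalidity.

```python
# Minimal countermodel search for the two syllogisms above (illustrative).
from itertools import combinations, product

def powerset(domain):
    """All subsets of the domain, as frozensets."""
    return [frozenset(c) for r in range(len(domain) + 1)
            for c in combinations(domain, r)]

def has_countermodel(premises, conclusion, n_preds, domain_size=3):
    """Try every assignment of predicate extensions over a small domain,
    looking for one that makes all premises true and the conclusion false.
    Finding one proves the argument invalid; finding none at small sizes is
    strong evidence of validity for simple monadic arguments like these."""
    domain = list(range(domain_size))
    for ext in product(powerset(domain), repeat=n_preds):
        if all(p(ext) for p in premises) and not conclusion(ext):
            return True
    return False

# "All humans are mortal; Socrates is human; therefore Socrates is mortal."
# Predicates: ext[0] = Human, ext[1] = Mortal; element 0 plays Socrates.
valid_premises = [lambda ext: ext[0] <= ext[1],  # all humans are mortal
                  lambda ext: 0 in ext[0]]       # Socrates is human
valid_conclusion = lambda ext: 0 in ext[1]       # Socrates is mortal

# "All teenage girls are ambitious; teenage girls study hard; therefore girls
# study hard because they are ambitious."  The causal "because" cannot be
# expressed in this language; it is read loosely here as "all ambitious
# people study hard", one way of making the fallacy explicit.
# Predicates: ext[0] = Girl, ext[1] = Ambitious, ext[2] = StudiesHard.
invalid_premises = [lambda ext: ext[0] <= ext[1],
                    lambda ext: ext[0] <= ext[2]]
invalid_conclusion = lambda ext: ext[1] <= ext[2]

print(has_countermodel(valid_premises, valid_conclusion, n_preds=2))      # False
print(has_countermodel(invalid_premises, invalid_conclusion, n_preds=3))  # True
```

The valid syllogism admits no countermodel, while the invalid one fails as soon as the search considers an ambitious person who is neither a teenage girl nor a hard studier: the premises say nothing about such a person, so the conclusion does not follow.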

When a person gives a response that is determined by the believability of the conclusion rather than by logical validity, this is referred to as belief bias only when a syllogism is used. The phenomenon is so closely tied to syllogistic reasoning that when comparable effects occur in other tasks, such as Wason's selection task or the THOG problem, they are instead called "memory cueing" or the "effects of content". [2]

Dual-process theory of belief bias

Many researchers in thinking and reasoning have provided evidence for a dual-process cognitive approach to reasoning, judgment and decision making. They argue that these two mental processes (system 1 and system 2) engage in a constant battle for control of reasoning and decision making. System 1 can be described as an automatic response system, [6] characterised by "unconscious", [7] "intuitive" [8] and "rapid" [6] evaluation, whereas system 2 is a controlled response system, [6] characterised by "conscious", [7] "analytic" [8] and "slow" [6] evaluation. Some researchers have even claimed to find a link between general intelligence and the effectiveness of decision making. [9] [10] It is important to note that dual-process cognitive theory is distinct from the two minds hypothesis. Research by Jonathan St B. T. Evans in 2007 provided evidence for the view that system 1, which serves as a quick heuristic processor, fights for control over system 2's slower analytic approach. [11] In the experiment, participants were asked to evaluate syllogisms of four kinds: valid arguments with unbelievable conclusions, valid arguments with believable conclusions, invalid arguments with unbelievable conclusions, and invalid arguments with believable conclusions. The results show that when a conclusion is believable, people accept invalid arguments as valid far more often than when it is unbelievable.
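
To make the four-cell design concrete, the sketch below (illustrative items typical of the belief-bias literature, not the actual stimuli from the 2007 experiment) pairs each validity-believability combination with an example and shows where a purely belief-driven system 1 and a purely logic-driven system 2 disagree:

```python
# The 2x2 design: validity x believability (illustrative example items).
trials = [
    # (syllogism, is_valid, is_believable)
    ("All fish can swim; trout are fish; so trout can swim.",           True,  True),
    ("All mammals can walk; whales are mammals; so whales can walk.",   True,  False),
    ("All flowers need water; roses need water; so roses are flowers.", False, True),
    ("All fish can swim; whales can swim; so whales are fish.",         False, False),
]

for text, is_valid, is_believable in trials:
    system1 = is_believable        # belief-driven heuristic: accept if believable
    system2 = is_valid             # logic-driven analysis: accept if valid
    conflict = system1 != system2  # the two processes disagree only here
    print(f"conflict={conflict!s:<5} S1_accepts={system1!s:<5} "
          f"S2_accepts={system2!s:<5} {text}")
```

Belief bias appears as inflated acceptance in the invalid-believable cell and depressed acceptance in the valid-unbelievable cell, precisely the two rows where the hypothetical systems conflict.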

Influencing factors of belief bias

Time

Various studies have shown that the amount of time a subject is allowed when evaluating arguments affects the tendency for belief bias to occur. In a study by Evans and Curtis-Holmes in 2005, [12] two groups of people answered a series of reasoning questions. One group was given only two seconds to answer each question, while the other group was allowed as much time as they liked. A higher percentage of incorrect answers was found in the time-pressured group, and the authors concluded that this reflects a shift from logical to belief-based thinking under time pressure.

Nature of content

The nature of the content presented can also affect an individual's belief bias, as shown by a study by Goel & Vartanian in 2011. [13] In their experiment, 34 participants were presented with one syllogism per trial. Each syllogism was either neutral or carried some degree of negative content. Negative content took the form of politically incorrect social-norm violations, such as: "Some wars are not unjustified; some wars involve raping of women; therefore, some raping of women is not unjustified." For syllogisms with neutral content, the results were consistent with earlier studies of belief bias; for syllogisms with negative emotional content, however, participants were more likely to reason logically about invalid syllogisms with believable conclusions instead of automatically judging them to be valid. In other words, the effect of belief bias is reduced when the content presented carries negative emotion. According to Goel and Vartanian, this is because negative emotion prompts more careful and detailed reasoning. This interpretation was supported by the observation that reaction times for questions with negative emotional content were significantly longer than for neutral questions.

Instructions given

In an experiment by Evans, Newstead, Allen & Pollard in 1994, [14] subjects who were given detailed instructions emphasising the notion of logical necessity rejected invalid arguments with believable conclusions in a larger proportion of their answers than subjects who received no further instructions. These results indicate that when subjects are explicitly instructed to reason logically, the effect of belief bias is decreased.

Research

In a series of experiments by Evans, Barston and Pollard (1983), [15] participants were presented with evaluation tasks containing two premises and a conclusion and were asked to evaluate logical validity. The subjects, however, exhibited belief bias: they tended to reject valid arguments with unbelievable conclusions and to endorse invalid arguments with believable conclusions. Instead of following the instructions and assessing logical validity, the subjects based their assessments on personal beliefs.

Overall, these results showed much greater acceptance of believable (80%) than unbelievable (33%) conclusions. Participants also displayed some logical competence, accepting more valid (73%) than invalid (41%) arguments. In addition, logic and belief interacted: valid arguments with believable conclusions were accepted at a higher rate (89%) than valid arguments with unbelievable conclusions (56%) (Evans, Barston & Pollard, 1983; Morley, Evans & Handley, 2004). [15] [16]
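
Read as simple differences in percentage points (a back-of-the-envelope summary of the figures above, not an analysis taken from the papers), belief exerted a larger pull on acceptance than logic did:

```python
# Acceptance rates reported by Evans, Barston & Pollard (1983), in percent.
accept_believable, accept_unbelievable = 80, 33
accept_valid, accept_invalid = 73, 41

belief_effect = accept_believable - accept_unbelievable  # 47 points
logic_effect = accept_valid - accept_invalid             # 32 points

print(f"belief effect: {belief_effect} percentage points")  # 47
print(f"logic effect:  {logic_effect} percentage points")   # 32
```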

It has been argued that using more realistic content in syllogisms can facilitate more normative performance from participants, while more abstract, artificial content has a biasing effect of its own. More research is therefore required to fully understand how and why belief bias occurs and which mechanisms are responsible for it. [17] There is also evidence of clear individual differences in normative responding that are predicted by participants' response times. [18]

A 1989 study by Markovits and Nantel gave participants four reasoning tasks. The results indicated “a significant belief-bias effect” that existed “independently of the subjects' abstract reasoning ability.” [19]

A 2010 study by Donna Torrens examined differences in belief bias among individuals. Torrens found that “the extent of an individual's belief bias effect was unrelated to a number of measures of reasoning competence” but was, instead, related to that person's ability “to generate alternative representations of premises: the more alternatives a person generated, the less likely they were to show a belief bias effect." [20]

In a 2010 study, Chad Dube and Caren M. Rotello of the University of Massachusetts and Evan Heit of the University of California, Merced, showed that “the belief bias effect is simply a response bias effect.” [21]

In a 2012 study, Adrian P. Banks of the University of Surrey proposed that “belief bias is caused by the believability of a conclusion in working memory which influences its activation level, determining its likelihood of retrieval and therefore its effect on the reasoning process.” [22]

Michelle Colleen Elizabeth Hilscher of the University of Toronto showed in 2014 that belief bias can be affected by the difficulty and placement of the syllogism in question. [23]

See also

Argument
Cognitive bias
Cognitive inertia
Confirmation bias
Deductive reasoning
Dual process theory
Fallacy
Formal fallacy
Hot cognition
Inference
Logic-based therapy
Moral reasoning
Motivated reasoning
Need for cognition
Psychology of reasoning
Rhyme-as-reason effect
Syllogism

References

1. Robert J. Sternberg; Jacqueline P. Leighton (2004). The Nature of Reasoning. Cambridge University Press. p. 300. ISBN 978-0-521-00928-7. Retrieved 3 September 2013.
2. Evans, Jonathan; Newstead, Stephen; Byrne, Ruth (1993). Human Reasoning: The Psychology of Deduction. Lawrence Erlbaum Associates. p. 243. ISBN 9780863773136. Retrieved 26 January 2017.
3. Evans, Jonathan St. B. T.; Handley, Simon J.; Bacon, Alison M. (2009). "Reasoning Under Time Pressure". Experimental Psychology. 56 (2): 77–83. doi:10.1027/1618-3169.56.2.77. ISSN 1618-3169. PMID 19261582.
4. Andrews, Glenda (2010). "Belief-based and analytic processing in transitive inference depends on premise integration difficulty". Memory & Cognition. 38 (7): 928–940. doi:10.3758/MC.38.7.928. hdl:10072/35167. ISSN 0090-502X. PMID 20921105.
5. Roberts, Maxwell J.; Sykes, Elizabeth D. A. (2003). "Belief bias and relational reasoning". The Quarterly Journal of Experimental Psychology Section A. 56 (1): 131–154. doi:10.1080/02724980244000233. ISSN 0272-4987. PMID 12587899. S2CID 44544112.
6. Schneider, Walter; Shiffrin, Richard M. (1977). "Controlled and automatic human information processing: I. Detection, search, and attention". Psychological Review. 84 (1): 1–66. doi:10.1037/0033-295x.84.1.1. ISSN 1939-1471.
7. Wilson, Barbara J.; Smith, Stacy L.; Potter, W. James; Kunkel, Dale; Linz, Daniel; Colvin, Carolyn M.; Donnerstein, Edward (2002). "Violence in Children's Television Programming: Assessing the Risks". Journal of Communication. 52 (1): 5–35. doi:10.1111/j.1460-2466.2002.tb02531.x. ISSN 1460-2466.
8. Hammond, Thomas H. (1996). "Formal Theory and the Institutions of Governance". Governance. 9 (2): 107–185. doi:10.1111/j.1468-0491.1996.tb00237.x. ISSN 1468-0491.
9. Reber, Arthur S. (1993). Implicit Learning and Tacit Knowledge: An Essay on the Cognitive Unconscious. Oxford University Press. ISBN 0-19-510658-X.
10. Sá, Walter C.; West, Richard F.; Stanovich, Keith E. (1999). "The domain specificity and generality of belief bias: Searching for a generalizable critical thinking skill". Journal of Educational Psychology. 91 (3): 497–510. doi:10.1037/0022-0663.91.3.497. ISSN 1939-2176. S2CID 8184173.
11. Evans, Jonathan St B. T. (2007). "On the resolution of conflict in dual process theories of reasoning". Thinking & Reasoning. 13 (4): 321–339. doi:10.1080/13546780601008825. ISSN 1354-6783. S2CID 121501595.
12. Evans, Jonathan St B. T.; Curtis-Holmes, Jodie (2005). "Rapid responding increases belief bias: Evidence for the dual-process theory of reasoning". Thinking & Reasoning. 11 (4): 382–389. doi:10.1080/13546780542000005. ISSN 1354-6783. S2CID 145290672.
13. Goel, Vinod; Vartanian, Oshin (2011). "Negative emotions can attenuate the influence of beliefs on logical reasoning". Cognition and Emotion. 25 (1): 121–131. doi:10.1080/02699931003593942. ISSN 0269-9931. PMID 21432659. S2CID 21884466.
14. Newstead, Stephen E.; Pollard, Paul; Evans, Jonathan St. B. T.; Allen, Julie L. (1992). "The source of belief bias effects in syllogistic reasoning". Cognition. 45 (3): 257–284. doi:10.1016/0010-0277(92)90019-E. PMID 1490324. S2CID 42135889.
15. Evans, J. St. B. T.; Barston, J. L.; Pollard, P. (1983). "On the conflict between logic and belief in syllogistic reasoning". Memory and Cognition. 11 (3): 295–306. doi:10.3758/bf03196976. PMID 6621345.
16. Morley, N. J.; Evans, J. St. B. T.; Handley, S. J. (2004). "Belief bias and figural bias in syllogistic reasoning". The Quarterly Journal of Experimental Psychology. 57A (4): 666–692. doi:10.1080/02724980343000440. PMID 15204128. S2CID 9965828.
17. Ding, Daoqun; Chen, Yang; Lai, Ji; Chen, Xiyou; Han, Meng; Zhang, Xiangyi (2020). "Belief Bias Effect in Older Adults: Roles of Working Memory and Need for Cognition". Frontiers in Psychology. 10: 2940. doi:10.3389/fpsyg.2019.02940. ISSN 1664-1078. PMC 6990430. PMID 32038362.
18. Stupple, E. J. N.; Ball, L. J.; Evans, J. St. B. T.; Kamal-Smith, E. (2011). "When logic and belief collide: Individual differences in reasoning times support a selective processing model". Journal of Cognitive Psychology. 23 (8): 931–941. doi:10.1080/20445911.2011.589381. hdl:10545/575936. S2CID 143396820.
19. Markovits, H.; Nantel, G. (1989). "The belief-bias effect in the production and evaluation of logical conclusions". Memory and Cognition. 17 (1): 11–17. doi:10.3758/bf03199552. PMID 2913452.
20. Torrens, Donna (2010). "Individual Differences and the Belief Bias Effect: Mental Models, Logical Necessity, and Abstract Reasoning". Thinking and Reasoning. 5 (1): 1–28. doi:10.1080/135467899394066.
21. Dube, Chad; Rotello, Caren; Heit, Evan (2011). "The Belief Bias Effect Is Aptly Named: A Reply to Klauer and Kellen (2011)" (PDF). Psychological Review. 118 (1): 155–163. doi:10.1037/a0021774. PMID 21244191. Retrieved 6 December 2016.
22. Banks, Adrian (2013). "The Influence of Activation Level on Belief Bias in Relational Reasoning" (PDF). Cognitive Science. 37 (3): 544–577. doi:10.1111/cogs.12017. PMID 23294043. Retrieved 6 December 2016.
23. Hilscher, Michelle. "Attenuating Belief Bias Effects in Syllogistic Reasoning: The Role of Belief-Content Conflict" (PDF). University of Toronto. Retrieved 6 December 2016.

Further reading