Cognitive bias

The Cognitive Bias Codex, a diagram of more than 180 biases, designed by John Manoogian III (jm3).

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. [1] Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality. [2] [3] [4]

While cognitive biases may initially appear to be negative, some are adaptive. They may lead to more effective actions in a given context. [5] Furthermore, allowing cognitive biases enables faster decisions which can be desirable when timeliness is more valuable than accuracy, as illustrated in heuristics. [6] Other cognitive biases are a "by-product" of human processing limitations, [1] resulting from a lack of appropriate mental mechanisms (bounded rationality), the impact of an individual's constitution and biological state (see embodied cognition), or simply from a limited capacity for information processing. [7] [8]

A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. The study of cognitive biases has practical implications for areas including clinical judgment, entrepreneurship, finance, and management. [9] [10]

Overview

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 [11] and grew out of their experience of people's innumeracy, or inability to reason intuitively with greater orders of magnitude. Tversky, Kahneman, and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. Tversky and Kahneman explained human differences in judgment and decision-making in terms of heuristics: mental shortcuts that provide swift estimates about the likelihood of uncertain occurrences. [12] Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors." [6] For example, the representativeness heuristic is defined as "The tendency to judge the frequency or likelihood" of an occurrence by the extent to which the event "resembles the typical case." [12]

The "Linda Problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983 [13] ). Participants were given a description of "Linda" that suggests Linda might well be a feminist (e.g., she is said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be (a) a "bank teller" or (b) a "bank teller and active in the feminist movement." A majority chose answer (b). Independent of the information given about Linda, though, the more restrictive answer (b) is under any circumstance statistically less likely than answer (a). This is an example of the "conjunction fallacy". Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726).

Critics of Kahneman and Tversky, such as Gerd Gigerenzer, have argued that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases; rather, rationality should be understood as an adaptive tool, not identical to the rules of formal logic or the probability calculus. [14] Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program, which spread beyond academic psychology into other disciplines including medicine and political science.

Definitions

Definition: "bias ... that occurs when humans are processing and interpreting information"
Source: ISO/IEC TR 24027:2021(en), 3.2.4 [15]; ISO/IEC TR 24368:2022(en), 3.8 [16]

Types

Biases can be distinguished on a number of dimensions. Some biases reflect motivation, specifically the motivation to have positive attitudes to oneself. [19] This accounts for the fact that many biases are self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias). There are also biases in how subjects evaluate in-groups and out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when those groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Other biases are due to the particular way the brain perceives, forms memories, and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal.

Some cognitive biases belong to the subgroup of attentional biases, which refers to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the Stroop task [20] [21] and the dot probe task.

Individuals' susceptibility to some types of cognitive biases can be measured by the Cognitive Reflection Test (CRT) developed by Shane Frederick (2005). [22] [23]
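The flavor of the CRT is captured by its best-known item (from Frederick, 2005): a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents is wrong, as the arithmetic shows:

```latex
x + (x + 1.00) = 1.10 \;\implies\; 2x = 0.10 \;\implies\; x = 0.05
```

The ball costs 5 cents. Suppressing the fluent but incorrect first answer is precisely the reflective step the test is designed to measure.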

List of biases

The following is a list of the more commonly studied cognitive biases:

Fundamental attribution error (FAE; also known as correspondence bias [24]): Tendency to overemphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior. Edward E. Jones and Victor A. Harris's (1967) [25] classic study illustrates the FAE: although participants knew the target's speech position (pro-Castro or anti-Castro) had been assigned to the writer, they ignored the situational pressure and attributed pro-Castro attitudes to the writer when the speech expressed them.
Implicit bias (also known as implicit stereotype or unconscious bias): Tendency to attribute positive or negative qualities to a group of individuals. It can be entirely non-factual, or it can be an unwarranted generalization of a frequent trait in a group to all individuals of that group.
Priming bias: Tendency for the first presentation of an issue to shape one's preconceived idea of it, which later information can then only adjust.
Confirmation bias: Tendency to search for or interpret information in a way that confirms one's preconceptions, and to discredit information that does not support the initial opinion. [26] Related to the concept of cognitive dissonance, in that individuals may reduce inconsistency by searching for information that reconfirms their views (Jermias, 2001, p. 146). [27]
Affinity bias: Tendency to be favorably biased toward people most like ourselves. [28]
Self-serving bias: Tendency to claim more responsibility for successes than for failures. It may also manifest as a tendency to evaluate ambiguous information in a way that benefits one's own interests.
Belief bias: Tendency to evaluate the logical strength of an argument based on one's current beliefs and the perceived plausibility of the argument's conclusion.
Framing: Tendency to narrow the description of a situation in order to guide the audience toward a selected conclusion; the same information can be framed differently and therefore lead to different conclusions.
Hindsight bias: Tendency to view past events as having been predictable; also called the "I-knew-it-all-along" effect.
Embodied cognition: Tendency toward selectivity in perception, attention, decision making, and motivation based on the biological state of the body.
Anchoring bias: Tendency to make insufficient adjustments from an initial starting point when arriving at a final answer, which can lead to sub-optimal decisions. Anchoring affects decision making in negotiations, medical diagnoses, and judicial sentencing. [29]
Status quo bias: Tendency to hold to the current situation rather than an alternative, in order to avoid risk and loss (loss aversion). [30] A decision-maker exhibiting status quo bias has an increased propensity to choose an option because it is the default. This has been shown to affect various important economic decisions, such as the choice of car insurance or electrical service. [31]
Overconfidence effect: Tendency to place excessive trust in one's own capability to make correct decisions; people tend to overrate their abilities and skills as decision makers. [32] See also the Dunning–Kruger effect.
Physical attractiveness stereotype: Tendency to assume that people who are physically attractive also possess other desirable personality traits. [33]

Practical significance

Many social institutions rely on individuals to make rational judgments.

The securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly, and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things. [34] However, they fail in systematic, directional ways that are predictable. [4]

In some academic disciplines, the study of bias is very popular. In entrepreneurship, for instance, bias is a widespread and well-studied phenomenon because most of the decisions that confront entrepreneurs are computationally intractable. [10]

Cognitive biases can also create issues in everyday life. One study examined how cognitive bias, specifically approach bias, and inhibitory control affect how much unhealthy snack food a person eats. [35] Participants who ate more of the unhealthy snack food tended to have less inhibitory control and to rely more on approach bias. Others have also hypothesized that cognitive biases could be linked to various eating disorders and to how people view their bodies and their body image. [36] [37]

It has also been argued that cognitive biases can be used in destructive ways. [38] Some believe that people in authority use cognitive biases and heuristics to manipulate others in pursuit of their own ends. Marketing for some medications and other health care treatments exploits cognitive biases to persuade susceptible consumers to buy the products. Many see this as taking advantage of people's natural weaknesses in judgment and decision-making, and believe it is the government's responsibility to regulate such misleading ads.

Cognitive biases also seem to play a role in property sale prices and values. In one experiment, participants were shown a residential property and then a second, completely unrelated property, and were asked to estimate the value and likely sale price of the second property. [39] Viewing the unrelated property affected how participants valued the second property.

Cognitive biases can be used in non-destructive ways. In team science and collective problem-solving, the superiority bias can be beneficial. It leads to a diversity of solutions within a group, especially in complex problems, by preventing premature consensus on suboptimal solutions. This example demonstrates how a cognitive bias, typically seen as a hindrance, can enhance collective decision-making by encouraging a wider exploration of possibilities. [40]

Reducing

Because they cause systematic errors, cognitive biases cannot be compensated for using a wisdom of the crowd technique of averaging answers from several people. [41] Debiasing is the reduction of biases in judgment and decision-making through incentives, nudges, and training. Cognitive bias mitigation and cognitive bias modification are forms of debiasing specifically applicable to cognitive biases and their effects. Reference class forecasting is a method for systematically debiasing estimates and decisions, based on what Daniel Kahneman has dubbed the outside view.
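A minimal simulation sketches why averaging fails here (the numbers are illustrative assumptions, not taken from the cited source): averaging cancels the independent random noise in many judgments, but an error shared by every judge, a systematic bias, survives untouched in the crowd's mean.

```python
import random

random.seed(0)

TRUE_VALUE = 100.0       # the quantity the crowd is estimating
SYSTEMATIC_BIAS = 15.0   # bias shared by everyone, e.g. a common anchor (illustrative)
NOISE_SD = 30.0          # independent, idiosyncratic error per person
N_PEOPLE = 10_000

# Each person's judgment = truth + shared bias + private noise.
estimates = [
    TRUE_VALUE + SYSTEMATIC_BIAS + random.gauss(0, NOISE_SD)
    for _ in range(N_PEOPLE)
]

crowd_mean = sum(estimates) / N_PEOPLE
print(f"true value : {TRUE_VALUE:.1f}")
print(f"crowd mean : {crowd_mean:.1f}")                # ~115: the noise has averaged out
print(f"residual   : {crowd_mean - TRUE_VALUE:+.1f}")  # ~+15: the shared bias remains
```

With ten thousand judges the idiosyncratic noise all but vanishes, yet the crowd's mean still sits roughly 15 units above the truth, which is why debiasing targets the individual judgment process rather than the aggregation step.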

Similar to Gigerenzer (1996), [42] Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730). [1] Moreover, cognitive biases can be controlled. One debiasing technique aims to decrease biases by encouraging individuals to use controlled processing rather than automatic processing. [24] In relation to reducing the FAE, monetary incentives [43] and informing participants that they will be held accountable for their attributions [44] have been linked to an increase in accurate attributions. Training has also been shown to reduce cognitive bias. Carey K. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games that taught mitigating strategies, exhibited significant reductions in their commission of six cognitive biases immediately and up to three months later. [45]

Cognitive bias modification refers to the process of modifying cognitive biases in healthy people; it also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression, and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies based on modifying cognitive processes, with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). CBMT is an area of evidence-based psychological therapy in which cognitive processes are modified to relieve suffering [46] [47] from serious depression, [48] anxiety, [49] and addiction. [50] CBMT techniques are technology-assisted therapies delivered via a computer, with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety, [51] cognitive neuroscience, [52] and attentional models. [53]

Cognitive bias modification has also been used to help those with obsessive-compulsive beliefs and obsessive-compulsive disorder, [54] [55] and has been shown to decrease those beliefs and behaviors.

Common theoretical causes of some cognitive biases

Bias arises from various processes that are sometimes difficult to distinguish, including information-processing shortcuts (heuristics), motivated reasoning, and limits on the mind's capacity to process information.

Individual differences in cognitive biases

The relation between cognitive bias, habit, and social convention is still an important open issue.

People do appear to have stable individual differences in their susceptibility to decision biases such as overconfidence, temporal discounting, and the bias blind spot. [64] That said, these stable levels of bias within individuals can change. Participants in experiments who watched training videos and played debiasing games showed medium to large reductions, both immediately and up to three months later, in their susceptibility to six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness. [65]

Individual differences in cognitive bias have also been linked to varying levels of cognitive abilities and functions. [66] The Cognitive Reflection Test (CRT) has been used to probe the connection between cognitive biases and cognitive ability. Results have been inconclusive, but there does seem to be a correlation: those who score higher on the CRT tend to have higher cognitive ability and rational-thinking skills, which in turn predicts performance on heuristics-and-biases tasks, where higher CRT scorers answer more items correctly. [67]

Age is another individual difference that affects susceptibility to cognitive bias. Older individuals tend to be more susceptible to cognitive biases and to have less cognitive flexibility, although in one set of experiments they were able to decrease their susceptibility over ongoing trials. [68] These experiments had both young and older adults complete a framing task; younger adults showed more cognitive flexibility than older adults, and cognitive flexibility is linked to overcoming pre-existing biases.

Criticism

Critics argue that cognitive bias theory loses sight of any distinction between reason and bias: if every bias can be seen as a reason, and every reason can be seen as a bias, then the distinction collapses. [69]

Criticism of theories of cognitive biases is usually founded on the fact that both sides of a debate often claim the other's thoughts to be subject to human nature and the result of cognitive bias, while claiming their own point of view to be above cognitive bias and the correct way to "overcome" the issue. This rift ties to a more fundamental issue stemming from a lack of consensus in the field, creating arguments that can be used non-falsifiably to validate any contradictory viewpoint. [citation needed]

Gerd Gigerenzer is one of the main critics of the heuristics-and-biases program. [70] [71] [72] Gigerenzer holds that many so-called cognitive biases are not biases but rules of thumb, or as he puts it "gut feelings", that can actually help us make accurate decisions in our lives. His view casts cognitive biases in a much more positive light than that of the many researchers who treat cognitive biases and heuristics as irrational ways of making decisions and judgments.

Related Research Articles

A heuristic, or heuristic technique, is any approach to problem solving that employs a practical method that is not fully optimized, perfected, or rationalized, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.

The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method, or decision. This heuristic, operating on the notion that, if something can be recalled, it must be important, or at least more important than alternative solutions not as readily recalled, is inherently biased toward recently acquired information.

The representativeness heuristic is used when making judgments about the probability of an event by how closely it resembles a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s, characterized as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because the person's appearance and behavior are more representative of the stereotype of a poet.

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.

The anchoring effect is a psychological phenomenon in which an individual's judgements or decisions are influenced by a reference point or "anchor", which can be completely irrelevant. Both numeric and non-numeric anchoring have been reported in research. In numeric anchoring, once the value of the anchor is set, subsequent arguments, estimates, etc. made by an individual may change from what they would have otherwise been without the anchor. For example, an individual may be more likely to purchase a car if it is placed alongside a more expensive model. Prices discussed in negotiations that are lower than the anchor may seem reasonable, perhaps even cheap, to the buyer, even if said prices are still relatively higher than the car's actual market value. Another example: when estimating the orbit of Mars, one might start from the Earth's orbit and adjust upward until reaching a value that seems reasonable.

<span class="mw-page-title-main">Thomas Gilovich</span> American psychologist (born 1954)

Thomas Dashiff Gilovich is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross, and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness to people's most common regrets to perceptions of people and social groups. Gilovich is a fellow of the Committee for Skeptical Inquiry.

<span class="mw-page-title-main">Gerd Gigerenzer</span> German cognitive psychologist

Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making. Gigerenzer is director emeritus of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development and director of the Harding Center for Risk Literacy, both in Berlin.

Daniel G. Goldstein is an American cognitive psychologist known for the specification and testing of heuristics and models of bounded rationality in the field of judgment and decision making. He is an honorary research fellow at London Business School and works with Microsoft Research as a principal researcher.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
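To make "miscalibration" concrete, here is a minimal sketch in Python; the (confidence, correct) pairs are invented for illustration, and the script measures only the first sense above, overestimation, as mean stated confidence minus actual accuracy.

```python
# Toy calibration check: overestimation = mean confidence - accuracy.
# The (confidence, correct) pairs below are invented for illustration only.
answers = [
    (0.90, True), (0.90, False), (0.80, True), (0.80, False),
    (0.95, True), (0.85, False), (0.90, True), (0.80, False),
]

mean_confidence = sum(conf for conf, _ in answers) / len(answers)
accuracy = sum(correct for _, correct in answers) / len(answers)

print(f"mean confidence: {mean_confidence:.2f}")              # 0.86
print(f"accuracy:        {accuracy:.2f}")                     # 0.50
print(f"overestimation:  {mean_confidence - accuracy:+.2f}")  # +0.36 -> overconfident
```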

In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Heuristics are processes by which humans use mental shortcuts to arrive at decisions: simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Social heuristics are simple decision making strategies that guide people's behavior and decisions in the social environment when time, information, or cognitive resources are scarce. Social environments tend to be characterised by complexity and uncertainty, and in order to simplify the decision-making process, people may use heuristics, which are decision making strategies that involve ignoring some information or relying on simple rules of thumb.

Ecological rationality is a particular account of practical rationality, which in turn specifies the norms of rational action – what one ought to do in order to act rationally. The presently dominant account of practical rationality in the social and behavioral sciences such as economics and psychology, rational choice theory, maintains that practical rationality consists in making decisions in accordance with some fixed rules, irrespective of context. Ecological rationality, in contrast, claims that the rationality of a decision depends on the circumstances in which it takes place, so as to achieve one's goals in this particular context. What is considered rational under the rational choice account thus might not always be considered rational under the ecological rationality account. Overall, rational choice theory puts a premium on internal logical consistency whereas ecological rationality targets external performance in the world. The term ecologically rational is only etymologically similar to the biological science of ecology.

Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Biased judgment and decision making is that which systematically deviates from the prescriptions of objective standards such as facts, logic, and rational behavior or prescriptive norms. Biased judgment and decision making exists in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices, and some parents continue to believe that vaccinations cause autism despite knowing that this link is based on falsified evidence. At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower childhood delinquency rates, and superior planning and problem solving abilities.

Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

<span class="mw-page-title-main">Ralph Hertwig</span> German psychologist

Ralph Hertwig is a German psychologist whose work focuses on the psychology of human judgment and decision making. Hertwig is Director of the Center for Adaptive Rationality at the Max Planck Institute for Human Development in Berlin, Germany. He grew up with his brothers Steffen Hertwig and Michael Hertwig in Talheim, Heilbronn.

References

  1. Haselton MG, Nettle D, Andrews PW (2005). "The evolution of cognitive bias". In Buss DM (ed.). The Handbook of Evolutionary Psychology. Hoboken, NJ: John Wiley & Sons. pp. 724–746.
  2. Kahneman D, Tversky A (1972). "Subjective probability: A judgment of representativeness" (PDF). Cognitive Psychology. 3 (3): 430–454. doi:10.1016/0010-0285(72)90016-3. Archived from the original (PDF) on 2019-12-14. Retrieved 2017-04-01.
  3. Baron J (2007). Thinking and Deciding (4th ed.). New York, NY: Cambridge University Press.
  4. Ariely D (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, NY: HarperCollins. ISBN 978-0-06-135323-9.
  5. For instance: Gigerenzer G, Goldstein DG (October 1996). "Reasoning the fast and frugal way: models of bounded rationality" (PDF). Psychological Review. 103 (4): 650–69. CiteSeerX   10.1.1.174.4404 . doi:10.1037/0033-295X.103.4.650. hdl:21.11116/0000-0000-B771-2. PMID   8888650.
  6. Tversky A, Kahneman D (September 1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457. S2CID 143452957.
  7. Bless H, Fiedler K, Strack F (2004). Social cognition: How individuals construct social reality. Hove and New York: Psychology Press.
  8. Morewedge CK, Kahneman D (October 2010). "Associative processes in intuitive judgment". Trends in Cognitive Sciences. 14 (10): 435–40. doi:10.1016/j.tics.2010.07.004. PMC   5378157 . PMID   20696611.
  9. Kahneman D, Tversky A (July 1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–91, discussion 592–6. CiteSeerX   10.1.1.174.5117 . doi:10.1037/0033-295X.103.3.582. PMID   8759048.
  10. Zhang SX, Cueto J (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212. S2CID 146617323.
  11. Kahneman D, Frederick S (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Gilovich T, Griffin DW, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 51–52. ISBN   978-0-521-79679-8.
  12. Baumeister RF, Bushman BJ (2010). Social psychology and human nature: International Edition. Belmont, US: Wadsworth. p. 141.
  13. Tversky A, Kahneman D (1983). "Extensional versus intuitive reasoning: The conjunction fallacy in probability judgement" (PDF). Psychological Review. 90 (4): 293–315. doi:10.1037/0033-295X.90.4.293. Archived (PDF) from the original on 2007-09-28.
  14. Gigerenzer G (2006). "Bounded and Rational". In Stainton RJ (ed.). Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN   978-1-4051-1304-5.
  15. "3.2.4". ISO/IEC TR 24027:2021 Information technology — Artificial intelligence (AI) — Bias in AI systems and AI aided decision making. ISO. 2021. Retrieved 21 June 2023.
  16. "3.8". ISO/IEC TR 24368:2022 Information technology — Artificial intelligence — Overview of ethical and societal concerns. ISO. 2022. Retrieved 21 June 2023.
  17. Schacter DL (March 1999). "The seven sins of memory. Insights from psychology and cognitive neuroscience". The American Psychologist. 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID   10199218. S2CID   14882268.
  18. Kunda Z (November 1990). "The case for motivated reasoning" (PDF). Psychological Bulletin. 108 (3): 480–98. doi:10.1037/0033-2909.108.3.480. PMID   2270237. S2CID   9703661. Archived from the original (PDF) on 2017-07-06. Retrieved 2017-10-27.
  19. Hoorens V (1993). "Self-enhancement and Superiority Biases in Social Comparison". In Stroebe W, Hewstone M (eds.). European Review of Social Psychology 4. Wiley.
  20. Jensen AR, Rohwer WD (1966). "The Stroop color-word test: a review". Acta Psychologica. 25 (1): 36–93. doi:10.1016/0001-6918(66)90004-7. PMID   5328883.
  21. MacLeod CM (March 1991). "Half a century of research on the Stroop effect: an integrative review". Psychological Bulletin. 109 (2): 163–203. CiteSeerX   10.1.1.475.2563 . doi:10.1037/0033-2909.109.2.163. hdl:11858/00-001M-0000-002C-5646-A. PMID   2034749.
  22. Frederick S (2005). "Cognitive Reflection and Decision Making". Journal of Economic Perspectives. 19 (4): 25–42. doi: 10.1257/089533005775196732 . ISSN   0895-3309.
  23. Oechssler J, Roider A, Schmitz PW (2009). "Cognitive abilities and behavioral biases" (PDF). Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018. ISSN   0167-2681. Archived (PDF) from the original on 2016-08-03.
  24. Baumeister RF, Bushman BJ (2010). Social psychology and human nature: International Edition. Belmont, US: Wadsworth.
  25. Jones EE, Harris VA (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3: 1–24. doi:10.1016/0022-1031(67)90034-0.
  26. Mahoney MJ (1977). "Publication prejudices: An experimental study of confirmatory bias in the peer review system". Cognitive Therapy and Research. 1 (2): 161–175. doi:10.1007/bf01173636. S2CID   7350256.
  27. Jermias J (2001). "Cognitive dissonance and resistance to change: The influence of commitment confirmation and feedback on judgement usefulness of accounting systems". Accounting, Organizations and Society. 26 (2): 141–160. doi:10.1016/s0361-3682(00)00008-8.
  28. Thakrar, Monica. "Council Post: Unconscious Bias And Three Ways To Overcome It". Forbes.
  29. Cho, I. et al. (2018) 'The Anchoring Effect in Decision-Making with Visual Analytics', 2017 IEEE Conference on Visual Analytics Science and Technology, VAST 2017 - Proceedings. IEEE, pp. 116–126. doi : 10.1109/VAST.2017.8585665.
  30. Kahneman, D., Knetsch, J. L. and Thaler, R. H. (1991) Anomalies The Endowment Effect, Loss Aversion, and Status Quo Bias, Journal of Economic Perspectives.
  31. Dean, M. (2008) 'Status quo bias in large and small choice sets', New York, p. 52. Available at: http://www.yorkshire-exile.co.uk/Dean_SQ.pdf Archived 2010-12-25 at the Wayback Machine .
  32. Gimpel, Henner (2008), Gimpel, Henner; Jennings, Nicholas R.; Kersten, Gregory E.; Ockenfels, Axel (eds.), "Cognitive Biases in Negotiation Processes", Negotiation, Auctions, and Market Engineering, Lecture Notes in Business Information Processing, Berlin, Heidelberg: Springer Berlin Heidelberg, vol. 2, pp. 213–226, doi:10.1007/978-3-540-77554-6_16, ISBN   978-3-540-77553-9 , retrieved 2020-11-25
  33. Lorenz, Kate. (2005). "Do Pretty People Earn More?" http://www.CNN.com.
  34. Sutherland S (2007). Irrationality: The Enemy Within (Second ed.). Pinter & Martin. ISBN   978-1-905177-07-3.
  35. Kakoschke N, Kemps E, Tiggemann M (April 2015). "Combined effects of cognitive bias for food cues and poor inhibitory control on unhealthy food intake". Appetite. 87: 358–64. doi:10.1016/j.appet.2015.01.004. hdl: 2328/35717 . PMID   25592403. S2CID   31561602.
  36. Williamson DA, Muller SL, Reas DL, Thaw JM (October 1999). "Cognitive bias in eating disorders: implications for theory and treatment". Behavior Modification. 23 (4): 556–77. doi:10.1177/0145445599234003. PMID   10533440. S2CID   36189809.
  37. Williamson DA (1996). "Body image disturbance in eating disorders: A form of cognitive bias?". Eating Disorders. 4 (1): 47–58. doi:10.1080/10640269608250075. ISSN   1064-0266.
  38. Trout J (2005). "Paternalism and Cognitive Bias". Law and Philosophy. 24 (4): 393–434. doi:10.1007/s10982-004-8197-3. ISSN   0167-5249. S2CID   143783638.
  39. Levy DS, Frethey-Bentham C (2010). "The effect of context and the level of decision maker training on the perception of a property's probable sale price". Journal of Property Research. 27 (3): 247–267. doi:10.1080/09599916.2010.518406. ISSN   0959-9916. S2CID   154866472.
  40. Boroomand, Amin; Smaldino, Paul E. (2023). "Superiority bias and communication noise can enhance collective problem-solving". Journal of Artificial Societies and Social Simulation. 26 (3). doi:10.18564/jasss.5154.
  41. Buckingham M, Goodall A. "The Feedback Fallacy". Harvard Business Review . No. March–April 2019.
  42. Gigerenzer G (1996). "On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996)". Psychological Review. 103 (3): 592–596. CiteSeerX   10.1.1.314.996 . doi:10.1037/0033-295x.103.3.592.
  43. Vonk R (1999). "Effects of outcome dependency on correspondence bias". Personality and Social Psychology Bulletin. 25 (3): 382–389. doi:10.1177/0146167299025003009. S2CID   145752877.
  44. Tetlock PE (1985). "Accountability: A social check on the fundamental attribution error". Social Psychology Quarterly. 48 (3): 227–236. doi:10.2307/3033683. JSTOR   3033683.
  45. Morewedge CK, Yoon H, Scopelliti I, Symborski CW, Korris JH, Kassam KS (2015-08-13). "Debiasing Decisions Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2: 129–140. doi:10.1177/2372732215600886. ISSN   2372-7322. S2CID   4848978.
  46. MacLeod C, Mathews A, Tata P (February 1986). "Attentional bias in emotional disorders". Journal of Abnormal Psychology. 95 (1): 15–20. doi:10.1037/0021-843x.95.1.15. PMID   3700842.
  47. Bar-Haim Y, Lamy D, Pergamin L, Bakermans-Kranenburg MJ, van IJzendoorn MH (January 2007). "Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study". Psychological Bulletin. 133 (1): 1–24. CiteSeerX   10.1.1.324.4312 . doi:10.1037/0033-2909.133.1.1. PMID   17201568. S2CID   2861872.
  48. Holmes EA, Lang TJ, Shah DM (February 2009). "Developing interpretation bias modification as a "cognitive vaccine" for depressed mood: imagining positive events makes you feel better than thinking about them verbally". Journal of Abnormal Psychology. 118 (1): 76–88. doi:10.1037/a0012590. PMID   19222316.
  49. Hakamata Y, Lissek S, Bar-Haim Y, Britton JC, Fox NA, Leibenluft E, et al. (December 2010). "Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety". Biological Psychiatry. 68 (11): 982–90. doi:10.1016/j.biopsych.2010.07.021. PMC   3296778 . PMID   20887977.
  50. Eberl C, Wiers RW, Pawelczack S, Rinck M, Becker ES, Lindenmeyer J (April 2013). "Approach bias modification in alcohol dependence: do clinical effects replicate and for whom does it work best?". Developmental Cognitive Neuroscience. 4: 38–51. doi:10.1016/j.dcn.2012.11.002. PMC   6987692 . PMID   23218805.
  51. Clark DA, Beck AT (2009). Cognitive Therapy of Anxiety Disorders: Science and Practice. London: Guildford.
  52. Browning M, Holmes EA, Murphy SE, Goodwin GM, Harmer CJ (May 2010). "Lateral prefrontal cortex mediates the cognitive modification of attentional bias". Biological Psychiatry. 67 (10): 919–25. doi:10.1016/j.biopsych.2009.10.031. PMC   2866253 . PMID   20034617.
  53. Eysenck MW, Derakshan N, Santos R, Calvo MG (May 2007). "Anxiety and cognitive performance: attentional control theory". Emotion. 7 (2): 336–53. CiteSeerX   10.1.1.453.3592 . doi:10.1037/1528-3542.7.2.336. PMID   17516812. S2CID   33462708.
  54. Beadel JR, Smyth FL, Teachman BA (2014). "Change Processes During Cognitive Bias Modification for Obsessive Compulsive Beliefs". Cognitive Therapy and Research. 38 (2): 103–119. doi:10.1007/s10608-013-9576-6. ISSN   0147-5916. S2CID   32259433.
  55. Williams AD, Grisham JR (October 2013). "Cognitive Bias Modification (CBM) of obsessive compulsive beliefs". BMC Psychiatry. 13 (1): 256. doi: 10.1186/1471-244X-13-256 . PMC   3851748 . PMID   24106918.
  56. Van Eyghen H (2022). "Cognitive Bias. Philogenesis or Ontogenesis". Frontiers in Psychology. 13. doi: 10.3389/fpsyg.2022.892829 . PMC   9364952 . PMID   35967732.
  57. Kahneman D, Frederick S (2002). "Representativeness revisited: Attribute substitution in intuitive judgment". In Gilovich T, Griffin DW, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN   978-0-521-79679-8. OCLC   47364085.
  58. Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases (1st ed.). Cambridge University Press.
  59. Slovic P, Finucane M, Peters E, MacGregor DG (2002). "The Affect Heuristic". In Gilovich T, Griffin D, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press. pp. 397–420. ISBN   978-0-521-79679-8.
  60. Pfister HR, Böhm G (2008). "The multiplicity of emotions: A framework of emotional functions in decision making". Judgment and Decision Making. 3: 5–17. doi: 10.1017/S1930297500000127 .
  61. Wang X, Simons F, Brédart S (2001). "Social cues and verbal framing in risky choice". Journal of Behavioral Decision Making. 14 (1): 1–15. doi:10.1002/1099-0771(200101)14:1<1::AID-BDM361>3.0.CO;2-N.
  62. Simon HA (1955). "A behavioral model of rational choice". The Quarterly Journal of Economics. 69 (1): 99–118. doi:10.2307/1884852. JSTOR   1884852.
  63. Hilbert M (March 2012). "Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making" (PDF). Psychological Bulletin. 138 (2): 211–37. CiteSeerX 10.1.1.432.8763. doi:10.1037/a0025940. PMID 22122235.
  64. Scopelliti I, Morewedge CK, McCormick E, Min HL, Lebrecht S, Kassam KS (2015-04-24). "Bias Blind Spot: Structure, Measurement, and Consequences". Management Science. 61 (10): 2468–2486. doi: 10.1287/mnsc.2014.2096 .
  65. Morewedge CK, Yoon H, Scopelliti I, Symborski CW, Korris JH, Kassam KS (2015-10-01). "Debiasing Decisions Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2 (1): 129–140. doi:10.1177/2372732215600886. ISSN   2372-7322. S2CID   4848978.
  66. Vartanian O, Beatty EL, Smith I, Blackler K, Lam Q, Forbes S, De Neys W (July 2018). "The Reflective Mind: Examining Individual Differences in Susceptibility to Base Rate Neglect with fMRI". Journal of Cognitive Neuroscience. 30 (7): 1011–1022. doi: 10.1162/jocn_a_01264 . PMID   29668391. S2CID   4933030.
  67. Toplak ME, West RF, Stanovich KE (October 2011). "The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks". Memory & Cognition. 39 (7): 1275–89. doi: 10.3758/s13421-011-0104-1 . PMID   21541821.
  68. Wilson CG, Nusbaum AT, Whitney P, Hinson JM (August 2018). "Age-differences in cognitive flexibility when overcoming a preexisting bias through feedback". Journal of Clinical and Experimental Neuropsychology. 40 (6): 586–594. doi:10.1080/13803395.2017.1398311. PMID   29161963. S2CID   13372385.
  69. "Kahneman's Fallacies, "Thinking, Fast & Slow"". Wenglinsky Review. 2017-01-23. Retrieved 2023-11-19.
  70. Clavien C (2010). "Gerd Gigerenzer, Gut Feelings: Short Cuts to Better Decision Making: Penguin Books, 2008 (1st ed. 2007), £ 8.99 (paperback), ISBN-13: 978-0141015910". Ethical Theory and Moral Practice. 13 (1): 113–115. doi:10.1007/s10677-009-9172-8. ISSN   1386-2820. S2CID   8097667.
  71. Gigerenzer G (2000). Adaptive thinking : rationality in the real world. Oxford: Oxford Univ. Press. ISBN   978-0-19-803117-8. OCLC   352897263.
  72. Gigerenzer G (1999). Simple heuristics that make us smart. Todd, Peter M., ABC Research Group. New York: Oxford University Press. ISBN   0-585-35863-X. OCLC   47009468.