List of cognitive biases

Cognitive biases are systematic patterns of deviation from norm or rationality in judgment. They are often studied in psychology, sociology and behavioral economics. [1]

Although the reality of most of these biases is confirmed by reproducible research, [2] [3] there are often controversies about how to classify these biases or how to explain them. [4] Several theoretical causes are known for some cognitive biases, which provides a classification of biases by their common generative mechanism (such as noisy information-processing [5] ). Gerd Gigerenzer has criticized the framing of cognitive biases as errors in judgment, and favors interpreting them as arising from rational deviations from logical thought. [6]

Explanations include information-processing rules (i.e., mental shortcuts), called heuristics , that the brain uses to produce decisions or judgments. Biases have a variety of forms and appear as cognitive ("cold") bias, such as mental noise, [5] or motivational ("hot") bias, such as when beliefs are distorted by wishful thinking. Both effects can be present at the same time. [7] [8]

There are also controversies over some of these biases as to whether they count as useless or irrational, or whether they result in useful attitudes or behavior. For example, when getting to know others, people tend to ask leading questions which seem biased towards confirming their assumptions about the person. However, this kind of confirmation bias has also been argued to be an example of social skill; a way to establish a connection with the other person. [9]

Although this research overwhelmingly involves human subjects, some studies have found bias in non-human animals as well. For example, loss aversion has been shown in monkeys and hyperbolic discounting has been observed in rats, pigeons, and monkeys. [10]
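To make the discounting idea concrete, the sketch below is a minimal illustration and is not drawn from the cited animal studies: hyperbolic discounting is commonly modeled as V = A / (1 + kD), which falls steeply at short delays and flattens at long ones, whereas exponential discounting falls at a constant rate. The reward amount, delay values, and the parameters k and delta are arbitrary assumptions chosen only for illustration.

```python
# Minimal sketch comparing hyperbolic and exponential discounting.
# The functional forms are standard textbook models; all parameter
# values below are arbitrary assumptions for illustration only.

def hyperbolic_value(amount: float, delay: float, k: float = 0.5) -> float:
    """Present value of a delayed reward under hyperbolic discounting."""
    return amount / (1 + k * delay)

def exponential_value(amount: float, delay: float, delta: float = 0.8) -> float:
    """Present value of a delayed reward under exponential discounting."""
    return amount * delta ** delay

if __name__ == "__main__":
    for delay in (0, 1, 5, 20):
        print(f"delay={delay:>2}  "
              f"hyperbolic={hyperbolic_value(100, delay):6.2f}  "
              f"exponential={exponential_value(100, delay):6.2f}")
```

With these arbitrary parameters the hyperbolic value drops steeply over the first few time steps and then flattens, the qualitative pattern that produces preference reversals as a delayed reward draws near.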

Belief, decision-making and behavioral

These biases affect belief formation, reasoning processes, business and economic decisions, and human behavior in general.

Anchoring bias

The anchoring bias, or focalism, is the tendency to rely too heavily—to "anchor"—on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject). [11] [12] Anchoring bias includes or involves the following:

Apophenia

The tendency to perceive meaningful connections between unrelated things. [17] The following are types of apophenia:

Availability heuristic

The availability heuristic (also known as the availability bias) is the tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be. [20] The availability heuristic includes or involves the following:

Cognitive dissonance

Cognitive dissonance is the mental discomfort produced by perceiving contradictory information.

Confirmation bias

Confirmation bias is the tendency to search for, interpret, focus on and remember information in a way that confirms one's preconceptions. [31] There are multiple other cognitive biases which involve or are types of confirmation bias:

Egocentric bias

Egocentric bias is the tendency to rely too heavily on one's own perspective and/or have a different perception of oneself relative to others. [34] The following are forms of egocentric bias:

Extension neglect

Extension neglect occurs when the size of a sample is not sufficiently taken into consideration when assessing an outcome, its relevance, or a judgement. The following are forms of extension neglect:

False priors

False priors are initial beliefs and knowledge which interfere with the unbiased evaluation of factual evidence and lead to incorrect conclusions. Biases based on false priors include:

Framing effect

The framing effect is the tendency to draw different conclusions from the same information, depending on how that information is presented. Forms of the framing effect include:

Logical fallacy

Prospect theory

The following relate to prospect theory:

Self-assessment

Truth judgment

Other

Name Description
Action bias The tendency for someone to act when faced with a problem even when inaction would be more effective, or to act when no evident problem exists. [88] [89]
Additive bias The tendency to solve problems through addition, even when subtraction is a better approach. [90] [91]
Attribute substitution Occurs when a judgment has to be made (of a target attribute) that is computationally complex, and instead a more easily calculated heuristic attribute is substituted. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system.
Curse of knowledge When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people. [92]
Declinism The predisposition to view the past favorably (rosy retrospection) and the future negatively. [93]
End-of-history illusion The age-independent belief that one will change less in the future than one has in the past. [94]
Exaggerated expectation The tendency to expect or predict more extreme outcomes than those that actually happen. [5]
Form function attribution bias In human–robot interaction, the tendency of people to make systematic errors when interacting with a robot. People may base their expectations and perceptions of a robot on its appearance (form) and attribute functions which do not necessarily mirror the true functions of the robot. [95]
Fundamental pain bias The tendency for people to believe they accurately report their own pain levels while holding the paradoxical belief that others exaggerate it. [96]
Hedonic recall bias The tendency for people who are satisfied with their wage to overestimate how much they earn and, conversely, for people who are dissatisfied with their wage to underestimate it. [97]
Hindsight bias Sometimes called the "I-knew-it-all-along" effect, or the "Hindsight is 20/20" effect, is the tendency to see past events as having been predictable [98] before they happened.
Impact bias The tendency to overestimate the length or the intensity of the impact of future feeling states. [46]
Information bias The tendency to seek information even when it cannot affect action. [99]
Interoceptive bias or Hungry judge effect The tendency for sensory input about the body itself to affect one's judgement about external, unrelated circumstances (for example, parole judges being more lenient when fed and rested). [100] [101] [102] [103]
Money illusion The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power. [104]
Moral credential effect Occurs when someone who does something good gives themselves permission to be less good in the future.
Non-adaptive choice switching After experiencing a bad outcome with a decision problem, the tendency to avoid the choice previously made when faced with the same decision problem again, even though the choice was optimal. Also known as "once bitten, twice shy" or "hot stove effect". [105]
Mere exposure effect or familiarity principle (in social psychology) The tendency to express undue liking for things merely because of familiarity with them. [106]
Omission bias The tendency to judge harmful actions (commissions) as worse, or less moral, than equally harmful inactions (omissions). [107]
Optimism bias The tendency to be over-optimistic, underestimating greatly the probability of undesirable outcomes and overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias, and compare pessimism bias). [108] [109]
Ostrich effect Ignoring an obvious negative situation.
Outcome bias The tendency to judge a decision by its eventual outcome instead of the quality of the decision at the time it was made.
Pessimism bias The tendency for some people, especially those with depression, to overestimate the likelihood of negative things happening to them. (compare optimism bias)
Present bias The tendency of people to give stronger weight to payoffs that are closer to the present time when considering trade-offs between two future moments. [110]
Plant blindness The tendency to ignore plants in one's environment and a failure to recognize and appreciate the utility of plants to life on Earth. [111]
Prevention bias When investing money to protect against risks, decision makers perceive that a dollar spent on prevention buys more security than a dollar spent on timely detection and response, even when investing in either option is equally effective. [112]
Probability matching Sub-optimal matching of the probability of choices with the probability of reward in a stochastic context (see the sketch after this table).
Pro-innovation bias The tendency to have excessive optimism about an invention's or innovation's usefulness throughout society, while often failing to identify its limitations and weaknesses.
Projection bias The tendency to overestimate how much one's future selves will share one's current preferences, thoughts and values, thus leading to sub-optimal choices. [113] [114] [115]
Proportionality bias The innate tendency to assume that big events have big causes; it may also explain the tendency to accept conspiracy theories. [116] [117]
Recency illusion The illusion that a phenomenon one has noticed only recently is itself recent. Often used to refer to linguistic phenomena; the illusion that a word or language usage that one has noticed only recently is an innovation when it is, in fact, long-established (see also frequency illusion). The related recency bias is a memory bias that favors recent events over historic ones, giving "greater importance to the most recent event", [118] such as the final lawyer's closing argument a jury hears before being dismissed to deliberate.
Systematic bias Judgement that arises when targets of differentiating judgement become subject to effects of regression that are not equivalent. [119]
Risk compensation or Peltzman effect The tendency to take greater risks when perceived safety increases.
Surrogation Losing sight of the strategic construct that a measure is intended to represent, and subsequently acting as though the measure is the construct of interest.
Teleological bias The tendency to engage in overgeneralized ascriptions of purpose to entities and events that did not arise from goal-directed action, design, or selection based on functional effects. [120] [121]
Turkey illusion The absence of any expectation of sudden breaks in continuous developments.
Unconscious bias or implicit bias The underlying attitudes and stereotypes that people unconsciously attribute to another person or group of people that affect how they understand and engage with them. Many researchers suggest that unconscious bias occurs automatically as the brain makes quick judgments based on past experiences and background. [122]
Unit bias The tendency to perceive the standard suggested amount of consumption (e.g., a food serving size) as appropriate, such that a person will consume it all even if it is too much for that particular person. [123]
Value selection bias The tendency to rely on existing numerical data when reasoning in an unfamiliar context, even if calculation or numerical manipulation is required. [124] [125]
Weber–Fechner law Difficulty in comparing small differences in large quantities.
Women are wonderful effect A tendency to associate more positive attributes with women than with men.
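The sub-optimality of probability matching (listed above) can be shown with a short, self-contained calculation. The sketch below is illustrative only: the reward probability of 0.7 and the two-option setup are arbitrary assumptions, and expected_hit_rate is a hypothetical helper, not an established API.

```python
# Minimal sketch: probability matching versus maximizing when option A
# is rewarded with probability 0.7 and option B otherwise.
# The 0.7 figure is an arbitrary assumption for illustration.

def expected_hit_rate(p_reward: float, p_choose_best: float) -> float:
    """Expected accuracy when the better option is chosen with probability p_choose_best."""
    return p_choose_best * p_reward + (1 - p_choose_best) * (1 - p_reward)

p = 0.7
print(f"maximizing (always pick A):          {expected_hit_rate(p, 1.0):.2f}")  # 0.70
print(f"matching (pick A 70% of the time):   {expected_hit_rate(p, p):.2f}")    # 0.58
```

Always choosing the better option yields an expected hit rate equal to its reward probability (0.70 here), whereas matching one's choice frequency to the reward frequency yields only 0.58, which is the sense in which probability matching is sub-optimal.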

Social

Association fallacy

Association fallacies include:

  • Authority bias, the tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion. [126]
  • Cheerleader effect, the tendency for people to appear more attractive in a group than in isolation. [127]
  • Halo effect, the tendency for a person's positive or negative traits to "spill over" from one personality area to another in others' perceptions of them (see also physical attractiveness stereotype). [128]

Attribution bias

Attribution bias includes:

  • Actor-observer bias, the tendency for explanations of other individuals' behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also Fundamental attribution error), and for explanations of one's own behaviors to do the opposite (that is, to overemphasize the influence of our situation and underemphasize the influence of our own personality).
  • Defensive attribution hypothesis, a tendency to attribute more blame to a harm-doer as the outcome becomes more severe or as personal or situational similarity to the victim increases.
  • Extrinsic incentives bias, an exception to the fundamental attribution error, whereby people view others as having (situational) extrinsic motivations while viewing themselves as having (dispositional) intrinsic motivations.
  • Fundamental attribution error, the tendency for people to overemphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior [115] (see also actor-observer bias, group attribution error, positivity effect, and negativity effect). [129]
  • Group attribution error, the biased belief that the characteristics of an individual group member are reflective of the group as a whole or the tendency to assume that group decision outcomes reflect the preferences of group members, even when information is available that clearly suggests otherwise.
  • Hostile attribution bias, the tendency to interpret others' behaviors as having hostile intent, even when the behavior is ambiguous or benign. [130]
  • Intentionality bias, the tendency to judge human action to be intentional rather than accidental. [131]
  • Just-world hypothesis, the tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).
  • Moral luck, the tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.
  • Puritanical bias, the tendency to attribute the cause of an undesirable outcome or wrongdoing by an individual to a moral deficiency or lack of self-control rather than taking into account the impact of broader societal determinants. [132]
  • Self-serving bias, the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias). [133]
  • Ultimate attribution error, similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.

Conformity

Conformity is involved in the following:

  • Availability cascade, a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true"). [134] See also availability heuristic.
  • Bandwagon effect, the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior. [135]
  • Courtesy bias, the tendency to give an opinion that is more socially correct than one's true opinion, so as to avoid offending anyone. [136]
  • Groupthink, the psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.
  • Groupshift, the tendency for group decisions to be more risk-seeking or more risk-averse than the members' individual inclinations, when the group is already leaning in that direction.
  • Social desirability bias, the tendency to over-report socially desirable characteristics or behaviours in oneself and under-report socially undesirable characteristics or behaviours. [137] See also: § Courtesy bias.
  • Truth bias, people's inclination to believe, to some degree, the communication of another person, regardless of whether that person is actually being untruthful. [138] [139]

Ingroup bias

Ingroup bias is the tendency for people to give preferential treatment to others they perceive to be members of their own groups. It is related to the following:

  • Not invented here, an aversion to contact with or use of products, research, standards, or knowledge developed outside a group.
  • Outgroup homogeneity bias, where individuals see members of other groups as being relatively less varied than members of their own group. [140]

Other social biases

Name Description
Assumed similarity bias Where an individual assumes that others have more traits in common with them than those others actually do. [141]
Outgroup favoritism When members of some socially disadvantaged groups express favorable attitudes (and even preferences) toward social, cultural, or ethnic groups other than their own. [142]
Pygmalion effect The phenomenon whereby others' expectations of a target person affect the target person's performance.
Reactance The urge to do the opposite of what someone wants one to do out of a need to resist a perceived attempt to constrain one's freedom of choice (see also Reverse psychology).
Reactive devaluation Devaluing proposals only because they purportedly originated with an adversary.
Social comparison bias The tendency, when making decisions, to favour potential candidates who do not compete with one's own particular strengths. [143]
Shared information bias The tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of (i.e., unshared information). [144]
Worse-than-average effect A tendency to believe ourselves to be worse than others at tasks which are difficult. [145]

Memory

In psychology and cognitive science, a memory bias is a cognitive bias that either enhances or impairs the recall of a memory (either the chances that the memory will be recalled at all, or the amount of time it takes for it to be recalled, or both), or that alters the content of a reported memory. There are many types of memory bias, including:

Misattribution of memory

In psychology, the misattribution of memory or source misattribution is the misidentification of the origin of a memory by the person making the memory recall. Misattribution is likely to occur when individuals are unable to monitor and control the influence of their attitudes on their judgments at the time of retrieval. [146] Misattribution is divided into three components: cryptomnesia, false memories, and source confusion. It was originally noted as one of Daniel Schacter's seven sins of memory. [147]

The misattributions include:

Other memory biases

Name Description
Availability bias Greater likelihood of recalling recent, nearby, or otherwise immediately available examples, and the imputation of importance to those examples over others.
Bizarreness effect Bizarre material is better remembered than common material.
Boundary extension Remembering the background of an image as being larger or more expansive than the foreground [151]
Childhood amnesia The retention of few memories from before the age of four.
Choice-supportive bias The tendency to remember one's choices as better than they actually were. [152]
Confirmation bias The tendency to search for, interpret, or recall information in a way that confirms one's beliefs or hypotheses. See also under § Confirmation bias.
Conservatism or regressive bias Tendency to remember high values and high likelihoods/probabilities/frequencies as lower than they actually were and low ones as higher than they actually were. Based on the evidence, memories are not extreme enough. [153] [154]
Consistency bias Incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour. [155]
Continued influence effect Misinformation continues to influence memory and reasoning about an event, despite the misinformation having been corrected. [156] cf. misinformation effect, where the original memory is affected by incorrect information received later.
Context effect That cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa).
Cross-race effect The tendency for people of one race to have difficulty identifying members of a race other than their own.
Egocentric bias Recalling the past in a self-serving manner, e.g., remembering one's exam grades as being better than they were, or remembering a caught fish as bigger than it really was.
Euphoric recall The tendency of people to remember past experiences in a positive light, while overlooking negative experiences associated with that event.
Fading affect bias A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events. [157]
Generation effect (self-generation effect) That self-generated information is remembered best. For instance, people are better able to recall memories of statements that they have generated than similar statements generated by others.
Gender differences in eyewitness memory The tendency for a witness to remember more details about someone of the same gender.
Google effect The tendency to forget information that can be found readily online by using Internet search engines.
Hindsight bias ("I-knew-it-all-along" effect)The inclination to see past events as having been predictable.
Humor effect That humorous items are more easily remembered than non-humorous ones, which might be explained by the distinctiveness of humor, the increased cognitive processing time to understand the humor, or the emotional arousal caused by the humor. [158]
Illusory correlation Inaccurately seeing a relationship between two events related by coincidence. [159]
Illusory truth effect (illusion-of-truth effect) People are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement. In other words, a person is more likely to believe a familiar statement than an unfamiliar one.
Lag effect The phenomenon whereby learning is greater when studying is spread out over time, as opposed to studying the same amount of time in a single session. See also spacing effect.
Leveling and sharpening Memory distortions introduced by the loss of details in a recollection over time, often concurrent with sharpening or selective recollection of certain details that take on exaggerated significance in relation to the details or aspects of the experience lost through leveling. Both biases may be reinforced over time, and by repeated recollection or re-telling of a memory. [160]
Levels-of-processing effect That different methods of encoding information into memory have different levels of effectiveness. [161]
List-length effect A smaller percentage of items are remembered in a longer list, but as the length of the list increases, the absolute number of items remembered increases as well. [162]
Memory inhibition Being shown some items from a list makes it harder to retrieve the other items (e.g., Slamecka, 1968).
Misinformation effect Memory becoming less accurate because of interference from post-event information. [163] cf. continued influence effect, where misinformation about an event, despite later being corrected, continues to influence memory about the event.
Modality effect That memory recall is higher for the last items of a list when the list items were received via speech than when they were received through writing.
Mood-congruent memory bias (state-dependent memory) The improved recall of information congruent with one's current mood.
Negativity bias or negativity effect Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories. [164] [115] (see also actor-observer bias, group attribution error, positivity effect, and negativity effect). [129]
Next-in-line effect When taking turns speaking in a group using a predetermined order (e.g. going clockwise around a room, taking numbers, etc.) people tend to have diminished recall for the words of the person who spoke immediately before them. [165]
Part-list cueing effect That being shown some items from a list and later retrieving one item causes it to become harder to retrieve the other items. [166]
Peak–end rule That people seem to perceive not the sum of an experience but the average of how it was at its peak (e.g., pleasant or unpleasant) and how it ended.
Persistence The unwanted recurrence of memories of a traumatic event.
Picture superiority effect The notion that concepts that are learned by viewing pictures are more easily and frequently recalled than are concepts that are learned by viewing their written word form counterparts. [167] [168] [169] [170] [171] [172]
Placement bias Tendency to remember ourselves to be better than others at tasks at which we rate ourselves above average (also Illusory superiority or Better-than-average effect) [77] and tendency to remember ourselves to be worse than others at tasks at which we rate ourselves below average (also Worse-than-average effect). [173]
Positivity effect (socioemotional selectivity theory) That older adults favor positive over negative information in their memories. See also euphoric recall.
Primacy effect Where an item at the beginning of a list is more easily recalled. A form of serial position effect. See also recency effect and suffix effect.
Processing difficulty effect That information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered. [174] See also levels-of-processing effect.
Recency effect A form of serial position effect where an item at the end of a list is easier to recall. This can be disrupted by the suffix effect. See also primacy effect.
Reminiscence bump The recalling of more personal events from adolescence and early adulthood than personal events from other lifetime periods. [175]
Repetition blindness Unexpected difficulty in remembering more than one instance of a visual sequence.
Rosy retrospection The remembering of the past as having been better than it really was.
Saying is believing effect Communicating a socially tuned message to an audience can lead to a bias of identifying the tuned message as one's own thoughts. [176]
Self-relevance effect That memories relating to the self are better recalled than similar information relating to others.
Serial position effect That items near the end of a sequence are the easiest to recall, followed by the items at the beginning of a sequence; items in the middle are the least likely to be remembered. [177] See also recency effect, primacy effect and suffix effect.
Spacing effect That information is better recalled if exposure to it is repeated over a long span of time rather than a short one.
Spotlight effect The tendency to overestimate the amount that other people notice one's appearance or behavior.
Stereotype bias or stereotypical bias Memory distorted towards stereotypes (e.g., racial or gender).
Suffix effect Diminishment of the recency effect because a sound item is appended to the list that the subject is not required to recall. [178] [179] A form of serial position effect. Cf. recency effect and primacy effect.
Subadditivity effect The tendency to estimate that the likelihood of a remembered event is less than the sum of its (more than two) mutually exclusive components (a hypothetical worked example appears after this table). [180]
Tachypsychia When time perceived by the individual either lengthens, making events appear to slow down, or contracts. [181]
Telescoping effect The tendency to displace recent events backwards in time and remote events forward in time, so that recent events appear more remote, and remote events, more recent.
Testing effect The fact that one more easily recalls information one has read by rewriting it instead of rereading it. [182] Frequent testing of material that has been committed to memory improves memory recall.
Tip of the tongue phenomenon When a subject is able to recall parts of an item, or related information, but is frustratingly unable to recall the whole item. This is thought to be an instance of "blocking" where multiple similar memories are being recalled and interfere with each other. [148]
Travis syndrome Overestimating the significance of the present. [183] It is related to chronological snobbery, and an appeal to novelty may form part of the bias.
Verbatim effect That the "gist" of what someone has said is better remembered than the verbatim wording. [184] This is because memories are representations, not exact copies.
von Restorff effect That an item that sticks out is more likely to be remembered than other items. [185]
Zeigarnik effect That uncompleted or interrupted tasks are remembered better than completed ones.
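As a purely hypothetical worked example of the subadditivity effect listed in the table above (the numbers are invented for illustration and are not taken from any cited study), a judged probability for a packed category can come out lower than the sum of the judged probabilities for its mutually exclusive components:

```latex
% Hypothetical numbers, invented purely to illustrate the subadditivity effect.
\[
P(\text{death from natural causes}) = 0.55
\;<\;
P(\text{heart disease}) + P(\text{cancer}) + P(\text{other natural cause})
= 0.25 + 0.25 + 0.15 = 0.65
\]
```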

See also

Footnotes

  1. Haselton MG, Nettle D, Andrews PW (2005). "The evolution of cognitive bias" (PDF). In Buss DM (ed.). The Handbook of Evolutionary Psychology. Hoboken, NJ: John Wiley & Sons Inc. pp. 724–746.
  2. "Cognitive Bias – Association for Psychological Science". www.psychologicalscience.org. Retrieved 2018-10-10.
  3. Thomas O (2018-01-19). "Two decades of cognitive bias research in entrepreneurship: What do we know and where do we go from here?". Management Review Quarterly. 68 (2): 107–143. doi:10.1007/s11301-018-0135-9. ISSN   2198-1620. S2CID   148611312.
  4. Dougherty MR, Gettys CF, Ogden EE (1999). "MINERVA-DM: A memory processes model for judgments of likelihood" (PDF). Psychological Review. 106 (1): 180–209. doi:10.1037/0033-295x.106.1.180.
  5. Hilbert M (March 2012). "Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making". Psychological Bulletin. 138 (2): 211–37. doi:10.1037/a0025940. PMID   22122235.
  6. Gigerenzer G (2006). "Bounded and Rational". In Stainton RJ (ed.). Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN   978-1-4051-1304-5.
  7. MacCoun RJ (1998). "Biases in the interpretation and use of research results" (PDF). Annual Review of Psychology . 49 (1): 259–287. doi:10.1146/annurev.psych.49.1.259. PMID   15012470.
  8. Nickerson RS (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises" (PDF). Review of General Psychology. 2 (2): 175–220 [198]. doi:10.1037/1089-2680.2.2.175. S2CID   8508954.
  9. Dardenne B, Leyens JP (1995). "Confirmation Bias as a Social Skill". Personality and Social Psychology Bulletin. 21 (11): 1229–1239. doi:10.1177/01461672952111011. S2CID   146709087.
  10. Alexander WH, Brown JW (June 2010). "Hyperbolically discounted temporal difference learning". Neural Computation. 22 (6): 1511–1527. doi:10.1162/neco.2010.08-09-1080. PMC   3005720 . PMID   20100071.
  11. Zhang Y, Lewis M, Pellon M, Coleman P (2007). A Preliminary Research on Modeling Cognitive Agents for Social Environments in Multi-Agent Systems (PDF). 2007 AAAI Fall Symposium: Emergent agents and socialities: Social and organizational aspects of intelligence. Association for the Advancement of Artificial Intelligence. pp. 116–123.
  12. Iverson GL, Brooks BL, Holdnack JA (2008). "Misdiagnosis of Cognitive Impairment in Forensic Neuropsychology". In Heilbronner RL (ed.). Neuropsychology in the Courtroom: Expert Analysis of Reports and Testimony. New York: Guilford Press. p. 248. ISBN   978-1-59385-634-2.
  13. Kim M, Daniel JL (2020-01-02). "Common Source Bias, Key Informants, and Survey-Administrative Linked Data for Nonprofit Management Research" . Public Performance & Management Review. 43 (1): 232–256. doi:10.1080/15309576.2019.1657915. ISSN   1530-9576. S2CID   203468837 . Retrieved 23 June 2021.
  14. DuCharme WW (1970). "Response bias explanation of conservative human inference". Journal of Experimental Psychology. 85 (1): 66–74. doi:10.1037/h0029546. hdl: 2060/19700009379 .
  15. Edwards W (1968). "Conservatism in human information processing". In Kleinmuntz B (ed.). Formal representation of human judgment. New York: Wiley. pp. 17–52.
  16. "The Psychology Guide: What Does Functional Fixedness Mean?". PsycholoGenie. Retrieved 2018-10-10.
  17. Carroll RT. "apophenia". The Skeptic's Dictionary. Retrieved 17 July 2017.
  18. Tversky A, Kahneman D (September 1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–1131. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID   17835457. S2CID   143452957.
  19. Fiedler K (1991). "The tricky nature of skewed frequency tables: An information loss account of distinctiveness-based illusory correlations". Journal of Personality and Social Psychology. 60 (1): 24–36. doi:10.1037/0022-3514.60.1.24.
  20. Schwarz N, Bless H, Strack F, Klumpp G, Rittenauer-Schatka H, Simons A (1991). "Ease of Retrieval as Information: Another Look at the Availability Heuristic" (PDF). Journal of Personality and Social Psychology. 61 (2): 195–202. doi:10.1037/0022-3514.61.2.195. Archived from the original (PDF) on 9 February 2014. Retrieved 19 Oct 2014.
  21. Coley JD, Tanner KD (2012). "Common origins of diverse misconceptions: cognitive principles and the development of biology thinking". CBE: Life Sciences Education. 11 (3): 209–215. doi:10.1187/cbe.12-06-0074. PMC   3433289 . PMID   22949417.
  22. "The Real Reason We Dress Pets Like People". Live Science. 3 March 2010. Retrieved 2015-11-16.
  23. Harris LT, Fiske ST (January 2011). "Dehumanized Perception: A Psychological Means to Facilitate Atrocities, Torture, and Genocide?". Zeitschrift für Psychologie. 219 (3): 175–181. doi:10.1027/2151-2604/a000065. PMC   3915417 . PMID   24511459.
  24. Bar-Haim Y, Lamy D, Pergamin L, Bakermans-Kranenburg MJ, van IJzendoorn MH (January 2007). "Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study" (PDF). Psychological Bulletin. 133 (1): 1–24. doi:10.1037/0033-2909.133.1.1. PMID   17201568. S2CID   2861872.
  25. Zwicky A (2005-08-07). "Just Between Dr. Language and I". Language Log.
  26. Bellows A (March 2006). "The Baader-Meinhof Phenomenon". Damn Interesting. Retrieved 2020-02-16.
  27. Kershner K (20 March 2015). "What's the Baader-Meinhof phenomenon?". howstuffworks.com. Retrieved 15 April 2018.
  28. "The Baader-Meinhof Phenomenon? Or: The Joy Of Juxtaposition?". twincities.com. St. Paul Pioneer Press. 23 February 2007. Retrieved October 20, 2020. As you might guess, the phenomenon is named after an incident in which I was talking to a friend about the Baader-Meinhof gang (and this was many years after they were in the news). The next day, my friend phoned me and referred me to an article in that day's newspaper in which the Baader-Meinhof gang was mentioned.
  29. Michael I. Norton, Daniel Mochon, Dan Ariely (2011). The "IKEA Effect": When Labor Leads to Love. Harvard Business School
  30. Lebowitz S (2 December 2016). "Harness the power of the 'Ben Franklin Effect' to get someone to like you". Business Insider. Retrieved 2018-10-10.
  31. Oswald ME, Grosjean S (2004). "Confirmation Bias". In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp.  79–96. ISBN   978-1-84169-351-4. OCLC   55124398 via Internet Archive.
  32. Sanna LJ, Schwarz N, Stocker SL (2002). "When debiasing backfires: Accessible content and accessibility experiences in debiasing hindsight" (PDF). Journal of Experimental Psychology: Learning, Memory, and Cognition. 28 (3): 497–502. CiteSeerX   10.1.1.387.5964 . doi:10.1037/0278-7393.28.3.497. ISSN   0278-7393. PMID   12018501.
  33. Jeng M (2006). "A selected history of expectation bias in physics". American Journal of Physics. 74 (7): 578–583. arXiv: physics/0508199 . Bibcode:2006AmJPh..74..578J. doi:10.1119/1.2186333. S2CID   119491123.
  34. Schacter DL, Gilbert DT, Wegner DM (2011). Psychology (2nd ed.). Macmillan. p. 254. ISBN   978-1-4292-3719-2.
  35. Pronin E, Kugler MB (July 2007). "Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot". Journal of Experimental Social Psychology. 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011. ISSN   0022-1031.
  36. Marks G, Miller N (1987). "Ten years of research on the false-consensus effect: An empirical and theoretical review". Psychological Bulletin. 102 (1): 72–90. doi:10.1037/0033-2909.102.1.72.
  37. "False Uniqueness Bias (Social PsychologyY) – IResearchNet". 2016-01-13.
  38. "The Barnum Demonstration". psych.fullerton.edu. Retrieved 2018-10-10.
  39. Pronin E, Kruger J, Savitsky K, Ross L (October 2001). "You don't know me, but I know you: the illusion of asymmetric insight". Journal of Personality and Social Psychology. 81 (4): 639–656. doi:10.1037/0022-3514.81.4.639. PMID   11642351.
  40. Thompson SC (1999). "Illusions of Control: How We Overestimate Our Personal Influence". Current Directions in Psychological Science. 8 (6): 187–190. doi:10.1111/1467-8721.00044. ISSN   0963-7214. JSTOR   20182602. S2CID   145714398.
  41. Dierkes M, Antal AB, Child J, Nonaka I (2003). Handbook of Organizational Learning and Knowledge. Oxford University Press. p. 22. ISBN   978-0-19-829582-2 . Retrieved 9 September 2013.
  42. Hoorens V (1993). "Self-enhancement and Superiority Biases in Social Comparison". European Review of Social Psychology. 4 (1): 113–139. doi:10.1080/14792779343000040.
  43. Adams PA, Adams JK (December 1960). "Confidence in the recognition and reproduction of words difficult to spell". The American Journal of Psychology. 73 (4): 544–552. doi:10.2307/1419942. JSTOR   1419942. PMID   13681411.
  44. Hoffrage U (2004). "Overconfidence". In Pohl R (ed.). Cognitive Illusions: a handbook on fallacies and biases in thinking, judgement and memory . Psychology Press. ISBN   978-1-84169-351-4.
  45. Sutherland 2007 , pp. 172–178
  46. Sanna LJ, Schwarz N (July 2004). "Integrating temporal biases: the interplay of focal thoughts and accessibility experiences". Psychological Science. 15 (7): 474–481. doi:10.1111/j.0956-7976.2004.00704.x. PMID   15200632. S2CID   10998751.
  47. Baron 1994 , pp. 224–228
  48. Västfjäll D, Slovic P, Mayorga M, Peters E (18 June 2014). "Compassion fade: affect and charity are greatest for a single child in need". PLOS ONE. 9 (6): e100115. Bibcode:2014PLoSO...9j0115V. doi: 10.1371/journal.pone.0100115 . PMC   4062481 . PMID   24940738.
  49. Fisk JE (2004). "Conjunction fallacy". In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp.  23–42. ISBN   978-1-84169-351-4. OCLC   55124398.
  50. Barbara L. Fredrickson and Daniel Kahneman (1993). Duration Neglect in Retrospective Evaluations of Affective Episodes. Journal of Personality and Social Psychology. 65 (1) pp. 45–55. Archived 2017-08-08 at the Wayback Machine
  51. Laibson D (1997). "Golden Eggs and Hyperbolic Discounting". Quarterly Journal of Economics . 112 (2): 443–477. CiteSeerX   10.1.1.337.3544 . doi:10.1162/003355397555253. S2CID   763839.
  52. Baron 1994 , p. 353
  53. Goddard K, Roudsari A, Wyatt JC (2011). "Automation Bias – A Hidden Issue for Clinical Decision Support System Use". International Perspectives in Health Informatics. Studies in Health Technology and Informatics. Vol. 164. IOS Press. pp. 17–22. doi:10.3233/978-1-60750-709-3-17.
  54. Tackling social norms: a game changer for gender inequalities (Gender Social Norms Index). 2020 Human Development Perspectives. United Nations Development Programme. Retrieved 2020-06-10.
  55. Bian L, Leslie SJ, Cimpian A (December 2018). "Evidence of bias against girls and women in contexts that emphasize intellectual ability". The American Psychologist. 73 (9): 1139–1153. doi: 10.1037/amp0000427 . PMID   30525794.
  56. Hamilton MC (1991). "Masculine Bias in the Attribution of Personhood: People = Male, Male = People". Psychology of Women Quarterly. 15 (3): 393–402. doi:10.1111/j.1471-6402.1991.tb00415.x. ISSN   0361-6843. S2CID   143533483.
  57. Plous 1993 , pp. 38–41
  58. "Evolution and cognitive biases: the decoy effect". FutureLearn. Retrieved 2018-10-10.
  59. "The Default Effect: How to Leverage Bias and Influence Behavior". Influence at Work. 2012-01-11. Retrieved 2018-10-10.
  60. Why We Spend Coins Faster Than Bills by Chana Joffe-Walt. All Things Considered, 12 May 2009.
  61. Hsee CK, Zhang J (May 2004). "Distinction bias: misprediction and mischoice due to joint evaluation". Journal of Personality and Social Psychology. 86 (5): 680–695. CiteSeerX   10.1.1.484.9171 . doi:10.1037/0022-3514.86.5.680. PMID   15161394.
  62. Mike K, Hazzan O (2022). "What Is Common to Transportation and Health in Machine Learning Education? The Domain Neglect Bias". IEEE Transactions on Education. 66 (3): 226–233. doi:10.1109/TE.2022.3218013. ISSN   0018-9359. S2CID   253402007.
  63. Binah-Pollak, Avital; Hazzan, Orit; Mike, Koby; Hacohen, Ronit Lis (2024-01-05). "Anthropological thinking in data science education: Thinking within context". Education and Information Technologies. doi:10.1007/s10639-023-12444-7. ISSN   1573-7608.
  64. "Berkson's Paradox | Brilliant Math & Science Wiki". brilliant.org. Retrieved 2018-10-10.
  65. Kristal AS, Santos LR, G.I. Joe Phenomena: Understanding the Limits of Metacognitive Awareness on Debiasing (PDF), Harvard Business School
  66. Investopedia Staff (2006-10-29). "Gambler's Fallacy/Monte Carlo Fallacy". Investopedia. Retrieved 2018-10-10.
  67. Tuccio W (2011-01-01). "Heuristics to Improve Human Factors Performance in Aviation". Journal of Aviation/Aerospace Education & Research. 20 (3). doi: 10.15394/jaaer.2011.1640 . ISSN   2329-258X.
  68. Baron, J. (in preparation). Thinking and Deciding, 4th edition. New York: Cambridge University Press.
  69. Baron 1994 , p. 372
  70. de Meza D, Dawson C (January 24, 2018). "Wishful Thinking, Prudent Behavior: The Evolutionary Origin of Optimism, Loss Aversion and Disappointment Aversion". SSRN   3108432.
  71. Dawson C, Johnson SG (8 April 2021). "Dread Aversion and Economic Preferences". SSRN   3822640.
  72. ( Kahneman, Knetsch & Thaler 1991 , p. 193) Richard Thaler coined the term "endowment effect."
  73. ( Kahneman, Knetsch & Thaler 1991 , p. 193) Daniel Kahneman, together with Amos Tversky, coined the term "loss aversion."
  74. Hardman 2009 , p. 137
  75. Kahneman, Knetsch & Thaler 1991 , p. 193
  76. Baron 1994 , p. 382
  77. Kruger J, Dunning D (December 1999). "Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments". Journal of Personality and Social Psychology. 77 (6): 1121–1134. CiteSeerX   10.1.1.64.2655 . doi:10.1037/0022-3514.77.6.1121. PMID   10626367. S2CID   2109278.
  78. Van Boven L, Loewenstein G, Dunning D, Nordgren LF (2013). "Changing Places: A Dual Judgment Model of Empathy Gaps in Emotional Perspective Taking" (PDF). In Zanna MP, Olson JM (eds.). Advances in Experimental Social Psychology. Vol. 48. Academic Press. pp. 117–171. doi:10.1016/B978-0-12-407188-9.00003-X. ISBN   978-0-12-407188-9. Archived from the original (PDF) on 2016-05-28.
  79. Lichtenstein S, Fischhoff B (1977). "Do those who know more also know more about how much they know?". Organizational Behavior and Human Performance. 20 (2): 159–183. doi:10.1016/0030-5073(77)90001-0.
  80. Merkle EC (February 2009). "The disutility of the hard-easy effect in choice confidence". Psychonomic Bulletin & Review. 16 (1): 204–213. doi: 10.3758/PBR.16.1.204 . PMID   19145033.
  81. Juslin P, Winman A, Olsson H (April 2000). "Naive empiricism and dogmatism in confidence research: a critical examination of the hard-easy effect". Psychological Review. 107 (2): 384–396. doi:10.1037/0033-295x.107.2.384. PMID   10789203.
  82. Waytz A (26 January 2022). "2017 : What scientific term or concept ought to be more widely known?". Edge.org . Retrieved 26 January 2022.
  83. Rozenblit L, Keil F (September 2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5). Wiley: 521–562. doi:10.1207/s15516709cog2605_1. PMC   3062901 . PMID   21442007.
  84. Mills CM, Keil FC (January 2004). "Knowing the limits of one's understanding: the development of an awareness of an illusion of explanatory depth". Journal of Experimental Child Psychology. 87 (1). Elsevier BV: 1–32. doi:10.1016/j.jecp.2003.09.003. PMID   14698687.
  85. "Imposter Syndrome | Psychology Today".
  86. "Objectivity illusion". APA Dictionary of Psychology. Washington, DC: American Psychological Association. n.d. Retrieved 2022-01-15.
  87. Klauer KC, Musch J, Naumer B (October 2000). "On belief bias in syllogistic reasoning". Psychological Review. 107 (4): 852–884. doi:10.1037/0033-295X.107.4.852. PMID   11089409.
  88. "Why do we prefer doing something to doing nothing". The Decision Lab. 30 September 2021. Retrieved 30 November 2021.
  89. Patt A, Zeckhauser R (July 2000). "Action Bias and Environmental Decisions" . Journal of Risk and Uncertainty. 21: 45–72. doi:10.1023/A:1026517309871. S2CID   154662174 . Retrieved 30 November 2021.
  90. Gupta S (7 April 2021). "People add by default even when subtraction makes more sense". Science News. Retrieved 10 May 2021.
  91. Adams GS, Converse BA, Hales AH, Klotz LE (April 2021). "People systematically overlook subtractive changes". Nature. 592 (7853): 258–261. Bibcode:2021Natur.592..258A. doi:10.1038/s41586-021-03380-y. PMID   33828317. S2CID   233185662.
  92. Ackerman MS, ed. (2003). Sharing expertise beyond knowledge management (online ed.). Cambridge, Massachusetts: MIT Press. p.  7. ISBN   978-0-262-01195-2.
  93. Quartz SR, The State Of The World Isn't Nearly As Bad As You Think, Edge Foundation, Inc. , retrieved 2016-02-17
  94. Quoidbach J, Gilbert DT, Wilson TD (January 2013). "The end of history illusion" (PDF). Science. 339 (6115): 96–98. Bibcode:2013Sci...339...96Q. doi:10.1126/science.1229294. PMID   23288539. S2CID   39240210. Archived from the original (PDF) on 2013-01-13. Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future.
  95. Haring KS, Watanabe K, Velonaki M, Tossell CC, Finomore V (2018). "FFAB-The Form Function Attribution Bias in Human Robot Interaction". IEEE Transactions on Cognitive and Developmental Systems. 10 (4): 843–851. doi: 10.1109/TCDS.2018.2851569 . S2CID   54459747.
  96. Kara-Yakoubian M (2022-07-29). "Psychologists uncover evidence of a fundamental pain bias". PsyPost. Retrieved 2022-11-27.
  97. Prati A (2017). "Hedonic recall bias. Why you should not ask people how much they earn". Journal of Economic Behavior & Organization. 143: 78–97. doi:10.1016/j.jebo.2017.09.002.
  98. Pohl RF (2004). "Hindsight Bias". In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp.  363–378. ISBN   978-1-84169-351-4. OCLC   55124398.
  99. Baron 1994 , pp. 258–259
  100. Danziger S, Levav J, Avnaim-Pesso L (April 2011). "Extraneous factors in judicial decisions". Proceedings of the National Academy of Sciences of the United States of America. 108 (17): 6889–6892. Bibcode:2011PNAS..108.6889D. doi: 10.1073/pnas.1018033108 . PMC   3084045 . PMID   21482790.
  101. Zaman J, De Peuter S, Van Diest I, Van den Bergh O, Vlaeyen JW (November 2016). "Interoceptive cues predicting exteroceptive events". International Journal of Psychophysiology. 109: 100–106. doi:10.1016/j.ijpsycho.2016.09.003. PMID   27616473.
  102. Barrett LF, Simmons WK (July 2015). "Interoceptive predictions in the brain". Nature Reviews. Neuroscience. 16 (7): 419–429. doi:10.1038/nrn3950. PMC   4731102 . PMID   26016744.
  103. Damasio AR (October 1996). "The somatic marker hypothesis and the possible functions of the prefrontal cortex". Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 351 (1346): 1413–1420. doi:10.1098/rstb.1996.0125. PMID   8941953. S2CID   1841280.
  104. Shafir E, Diamond P, Tversky A (2000). "Money Illusion". In Kahneman D, Tversky A (eds.). Choices, values, and frames. Cambridge University Press. pp. 335–355. ISBN   978-0-521-62749-8.
  105. Marcatto F, Cosulich A, Ferrante D (2015). "Once bitten, twice shy: Experienced regret and non-adaptive choice switching". PeerJ. 3: e1035. doi: 10.7717/peerj.1035 . PMC   4476096 . PMID   26157618.
  106. Bornstein RF, Crave-Lemley C (2004). "Mere exposure effect". In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp.  215–234. ISBN   978-1-84169-351-4. OCLC   55124398.
  107. Baron 1994 , p. 386
  108. Baron 1994 , p. 44
  109. Hardman 2009 , p. 104
  110. O'Donoghue T, Rabin M (1999). "Doing it now or later". American Economic Review. 89 (1): 103–124. doi:10.1257/aer.89.1.103. S2CID   5115877.
  111. Balas B, Momsen JL (September 2014). Holt EA (ed.). "Attention "blinks" differently for plants and animals". CBE: Life Sciences Education. 13 (3): 437–443. doi:10.1187/cbe.14-05-0080. PMC   4152205 . PMID   25185227.
  112. Safi R, Browne GJ, Naini AJ (2021). "Mis-spending on information security measures: Theory and experimental evidence". International Journal of Information Management. 57 (102291): 102291. doi:10.1016/j.ijinfomgt.2020.102291. S2CID   232041220.
  113. Hsee CK, Hastie R (January 2006). "Decision and experience: why don't we choose what makes us happy?" (PDF). Trends in Cognitive Sciences. 10 (1): 31–37. CiteSeerX   10.1.1.178.7054 . doi:10.1016/j.tics.2005.11.007. PMID   16318925. S2CID   12262319. Archived (PDF) from the original on 2015-04-20.
  114. Trofimova I (October 1999). "An investigation of how people of different age, sex, and temperament estimate the world". Psychological Reports. 85 (2): 533–552. doi:10.2466/pr0.1999.85.2.533. PMID   10611787. S2CID   8335544.
  115. Trofimova I (2014). "Observer bias: an interaction of temperament traits with biases in the semantic perception of lexical material". PLOS ONE. 9 (1): e85677. Bibcode:2014PLoSO...985677T. doi: 10.1371/journal.pone.0085677 . PMC   3903487 . PMID   24475048.
  116. Leman PJ, Cinnirella M (2007). "A major event has a major cause: Evidence for the role of heuristics in reasoning about conspiracy theories". Social Psychological Review. 9 (2): 18–28. doi:10.53841/bpsspr.2007.9.2.18. S2CID   245126866.
  117. Buckley T (2015). "Why Do Some People Believe in Conspiracy Theories?". Scientific American Mind. 26 (4): 72. doi:10.1038/scientificamericanmind0715-72a . Retrieved 26 July 2020.
  118. "Use Cognitive Biases to Your Advantage, Institute for Management Consultants, #721, December 19, 2011". Archived from the original on October 24, 2020. Retrieved April 15, 2021.
  119. Fiedler K, Unkelbach C (2014-10-01). "Regressive Judgment: Implications of a Universal Property of the Empirical World". Current Directions in Psychological Science. 23 (5): 361–367. doi:10.1177/0963721414546330. ISSN   0963-7214. S2CID   146376950.
  120. Kelemen D, Rottman J, Seston R (2013). "Professional Physical Scientists Display Tenacious Teleological Tendencies: Purpose-Based Reasoning as a Cognitive Default". Journal of Experimental Psychology: General. 142 (4): 1074–1083. doi:10.1037/a0030399. PMID   23067062..
  121. Kelemen D, Rosset E (2009). "The Human Function Compunction: teleological explanation in adults". Cognition. 111 (1): 138–143. doi:10.1016/j.cognition.2009.01.001. PMID   19200537. S2CID   2569743.
  122. "Unconscious Bias". Vanderbilt University. Retrieved 2020-11-09.
  123. "Penn Psychologists Believe 'Unit Bias' Determines The Acceptable Amount To Eat". ScienceDaily (November 21, 2005)
  124. Talboy A, Schneider S (2022-03-17). "Reference Dependence in Bayesian Reasoning: Value Selection Bias, Congruence Effects, and Response Prompt Sensitivity". Frontiers in Psychology. 13: 729285. doi: 10.3389/fpsyg.2022.729285 . PMC   8970303 . PMID   35369253.
  125. Talboy AN, Schneider SL (December 2018). "Focusing on what matters: Restructuring the presentation of Bayesian reasoning problems". Journal of Experimental Psychology: Applied. 24 (4): 440–458. doi:10.1037/xap0000187. PMID   30299128. S2CID   52943395.
  126. Milgram S (October 1963). "Behavioral Study of Obedience". Journal of Abnormal Psychology. 67 (4): 371–378. doi:10.1037/h0040525. PMID   14049516. S2CID   18309531.
  127. Walker D, Vul E (January 2014). "Hierarchical encoding makes individuals in a group seem more attractive". Psychological Science. 25 (1): 230–235. doi:10.1177/0956797613497969. PMID   24163333. S2CID   16309135.
  128. Baron 1994 , p. 275
  129. Sutherland 2007, pp. 138–139
  130. Anderson KB, Graham LM (2007). "Hostile Attribution Bias". Encyclopedia of Social Psychology. Sage Publications, Inc. pp. 446–447. doi:10.4135/9781412956253. ISBN   978-1-4129-1670-7.
  131. Rosset E (2008-09-01). "It's no accident: Our bias for intentional explanations". Cognition. 108 (3): 771–780. doi:10.1016/j.cognition.2008.07.001. ISSN   0010-0277. PMID   18692779. S2CID   16559459.
  132. Kokkoris M (2020-01-16). "The Dark Side of Self-Control". Harvard Business Review. Retrieved 17 January 2020.
  133. Plous 1993 , p. 185
  134. Kuran T, Sunstein CR (1998). "Availability Cascades and Risk Regulation". Stanford Law Review. 51 (4): 683–768. doi:10.2307/1229439. JSTOR   1229439. S2CID   3941373.
  135. Colman A (2003). Oxford Dictionary of Psychology. New York: Oxford University Press. p.  77. ISBN   978-0-19-280632-1.
  136. Ciccarelli S, White J (2014). Psychology (4th ed.). Pearson Education, Inc. p. 62. ISBN   978-0-205-97335-4.
  137. Dalton D, Ortegren M (2011). "Gender differences in ethics research: The importance of controlling for the social desirability response bias". Journal of Business Ethics. 103 (1): 73–93. doi:10.1007/s10551-011-0843-8. S2CID   144155599.
  138. McCornack S, Parks M (1986). "Deception Detection and Relationship Development: The Other Side of Trust". Annals of the International Communication Association . 9: 377–389. doi:10.1080/23808985.1986.11678616.
  139. Levine T (2014). "Truth-Default Theory (TDT): A Theory of Human Deception and Deception Detection". Journal of Language and Social Psychology. 33: 378–392. doi:10.1177/0261927X14535916. S2CID   146916525.
  140. Plous 1993 , p. 206
  141. "Assumed similarity bias". APA Dictionary of Psychology. Washington, DC: American Psychological Association. n.d. Retrieved 2022-01-15.
  142. "Intellectual Precursors, Major Postulates, and Practical Relevance of System Justification Theory", A Theory of System Justification, Harvard University Press, pp. 49–69, 2020-07-14, doi:10.2307/j.ctv13qfw6w.6, S2CID   243130432 , retrieved 2023-12-05
  143. Garcia SM, Song H, Tesser A (November 2010). "Tainted recommendations: The social comparison bias". Organizational Behavior and Human Decision Processes. 113 (2): 97–101. doi:10.1016/j.obhdp.2010.06.002. ISSN   0749-5978.
  144. Forsyth DR (2009). Group Dynamics (5th ed.). Pacific Grove, CA: Brooks/Cole.
  145. Kruger J (August 1999). "Lake Wobegon be gone! The "below-average effect" and the egocentric nature of comparative ability judgments". Journal of Personality and Social Psychology. 77 (2): 221–232. doi:10.1037/0022-3514.77.2.221. PMID   10474208.
  146. Payne BK, Cheng CM, Govorun O, Stewart BD (September 2005). "An inkblot for attitudes: affect misattribution as implicit measurement". Journal of Personality and Social Psychology. 89 (3): 277–293. CiteSeerX   10.1.1.392.4775 . doi:10.1037/0022-3514.89.3.277. PMID   16248714.
  147. Schacter DL (2001). The Seven Sins of Memory. New York, NY: Houghton Mifflin Company.
  148. Schacter, Daniel Lawrence (March 1999). "The Seven Sins of Memory: Insights from psychology and cognitive neuroscience". The American Psychologist. 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID   10199218. S2CID   14882268.
  149. Butera F, Levine JM, Vernet J (August 2009). "Influence without credit: How successful minorities respond to social cryptomnesia". Coping with Minority Status. Cambridge University Press. pp. 311–332. doi:10.1017/cbo9780511804465.015. ISBN   978-0-511-80446-5.
  150. Lieberman DA (2011). Human Learning and Memory. Cambridge University Press. p. 432. ISBN   978-1-139-50253-5.
  151. McDunn BA, Siddiqui AP, Brown JM (April 2014). "Seeking the boundary of boundary extension". Psychonomic Bulletin & Review. 21 (2): 370–375. doi:10.3758/s13423-013-0494-0. PMID   23921509. S2CID   2876131.
  152. Mather M, Shafir E, Johnson MK (March 2000). "Misremembrance of options past: source monitoring and choice" (PDF). Psychological Science. 11 (2): 132–138. doi:10.1111/1467-9280.00228. PMID   11273420. S2CID   2468289. Archived (PDF) from the original on 2009-01-17.
  153. Attneave F (August 1953). "Psychological probability as a function of experienced frequency". Journal of Experimental Psychology. 46 (2): 81–86. doi:10.1037/h0057955. PMID   13084849.
  154. Fischhoff B, Slovic P, Lichtenstein S (1977). "Knowing with certainty: The appropriateness of extreme confidence". Journal of Experimental Psychology: Human Perception and Performance. 3 (4): 552–564. doi:10.1037/0096-1523.3.4.552. S2CID   54888532.
  155. Cacioppo J (2002). Foundations in social neuroscience. Cambridge, MA: MIT Press. pp. 130–132. ISBN   978-0-262-53195-5.
  156. Cacciatore MA (April 2021). "Misinformation and public opinion of science and health: Approaches, findings, and future directions". Proceedings of the National Academy of Sciences of the United States of America. 118 (15): e1912437117. Bibcode:2021PNAS..11812437C. doi: 10.1073/pnas.1912437117 . PMC   8053916 . PMID   33837143. p. 4: The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning
  157. Schmidt SR (July 1994). "Effects of humor on sentence memory" (PDF). Journal of Experimental Psychology: Learning, Memory, and Cognition. 20 (4): 953–967. doi:10.1037/0278-7393.20.4.953. PMID   8064254. Archived from the original (PDF) on 2016-03-15. Retrieved 2015-04-19.
  158. Schmidt SR (2003). "Life Is Pleasant – and Memory Helps to Keep It That Way!" (PDF). Review of General Psychology. 7 (2): 203–210. doi:10.1037/1089-2680.7.2.203. S2CID   43179740.
  159. Fiedler K (1991). "The tricky nature of skewed frequency tables: An information loss account of distinctiveness-based illusory correlations". Journal of Personality and Social Psychology. 60 (1): 24–36. doi:10.1037/0022-3514.60.1.24.
  160. Koriat A, Goldsmith M, Pansky A (2000). "Toward a psychology of memory accuracy". Annual Review of Psychology . 51 (1): 481–537. doi:10.1146/annurev.psych.51.1.481. PMID   10751979.
  161. Craik & Lockhart, 1972
  162. Kinnell A, Dennis S (February 2011). "The list length effect in recognition memory: an analysis of potential confounds". Memory & Cognition. 39 (2): 348–63. doi: 10.3758/s13421-010-0007-6 . PMID   21264573.
  163. Weiten W (2010). Psychology: Themes and Variations. Cengage Learning. p. 338. ISBN   978-0-495-60197-5.
  164. Haizlip J, May N, Schorling J, Williams A, Plews-Ogan M (September 2012). "Perspective: the negativity bias, medical education, and the culture of academic medicine: why culture change is hard". Academic Medicine. 87 (9): 1205–1209. doi: 10.1097/ACM.0b013e3182628f03 . PMID   22836850.
  165. Weiten W (2007). Psychology: Themes and Variations. Cengage Learning. p. 260. ISBN   978-0-495-09303-9.
  166. Slamecka NJ (April 1968). "An examination of trace storage in free recall". Journal of Experimental Psychology. 76 (4): 504–513. doi:10.1037/h0025695. PMID   5650563.
  167. Shepard RN (1967). "Recognition memory for words, sentences, and pictures". Journal of Verbal Learning and Verbal Behavior. 6: 156–163. doi:10.1016/s0022-5371(67)80067-7.
  168. McBride DM, Dosher BA (2002). "A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis". Consciousness and Cognition. 11 (3): 423–460. doi:10.1016/s1053-8100(02)00007-7. PMID   12435377. S2CID   2813053.
  169. Defetyer MA, Russo R, McPartlin PL (2009). "The picture superiority effect in recognition memory: a developmental study using the response signal procedure". Cognitive Development. 24 (3): 265–273. doi:10.1016/j.cogdev.2009.05.002.
  170. Whitehouse AJ, Maybery MT, Durkin K (2006). "The development of the picture-superiority effect". British Journal of Developmental Psychology. 24 (4): 767–773. doi:10.1348/026151005X74153.
  171. Ally BA, Gold CA, Budson AE (January 2009). "The picture superiority effect in patients with Alzheimer's disease and mild cognitive impairment". Neuropsychologia. 47 (2): 595–598. doi:10.1016/j.neuropsychologia.2008.10.010. PMC   2763351 . PMID   18992266.
  172. Curran T, Doyle J (May 2011). "Picture superiority doubly dissociates the ERP correlates of recollection and familiarity". Journal of Cognitive Neuroscience. 23 (5): 1247–1262. doi:10.1162/jocn.2010.21464. PMID   20350169. S2CID   6568038.
  173. Kruger J (August 1999). "Lake Wobegon be gone! The "below-average effect" and the egocentric nature of comparative ability judgments". Journal of Personality and Social Psychology. 77 (2): 221–232. doi:10.1037/0022-3514.77.2.221. PMID   10474208.
  174. O'Brien EJ, Myers JL (1985). "When comprehension difficulty improves memory for text". Journal of Experimental Psychology: Learning, Memory, and Cognition. 11 (1): 12–21. doi:10.1037/0278-7393.11.1.12. S2CID   199928680.
  175. Rubin, Wetzler & Nebes, 1986; Rubin, Rahhal & Poon, 1998
  176. Liang, Tingchang; Lin, Zhao; Souma, Toshihiko (2021). "How Group Perception Affects What People Share and How People Feel: The Role of Entitativity and Epistemic Trust in the "Saying-Is-Believing" Effect". Frontiers in Psychology. 12: 728864. doi: 10.3389/fpsyg.2021.728864 . PMC   8494462 . PMID   34630240.
  177. Martin GN, Carlson NR, Buskist W (2007). Psychology (3rd ed.). Pearson Education. pp. 309–310. ISBN   978-0-273-71086-8.
  178. Morton, Crowder & Prussin, 1971
  179. Pitt I, Edwards AD (2003). Design of Speech-Based Devices: A Practical Guide. Springer. p. 26. ISBN   978-1-85233-436-9.
  180. Tversky A, Koehler DJ (1994). "Support theory: A nonextensional representation of subjective probability" (PDF). Psychological Review. 101 (4): 547–567. doi:10.1037/0033-295X.101.4.547. Archived from the original (PDF) on 2017-01-09. Retrieved 2021-12-10.
  181. Stetson C, Fiesta MP, Eagleman DM (December 2007). "Does time really slow down during a frightening event?". PLOS ONE. 2 (12): e1295. Bibcode:2007PLoSO...2.1295S. doi: 10.1371/journal.pone.0001295 . PMC   2110887 . PMID   18074019.
  182. Goldstein EB (2010-06-21). Cognitive Psychology: Connecting Mind, Research and Everyday Experience. Cengage Learning. p. 231. ISBN   978-1-133-00912-2.
  183. "Not everyone is in such awe of the internet". Evening Standard. 2011-03-23. Retrieved 28 October 2015.
  184. Poppenk, Walia, Joanisse, Danckert, & Köhler, 2006
  185. Von Restorff H (1933). "Über die Wirkung von Bereichsbildungen im Spurenfeld (The effects of field formation in the trace field)". Psychological Research. 18 (1): 299–342. doi:10.1007/bf02409636. S2CID   145479042.

Related Research Articles

Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Fundamental attribution error

In social psychology, the fundamental attribution error is a cognitive attribution bias in which observers underemphasize situational and environmental factors for the behavior of an actor while overemphasizing dispositional or personality factors. In other words, observers tend to overattribute the behaviors of others to their personality and underattribute them to the situation or context. Although personality traits and predispositions are considered to be observable facts in psychology, the fundamental attribution error is an error because it misinterprets their effects.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, is the common tendency for people to perceive past events as having been more predictable than they were.

The representativeness heuristic is used when judging the probability of an event by how closely it resembles, in character and essence, a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s, who defined representativeness as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant. This is because the person's appearance and behavior are more representative of the stereotype of a poet than of an accountant.
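
The base-rate neglect that typically accompanies this heuristic can be made concrete with Bayes' theorem. The numbers below are purely illustrative assumptions, not figures from the literature: suppose eccentric dress and poetry-reading are seen in 60% of poets but only 5% of accountants, and that accountants outnumber poets by about 50 to 1, so the prior probability that the person is a poet is roughly 0.02.

% Illustrative Bayes calculation; every probability here is assumed for exposition.
P(\text{poet} \mid \text{eccentric})
  = \frac{P(\text{eccentric} \mid \text{poet})\,P(\text{poet})}
         {P(\text{eccentric} \mid \text{poet})\,P(\text{poet}) + P(\text{eccentric} \mid \text{accountant})\,P(\text{accountant})}
  = \frac{0.6 \times 0.02}{0.6 \times 0.02 + 0.05 \times 0.98}
  \approx 0.20

Even with a strong resemblance to the poet stereotype, the person is still far more likely to be an accountant; the heuristic feels compelling because it weighs the resemblance term while neglecting the prior.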

Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. More specifically, it is a tendency to describe one's own behaviour in terms of situational factors while preferring to describe another's behaviour by ascribing fixed dispositions to their personality. This may occur because people's own internal states are more readily observable and available to them than those of others.

In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

The illusion of control is the tendency for people to overestimate their ability to control events. It was named by U.S. psychologist Ellen Langer and is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority and optimism bias, the illusion of control is one of the positive illusions.

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.
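
A short derivation shows why this is a formal error. For any two events A and B, the conjunction can be no more probable than either conjunct, because a conditional probability never exceeds 1:

P(A \cap B) = P(A)\,P(B \mid A) \le P(A), \qquad \text{and likewise } P(A \cap B) \le P(B).

Ranking "A and B" as more likely than A alone (or B alone) is therefore inconsistent with the probability axioms, which is exactly the inference pattern the conjunction fallacy describes.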

Thomas Gilovich (born 1954)

Thomas Dashiff Gilovich is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness to people's most common regrets, to perceptions of people and social groups. Gilovich is a fellow of the Committee for Skeptical Inquiry.

In the psychology of affective forecasting, the impact bias, a form of which is the durability bias, is the tendency for people to overestimate the length or the intensity of future emotional states.

Planning fallacy

The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. This phenomenon sometimes occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. The bias affects predictions only about one's own tasks. On the other hand, when outside observers predict task completion times, they tend to exhibit a pessimistic bias, overestimating the time needed. The planning fallacy involves estimates of task completion times more optimistic than those encountered in similar projects in the past.

In psychology, illusory correlation is the phenomenon of perceiving a relationship between variables even when no such relationship exists. A false association may be formed because rare or novel occurrences are more salient and therefore tend to capture one's attention. This phenomenon is one way stereotypes form and endure. Hamilton & Rose (1980) found that stereotypes can lead people to expect certain groups and traits to fit together, and then to overestimate the frequency with which these correlations actually occur. These stereotypes can be learned and perpetuated without any actual contact occurring between the holder of the stereotype and the group it is about.
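
A worked example with invented counts (offered only to illustrate the distinctiveness account, not data from Hamilton & Rose) shows how such a false association can arise. Suppose observers see 24 behaviors from a majority group and 12 from a minority group, with undesirable behaviors equally common in both:

% Hypothetical frequencies, chosen for illustration only.
\text{majority: } \frac{8 \text{ undesirable}}{24 \text{ total}} = \tfrac{1}{3}, \qquad \text{minority: } \frac{4 \text{ undesirable}}{12 \text{ total}} = \tfrac{1}{3}

The proportions are identical, so group membership and undesirable behavior are statistically unrelated; yet the rarest and most distinctive pairing (a minority member performing an undesirable behavior) is the most memorable, and its frequency tends to be overestimated, producing an illusory correlation.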

Affective forecasting, also known as hedonic forecasting or the hedonic forecasting mechanism, is the prediction of one's affect in the future. As a process that influences preferences, decisions, and behavior, affective forecasting is studied by both psychologists and economists, with broad applications.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
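
In calibration research this miscalibration is often summarized by comparing stated confidence with realized accuracy. The score below is one common form of that comparison; the notation is chosen here for exposition, and exact definitions vary across studies:

% c_i = stated confidence on item i (between 0 and 1); a_i = 1 if the answer was correct, 0 otherwise.
O = \frac{1}{n}\sum_{i=1}^{n} c_i \;-\; \frac{1}{n}\sum_{i=1}^{n} a_i

A positive O means that average confidence exceeds the proportion of correct answers, for example answers given with 90% confidence being correct only 75% of the time, which is the sense in which overconfidence is described as a miscalibration of subjective probabilities.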

Optimism bias or optimistic bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism. It is common and transcends gender, ethnicity, nationality, and age. Autistic people are less susceptible to this kind of bias. It has also been reported in other animals, such as rats and birds.

Positive illusions are unrealistically favorable attitudes that people have towards themselves or to people that are close to them. Positive illusions are a form of self-deception or self-enhancement that feel good; maintain self-esteem; or avoid discomfort, at least in the short term. There are three general forms: inflated assessment of one's own abilities, unrealistic optimism about the future, and an illusion of control. The term "positive illusions" originates in a 1988 paper by Taylor and Brown. "Taylor and Brown's (1988) model of mental health maintains that certain positive illusions are highly prevalent in normal thought and predictive of criteria traditionally associated with mental health."

In social psychology, illusory superiority is a cognitive bias wherein people overestimate their own qualities and abilities compared to others. Illusory superiority is one of many positive illusions, relating to the self, that are evident in the study of intelligence, the effective performance of tasks and tests, and the possession of desirable personal characteristics and personality traits. Overestimation of abilities compared to an objective measure is known as the overconfidence effect.

The spotlight effect is the psychological phenomenon by which people tend to believe they are being noticed more than they really are. Because one is constantly at the center of one's own world, an accurate evaluation of how much one is noticed by others is uncommon. The reason for the spotlight effect is the innate tendency to forget that although one is the center of one's own world, one is not the center of everyone else's. This tendency is especially prominent when one does something atypical.

Heuristics are the processes by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.

References

Further reading