Precision bias

Precision bias, also known as numeracy bias, is a form of cognitive bias [1] in which an evaluator of information commits a logical fallacy by confusing accuracy with precision. [2] More particularly, in assessing the merits of an argument, a measurement, or a report, an observer or assessor falls prey to precision bias when they believe that greater precision implies greater accuracy (i.e., that simply because a statement is precise, it is also true); the observer or assessor is then said to provide false precision. [3] [4]

The clustering illusion [5] and the Texas sharpshooter fallacy [6] may both be treated as relatives of precision bias. In these related fallacies, apparent patterns in data are mistakenly taken as evidence of causation, when in fact the clustered information may actually be the result of randomness.

Related Research Articles

Accuracy and precision are two measures of observational error. Accuracy is how close a given set of measurements is to the true value, while precision is how close the measurements are to each other.
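A minimal numeric sketch of the distinction (the measurement values and the true value are hypothetical):

```python
import statistics

# Hypothetical example: five repeated measurements of a quantity
# whose true value is 10.0 units.
measurements = [12.1, 12.0, 12.2, 11.9, 12.0]
true_value = 10.0

# Accuracy: how close the measurements are, on average, to the true value.
accuracy_error = abs(statistics.mean(measurements) - true_value)

# Precision: how close the measurements are to one another (their spread).
precision_spread = statistics.stdev(measurements)

print(f"mean error from true value: {accuracy_error:.2f}")
print(f"spread of measurements:     {precision_spread:.2f}")
```

Here the readings are precise (spread of about 0.11) yet inaccurate (mean error of about 2.04): tight clustering alone does not make them true.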

Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

A fallacy is the use of invalid or otherwise faulty reasoning in the construction of an argument, which may appear to be well-reasoned if the fallacy goes unnoticed. The term was introduced into the Western intellectual tradition by the Aristotelian De Sophisticis Elenchis.

Behavioral economics

Behavioral economics studies the effects of psychological, cognitive, emotional, cultural and social factors on the decisions of individuals or institutions, such as how those decisions vary from those implied by classical economic theory.

The Texas sharpshooter fallacy is an informal fallacy which is committed when differences in data are ignored, but similarities are overemphasized. From this reasoning, a false conclusion is inferred. This fallacy is the philosophical or rhetorical application of the multiple comparisons problem and apophenia. It is related to the clustering illusion, which is the tendency in human cognition to interpret patterns where none actually exist.
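The multiple comparisons problem behind the fallacy can be sketched with a small simulation (the coin counts and significance cutoff are illustrative assumptions, not from the source): if enough fair coins are tested, a few will look "biased" purely by chance, and drawing the target around those after the fact is the sharpshooter's move.

```python
import random

random.seed(1)

# Hypothetical sketch: test 100 fair coins, 50 flips each, and flag any
# coin whose head count deviates "significantly" from the expected 25.
n_coins, n_flips = 100, 50
flagged = 0
for _ in range(n_coins):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if abs(heads - 25) >= 8:   # roughly a two-sided p < 0.05 cutoff
        flagged += 1

print(f"{flagged} of {n_coins} fair coins look 'biased' after the fact")
```

With many comparisons and no pre-registered hypothesis, some "significant" results are expected even when every coin is fair.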

Anecdotal evidence is evidence based only on personal observation, collected in a casual or non-systematic manner.

Clustering illusion

The clustering illusion is the tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random. The illusion is caused by a human tendency to underpredict the amount of variability likely to appear in a small sample of random or pseudorandom data.
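A short simulation can illustrate how often streaks appear in small random samples (the sequence length and streak threshold here are illustrative choices):

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(0)

# 1000 sequences of 20 fair coin flips: streaks of 4 or more identical
# outcomes appear routinely, even though the process is purely random.
trials = [[random.choice("HT") for _ in range(20)] for _ in range(1000)]
with_long_streak = sum(longest_streak(t) >= 4 for t in trials)

print(f"{with_long_streak / 1000:.0%} of 20-flip sequences contain a streak of >= 4")
```

Most people underestimate how common such streaks are, which is exactly the underprediction of variability the illusion describes.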

Base rate fallacy

The base rate fallacy, also called base rate neglect or base rate bias, is a type of fallacy in which people tend to ignore the base rate in favor of individuating information specific to the case at hand. Base rate neglect is a specific form of the more general extension neglect.
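A worked Bayes' rule example shows how much the base rate matters (the sensitivity, false-positive rate, and prevalence figures are hypothetical):

```python
# Hypothetical illustration: a test for a condition with a 1% base rate.
# Ignoring the base rate, people often guess P(condition | positive) is
# near 99%; Bayes' rule gives a much smaller answer.
base_rate = 0.01       # P(condition)
sensitivity = 0.99     # P(positive | condition)
false_positive = 0.05  # P(positive | no condition)

# Total probability of a positive result, from both groups.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: P(condition | positive).
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive) = {p_condition_given_positive:.1%}")
```

Under these numbers the posterior probability is only about 17%, because true positives from the rare condition are swamped by false positives from the common non-condition group.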

Gerd Gigerenzer

Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making. Gigerenzer is director emeritus of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development and director of the Harding Center for Risk Literacy, both in Berlin.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Introspection illusion

The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behaviour.

Fuzzy-trace theory (FTT) is a theory of cognition originally proposed by Valerie F. Reyna and Charles Brainerd that draws upon dual-trace conceptions to predict and explain cognitive phenomena, particularly in memory and reasoning. The theory has been used in areas such as cognitive psychology, human development, and social psychology to explain, for instance, false memory and its development, probability judgments, medical decision making, risk perception and estimation, and biases and fallacies in decision making.

The "hot hand" is a phenomenon, previously considered a cognitive social bias, that a person who experiences a successful outcome has a greater chance of success in further attempts. The concept is often applied to sports and skill-based tasks in general and originates from basketball, where a shooter is more likely to score if their previous attempts were successful; i.e., while having the "hot hand." While previous success at a task can indeed change the psychological attitude and subsequent success rate of a player, researchers for many years did not find evidence for a "hot hand" in practice, dismissing it as fallacious. However, later research questioned whether the belief is indeed a fallacy. Some recent studies using modern statistical analysis have observed evidence for the "hot hand" in some sporting activities; however, other recent studies have not observed evidence of the "hot hand". Moreover, evidence suggests that only a small subset of players may show a "hot hand" and, among those who do, the magnitude of the "hot hand" tends to be small.

Heuristics are the mental shortcuts humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Motivated reasoning is a cognitive and social response in which individuals, consciously or unconsciously, allow motivational and emotional (affective) biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and to reject new information that contradicts them, despite contrary evidence.

Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure and to be more influenced by that opinion. Believing the authority's views to be more credible, individuals place greater emphasis on them and are more likely to obey. This concept is considered one of the social or collective cognitive biases.

Zero-sum thinking is the perception of situations as zero-sum games, where one person's gain would be another's loss. The term is derived from game theory. However, unlike the game theory concept, zero-sum thinking refers to a psychological construct: a person's subjective interpretation of a situation. Zero-sum thinking is captured by the saying "your gain is my loss". Rozycka-Tran et al. (2015) defined zero-sum thinking as:

A general belief system about the antagonistic nature of social relations, shared by people in a society or culture and based on the implicit assumption that a finite amount of goods exists in the world, in which one person's winning makes others the losers, and vice versa ... a relatively permanent and general conviction that social relations are like a zero-sum game. People who share this conviction believe that success, especially economic success, is possible only at the expense of other people's failures.

Frequency illusion

Frequency illusion, also known as the Baader–Meinhof phenomenon or frequency bias, is a cognitive bias referring to the tendency to notice something more often after noticing it for the first time, leading to the belief that it has an increased frequency of occurrence. The illusion is a result of increased awareness of a phrase, idea, or object – for example, hearing a song more often or seeing red cars everywhere.

References

  1. Proceedings of the 33rd Annual Meeting of the Cognitive Science Society (CogSci 2011), held in Boston, USA, 20–23 July 2011 / L. Carlson, C. Hoelscher and T. Shipley (eds.): pp. 1521–1526
  2. "Practices of Science: Precision vs. Accuracy | manoa.hawaii.edu/ExploringOurFluidEarth". manoa.hawaii.edu. Retrieved 2022-10-22.
  3. Lim, Daniel; DeSteno, David (2020). "Past adversity protects against the numeracy bias in compassion". Emotion. 20 (8): 1344–1356. doi:10.1037/emo0000655. ISSN 1931-1516. PMID 31414833. S2CID 198166331.
  4. Jerez-Fernandez, Alexandra; Angulo, Ashley N.; Oppenheimer, Daniel M. (2014). "Show Me the Numbers: Precision as a Cue to Others' Confidence". Psychological Science. 25 (2): 633–635. doi:10.1177/0956797613504301. ISSN 0956-7976. PMID 24317423. S2CID 43824955.
  5. Howard, Jonathan (2019), "Illusionary Correlation, False Causation, and Clustering Illusion", Cognitive Errors and Diagnostic Mistakes, Cham: Springer International Publishing, pp. 265–283, doi:10.1007/978-3-319-93224-8_15, ISBN 978-3-319-93223-1, S2CID 150016878, retrieved 2022-10-22
  6. Thompson, W. C. (2009-09-01). "Painting the target around the matching profile: the Texas sharpshooter fallacy in forensic DNA interpretation". Law, Probability and Risk. 8 (3): 257–276. doi:10.1093/lpr/mgp013. ISSN 1470-8396.