Subadditivity effect


The subadditivity effect is the tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts. [1]

Probability: a measure of the expectation that an event will occur or a statement is true

Probability is a measure quantifying the likelihood that events will occur. It is quantified as a number between 0 and 1, where, roughly speaking, 0 indicates impossibility and 1 indicates certainty. The higher the probability of an event, the more likely it is that the event will occur. A simple example is the toss of a fair (unbiased) coin. Since the coin is fair, the two outcomes are equally probable: the probability of "heads" equals the probability of "tails", and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2.
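To make the fair-coin case concrete, the following minimal sketch (an illustrative addition, not part of the cited literature) simulates coin tosses and checks that the empirical frequencies approach 1/2:

```python
import random

random.seed(42)

# Simulate tosses of a fair coin: "heads" and "tails" are the only
# two possible outcomes, each with probability 1/2.
n = 100_000
heads = sum(random.random() < 0.5 for _ in range(n))
tails = n - heads

print(f"empirical P(heads): {heads / n:.3f}")  # close to 0.5
print(f"empirical P(tails): {tails / n:.3f}")  # close to 0.5
```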


Example

For instance, subjects in one experiment judged the probability of death from cancer in the United States to be 18%, the probability of death from a heart attack to be 22%, and the probability of death from "other natural causes" to be 33%. Other participants judged the probability of death from any natural cause to be 58%. Natural causes are made up of precisely cancer, heart attack, and "other natural causes", yet the sum of the three component probabilities was 73%, not 58%. According to Tversky and Koehler (1994), this kind of result is observed consistently. [2]
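The discrepancy is pure arithmetic, as the short sketch below shows (the figures are the judged probabilities reported above; the code itself is only an illustration):

```python
# Judged probabilities (in percent) from the experiment described above.
components = {"cancer": 18, "heart attack": 22, "other natural causes": 33}
judged_whole = 58  # judged probability of death from any natural cause

sum_of_parts = sum(components.values())
print(f"sum of judged parts: {sum_of_parts}%")                  # 73%
print(f"judged whole:        {judged_whole}%")                  # 58%
print(f"subadditivity gap:   {sum_of_parts - judged_whole}%")   # 15 points
```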

Amos Tversky: Israeli psychologist

Amos Nathan Tversky was a cognitive and mathematical psychologist, a student of cognitive science, a longtime collaborator of Daniel Kahneman, and a key figure in the discovery of systematic human cognitive bias and the study of how people handle risk.

Explanations

In a 2012 article in Psychological Bulletin, it is suggested that the subadditivity effect can be explained by an information-theoretic generative mechanism that assumes a noisy conversion of objective evidence (observation) into subjective estimates (judgment). [3] This explanation differs from support theory, proposed as an explanation by Tversky and Koehler, [2] which requires additional assumptions. Since mental noise is a sufficient explanation that is much simpler and more straightforward than any explanation involving heuristics or behavior, Occam's razor would argue in its favor as the underlying generative mechanism (it is the hypothesis that makes the fewest assumptions). [3]
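A minimal Monte Carlo sketch of the noisy-conversion idea is given below. It assumes Gaussian noise in log-odds space and illustrative "true" probabilities; both are assumptions made for demonstration, not Hilbert's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_judgment(p, sigma=0.8, trials=100_000):
    """Average subjective estimate of p after adding Gaussian noise
    to the objective evidence in log-odds space (an assumed noise model)."""
    logit = np.log(p / (1 - p))
    noisy = logit + rng.normal(0.0, sigma, trials)
    return (1.0 / (1.0 + np.exp(-noisy))).mean()

parts = [0.15, 0.18, 0.25]   # hypothetical true component probabilities
whole = sum(parts)           # 0.58: the true probabilities are exactly additive

judged_parts = sum(mean_judgment(p) for p in parts)
judged_whole = mean_judgment(whole)

print(f"sum of judged parts: {judged_parts:.2f}")  # noticeably above...
print(f"judged whole:        {judged_whole:.2f}")  # ...the judged whole
```

Because the logistic curve is convex below 0.5 and concave above it, symmetric noise inflates small component probabilities and slightly deflates the larger whole, so the judged parts sum to more than the judged whole even though the underlying probabilities are exactly additive.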

Psychological Bulletin: journal

The Psychological Bulletin is a monthly peer-reviewed academic journal that publishes evaluative and integrative research reviews and interpretations of issues in psychology, including both qualitative (narrative) and quantitative (meta-analytic) aspects. The editor-in-chief is Dolores Albarracin.

Occam's razor: philosophical principle of selecting the solution with the fewest assumptions

Occam's razor is the problem-solving principle that states "entities should not be multiplied without necessity." The idea is attributed to the English Franciscan friar William of Ockham, a scholastic philosopher and theologian who used a preference for simplicity to defend the idea of divine miracles. It is sometimes paraphrased as "The simplest solution is most likely the right one," but this matches the razor only when the competing solutions make the same predictions. Occam's razor says that when presented with competing hypotheses that make the same predictions, one should select the solution with the fewest assumptions; it is not meant to be a way of choosing between hypotheses that make different predictions.

Related Research Articles

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective social reality" from their perception of the input. An individual's construction of social reality, not the objective input, may dictate their behaviour in the social world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the tendency for people to under-emphasize situational explanations for an individual's observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior. This effect has been described as "the tendency to believe that what people do reflects who they are".

The availability heuristic is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Consequently, under the availability heuristic, people tend to weigh their judgments heavily toward more recent information, leaving new opinions biased toward the latest news.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, refers to the common tendency for people to perceive events that have already occurred as having been more predictable than they actually were before the events took place. As a result, people often believe, after an event has occurred, that they would have predicted, or perhaps even would have known with a high degree of certainty, what the outcome of the event would have been, before the event occurred. Hindsight bias may cause distortions of our memories of what we knew and/or believed before an event occurred, and is a significant source of overconfidence regarding our ability to predict the outcomes of future events. Examples of hindsight bias can be seen in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and in judicial systems as individuals attribute responsibility on the basis of the supposed predictability of accidents.

Decision theory is the study of an agent's choices. Decision theory can be broken into two branches: normative decision theory, which analyzes the outcomes of decisions or determines the optimal decisions given constraints and assumptions, and descriptive decision theory, which analyzes how agents actually make the decisions they do.

The representativeness heuristic is used when making judgments about the probability of an event under uncertainty. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s. Heuristics are described as "judgmental shortcuts that generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course." Heuristics are useful because they reduce effort and simplify decision-making.

Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. More specifically, it is a tendency to describe one's own behaviour in terms of situational factors while preferring to describe another's behaviour by ascribing fixed dispositions to their personality. This may occur because people's own internal states are more readily observable and available to them than those of others.

Anchoring or focalism is a cognitive bias where an individual depends too heavily on an initial piece of information offered when making decisions.

The worse-than-average effect or below-average effect is the human tendency to underestimate one's achievements and capabilities in relation to others.

Similarity refers to the psychological degree of identity of two mental representations. Research in cognitive psychology has taken a number of approaches to the concept of similarity. Each of them is related to a particular set of assumptions about knowledge representation.

Confidence is a state of being certain either that a hypothesis or prediction is correct or that a chosen course of action is the best or most effective. The word comes from the Latin fidere, "to trust"; having self-confidence is therefore having trust in oneself. Arrogance or hubris, by comparison, is unmerited confidence: believing something or someone is capable or correct when they are not. Overconfidence or presumptuousness is excessive belief in success, without regard for the possibility of failure. Confidence can be a self-fulfilling prophecy, as those without it may fail or not try because they lack it, and those with it may succeed because they have it rather than because of innate ability.

The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgements is reliably greater than the objective accuracy of those judgements, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Attribute substitution, also known as substitution bias, is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Fuzzy-trace theory (FTT) is a theory of cognition originally proposed by Charles Brainerd and Valerie F. Reyna that draws upon dual-trace conceptions to predict and explain cognitive phenomena, particularly in memory and reasoning. The theory has been used in areas such as cognitive psychology, human development, and social psychology to explain, for instance, false memory and its development, probability judgments, medical decision making, risk perception and estimation, and biases and fallacies in decision making.

Heuristics are simple strategies to form judgments and make decisions by focusing on the most relevant aspects of a complex problem. As far as we know, animals have always relied on heuristics to solve adaptive problems, and so have humans.

Insensitivity to sample size is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without regard to the sample size. For example, in one study subjects assigned the same probability to the likelihood of obtaining a mean height of above six feet [183 cm] in samples of 10, 100, and 1,000 men. In other words, extreme sample statistics are more likely in smaller samples, but people often fail to expect this.
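A hedged numerical sketch of this point, assuming heights are normally distributed with a mean of 70 inches and a standard deviation of 3 inches (figures chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
mean_in, sd_in, cutoff_in = 70.0, 3.0, 72.0  # inches; illustrative assumptions
trials = 1_000_000

for n in (10, 100, 1000):
    # The mean of n normal draws is itself normal, with standard error
    # sd / sqrt(n); the spread of sample means shrinks as n grows.
    sample_means = rng.normal(mean_in, sd_in / np.sqrt(n), trials)
    p = (sample_means > cutoff_in).mean()
    print(f"n={n:4d}: P(sample mean > 6 ft) ~ {p:.5f}")
```

Under these assumptions, the probability of a sample mean above six feet is small but non-negligible at n = 10 and essentially zero at n = 1,000, exactly the dependence on sample size that subjects ignored.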

Illusion of validity is a cognitive bias in which a person overestimates his or her ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.

In cognitive psychology and decision science, conservatism or conservatism bias is a bias in human information processing, which refers to the tendency to revise one's belief insufficiently when presented with new evidence. This bias describes human belief revision in which persons over-weigh the prior distribution and under-weigh new sample evidence when compared to Bayesian belief-revision.
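A minimal sketch of this contrast, using the classic two-urn task and an Edwards-style model in which the likelihood ratio is effectively raised to a power c < 1 (the urn composition and the value of c are illustrative assumptions):

```python
def posterior_odds(n_red, n_blue, c=1.0):
    """Posterior odds that the draws came from the 70%-red urn (vs. the
    70%-blue urn), starting from even prior odds. c = 1.0 is the Bayesian
    update; c < 1 models a conservative updater who under-weighs evidence."""
    likelihood_ratio = (0.7 / 0.3) ** n_red * (0.3 / 0.7) ** n_blue
    return likelihood_ratio ** c

def to_probability(odds):
    return odds / (1.0 + odds)

n_red, n_blue = 8, 4
print(f"Bayesian:     {to_probability(posterior_odds(n_red, n_blue, 1.0)):.3f}")
print(f"conservative: {to_probability(posterior_odds(n_red, n_blue, 0.4)):.3f}")
```

With eight red and four blue draws, the Bayesian posterior is about 0.97, while the conservative updater reaches only about 0.80, illustrating the under-revision the bias describes.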

References

  1. Baron, Jonathan (2009). Thinking and Deciding (4th ed.). Cambridge University Press. ISBN 978-0521680431.
  2. Tversky, Amos; Koehler, Derek J. (1994). "Support theory: A nonextensional representation of subjective probability" (PDF). Psychological Review. 101 (4): 547–567. doi:10.1037/0033-295X.101.4.547. Archived from the original on 2016-05-06.
  3. Hilbert, Martin (2012). "Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making" (PDF). Psychological Bulletin. 138 (2): 211–237. doi:10.1037/a0025940. PMID 22122235. Archived from the original on 2016-03-04.