Conservatism (belief revision)

In cognitive psychology and decision science, conservatism or conservatism bias is the tendency to revise one's beliefs insufficiently when presented with new evidence. Compared with a Bayesian belief-reviser, people over-weight the prior distribution (base rate) and under-weight new sample evidence.

According to the theory, "opinion change is very orderly, and usually proportional to numbers calculated from Bayes' theorem – but it is insufficient in amount". [1] In other words, people update their prior beliefs as new evidence becomes available, but they do so more slowly than Bayes' theorem prescribes.

This bias was discussed by Ward Edwards in 1968, [1] who reported on experiments like the following one:

There are two bookbags, one containing 700 red and 300 blue chips, the other containing 300 red and 700 blue. You take one of the bags at random and sample from it, with replacement after each chip. In 12 samples, you get 8 reds and 4 blues. What is the probability that this is the predominantly red bag?

Most subjects gave an answer around .7. The correct answer according to Bayes' theorem is closer to .97. Edwards suggested that people do revise their beliefs in the direction Bayes' theorem prescribes, but too slowly: starting from the prior of .5, they update by too small an amount, a bias observed across several experiments. [1]
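Under the stated sampling model, the posterior follows directly from Bayes' theorem with a binomial likelihood. A minimal sketch (plain Python, illustrative only) reproduces the ≈.97 figure:

```python
from math import comb

# Bookbag problem: one bag is 70% red chips, the other 30% red.
prior = 0.5                      # equal chance of having picked either bag
n, reds, blues = 12, 8, 4        # 12 draws with replacement: 8 red, 4 blue

# Binomial likelihood of the observed sample under each hypothesis
like_red_bag = comb(n, reds) * 0.7**reds * 0.3**blues
like_blue_bag = comb(n, reds) * 0.3**reds * 0.7**blues

posterior = like_red_bag * prior / (like_red_bag * prior + like_blue_bag * (1 - prior))
print(round(posterior, 3))  # 0.967 — far above the ~.7 subjects typically report
```

Note that the binomial coefficient cancels out of the posterior, so only the likelihood ratio (7/3)^4 ≈ 29.6 matters: the evidence shifts the even prior much further than intuition suggests.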

In finance

In finance, evidence has been found that investors under-react to corporate events, consistent with conservatism. This includes announcements of earnings, changes in dividends, and stock splits. [2]

Possible explanations

The traditional explanation attributes the effect to anchoring, as studied by Tversky and Kahneman: the initial "anchor" is the .5 probability assigned when there are two options and no other evidence, and people fail to adjust sufficiently far away from it.

A more recent study suggests that conservatism in belief revision can instead be explained by an information-theoretic generative mechanism that assumes a noisy conversion of objective evidence (observation) into subjective estimates (judgment). [3] On this account, estimates of conditional probabilities are conservative because of noise in the retrieval of information from memory, where noise is defined as the mixing of evidence. For instance, suppose the objective evidence indicates that an event occurs with certainty, i.e., P(A) = 1 and P(¬A) = 0, while the subject's memory yields P(A') = 0.727 and P(¬A') = 0.273. If memory retrieval is further noised by a channel with P(Á | A') = 0.8, P(¬Á | A') = 0.2, P(Á | ¬A') = 0.2 and P(¬Á | ¬A') = 0.8, the resulting estimate (judgment) is smoothed to P(Á) = 0.636 and P(¬Á) = 0.364. The estimated values (0.636, 0.364) are less extreme, i.e., more conservative, than the underlying evidence (1 and 0).

In an incentivized experimental study, the conservatism bias decreased in subjects with greater cognitive ability, though it did not disappear. [4]
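The smoothing step in this example amounts to applying a noisy channel to the remembered distribution. A minimal sketch, using the illustrative numbers above (variable names are ours, not from the cited study):

```python
# Memory's subjective probabilities for the event and its complement
p_memory = [0.727, 0.273]

# Noisy channel: entry [i][j] gives P(judged state i | remembered state j)
channel = [[0.8, 0.2],
           [0.2, 0.8]]

# Judgment = channel applied to the memory distribution (the "mixing of evidence")
judgment = [sum(channel[i][j] * p_memory[j] for j in range(2)) for i in range(2)]
print([round(p, 3) for p in judgment])  # [0.636, 0.364] — pulled toward 0.5
```

Any such symmetric channel with off-diagonal mass mixes the two states, so every pass through the channel moves the reported probabilities closer to the uniform (0.5, 0.5) — which is exactly the conservative flattening described above.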

Related Research Articles

The gambler's fallacy, also known as the Monte Carlo fallacy or the fallacy of the maturity of chances, is the incorrect belief that, if a particular event has occurred more frequently than normal in the past, it is less likely to happen in the future, even though it has been established that the probability of such events does not depend on what has happened in the past. Such events, having the quality of historical independence, are referred to as statistically independent. The fallacy is commonly associated with gambling, where it may be believed, for example, that the next roll of a die is more than usually likely to be a six because there have recently been fewer sixes than expected.


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".


Daniel Kahneman is an Israeli-American psychologist and economist notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. His empirical findings challenge the assumption of human rationality prevailing in modern economic theory.


Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.

The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to mind when evaluating a specific topic, concept, method, or decision. It operates on the notion that if something can be recalled, it must be important, or at least more important than alternatives that are not as readily recalled; as a result, it is inherently biased toward recently acquired information.

The representativeness heuristic is used when making judgments about the probability of an event under uncertainty. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". Heuristics are described as "judgmental shortcuts that generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course." Heuristics are useful because they use effort-reduction and simplification in decision-making.


The clustering illusion is the tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random. The illusion is caused by a human tendency to underpredict the amount of variability likely to appear in a small sample of random or pseudorandom data.

The conjunction fallacy is an inference from an array of particulars, in violation of the laws of probability, that a conjoint set of two or more conclusions is likelier than any single member of that same set. It is a type of formal fallacy.


The base rate fallacy, also called base rate neglect or base rate bias, is a type of fallacy in which people tend to ignore the base rate in favor of the individuating information. Base rate neglect is a specific form of the more general extension neglect.


The simulation heuristic is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as "near misses". The simulation heuristic was first theorized by psychologists Daniel Kahneman and Amos Tversky as a specialized adaptation of the availability heuristic to explain counterfactual thinking and regret. However, it is not the same as the availability heuristic. Specifically the simulation heuristic is defined as "how perceivers tend to substitute normal antecedent events for exceptional ones in psychologically 'undoing' this specific outcome."

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

The subadditivity effect is the tendency to judge the probability of the whole to be less than the probabilities of the parts.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Heuristics are the processes by which humans use mental shortcuts to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.


Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.

Insensitivity to sample size is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without respect to the sample size. For example, in one study subjects assigned the same probability to the likelihood of obtaining a mean height of above six feet [183 cm] in samples of 10, 100, and 1,000 men. In other words, variation is more likely in smaller samples, but people may not expect this.
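This relationship can be checked with a small simulation. The numbers below (mean 70 in, SD 3 in, and the trial count) are illustrative assumptions, not taken from the cited study:

```python
import random
import statistics

random.seed(0)

def frac_samples_with_tall_mean(n, trials=2000):
    """Fraction of simulated samples of size n whose mean height exceeds six feet (72 in)."""
    hits = 0
    for _ in range(trials):
        sample = [random.gauss(70, 3) for _ in range(n)]  # assumed height distribution
        if statistics.mean(sample) > 72:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(n, frac_samples_with_tall_mean(n))
# Extreme sample means become rarer as n grows, so the three
# probabilities are far from equal, contrary to subjects' judgments.
```

Because the standard deviation of the sample mean shrinks as 1/√n, a mean above six feet is plausible for n = 10 but vanishingly unlikely for n = 1,000.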

The end-of-the-day betting effect is a cognitive bias reflected in the tendency for bettors to take gambles with higher risk and higher reward at the end of their betting session to try to make up for losses. William McGlothlin (1956) and Mukhtar Ali (1977) first discovered this effect after observing the shift in betting patterns at horserace tracks. McGlothlin and Ali noticed that people are significantly more likely to prefer longshots to conservative bets on the last race of the day. They found that the movement towards longshots, and away from favorites, is so pronounced that some studies show that conservatively betting on the favorite to show in the last race is a profitable bet despite the track's take.

Intuitive statistics, or folk statistics, refers to the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

References

  1. Edwards, Ward (1982) [1968]. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky (eds.), Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press. ISBN 978-0521284141.
  2. Kadiyala, Padmaja; Rau, P. Raghavendra (2004). "Investor Reaction to Corporate Event Announcements: Under-reaction or Over-reaction?". The Journal of Business. 77 (4): 357–386. doi:10.1086/381273. JSTOR 10.1086/381273. Earlier version at doi:10.2139/ssrn.249979.
  3. Hilbert, Martin (2012). "Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making". Psychological Bulletin. 138 (2): 211–237. doi:10.1037/a0025940. PMID 22122235.
  4. Oechssler, Jörg; Roider, Andreas; Schmitz, Patrick W. (2009). "Cognitive abilities and behavioral biases". Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018.