Simulation heuristic


The simulation heuristic is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as "near misses". The simulation heuristic was first theorized by psychologists Daniel Kahneman and Amos Tversky as a specialized adaptation of the availability heuristic to explain counterfactual thinking and regret.[1] However, it is not the same as the availability heuristic. Specifically, the simulation heuristic describes "how perceivers tend to substitute normal antecedent events for exceptional ones in psychologically 'undoing' this specific outcome."


Kahneman and Tversky also believed that people use this heuristic to understand and predict others' behavior in certain circumstances and to answer questions involving counterfactual propositions. People do this, they argued, by mentally undoing events that have occurred and then running mental simulations of the events with the corresponding input values of the altered model. For example, they proposed a study in which participants read about two men who were delayed by half an hour in a traffic jam on the way to the airport. Both missed the flights on which they were booked, one by half an hour and the other by only five minutes (because his flight had been delayed for 25 minutes). A greater number of participants thought that the second man would be more upset than the first.

Kahneman and Tversky argued that this difference could not be attributed to disappointment, because both men had expected to miss their flights. The true explanation, they believed, was that participants used the simulation heuristic: it was easier to imagine minor alterations that would have enabled the second man to arrive in time for his flight than to devise the same alterations for the first man.
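The counterfactual-closeness reasoning behind this example can be sketched in a few lines of code. The following is a purely illustrative toy model: the idea that regret tracks how small a change would undo the outcome comes from the source, but the numeric encoding is an assumption of this sketch, not Kahneman and Tversky's formalism.

```python
# Toy model of counterfactual "closeness" in the airport scenario.
# Assumption of this sketch: the smaller the margin by which a flight
# was missed, the easier the outcome is to mentally undo, and hence
# the stronger the predicted regret.

def minutes_to_undo(missed_by_minutes: int) -> int:
    """Smallest time saving that would have made the flight."""
    return missed_by_minutes

first_man = minutes_to_undo(30)   # missed his scheduled flight by 30 minutes
second_man = minutes_to_undo(5)   # missed his delayed flight by 5 minutes

# The second man's counterfactual world is "closer", so the simulation
# heuristic predicts he will be judged the more upset of the two.
print(second_man < first_man)  # True
```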

History

This heuristic was introduced by the Israeli psychologists Daniel Kahneman (1934–2024) and Amos Tversky (1937–1996) in a 1979 lecture and published as a book chapter in 1982.[1]

Difference from the availability heuristic

The subjective probability judgments involved in the simulation heuristic differ from those of the availability heuristic in that they are not based on the recall of relevant examples from memory, but rather on the ease with which situations that did not happen can be mentally simulated or imagined.

Application

The theory underlying the simulation heuristic assumes that judgments are biased toward information that is easily imagined or simulated mentally. This produces biases such as overestimating how causally plausible an event is, and enhanced regret when an unfortunate event, such as an accident, is easy to mentally undo. Significant research on the simulation heuristic's application to counterfactual reasoning has been performed by Dale T. Miller and Bryan Taylor.

For example, they found that if an affectively negative experience, such as a fatal car accident, was brought about by an exceptional event, such as someone who usually commutes by train deciding to drive, the simulation heuristic produces a strong emotional reaction of regret. The reaction arises because the exceptional event is easy to mentally undo and replace with a more common one that would not have caused the accident.

Kahneman and Tversky also described a scenario in which two individuals were given lottery tickets and then offered the opportunity to sell those tickets back, either two weeks before the drawing or an hour before the drawing. Participants who were asked about this scenario believed that the man who sold his ticket only an hour before the drawing would experience the greater anticipatory regret if that ticket won.

Kahneman and Tversky explained these findings through norm theory, stating that "people's anticipatory regret, along with reluctance to sell the ticket, should increase with their ease of imagining themselves still owning the winning ticket".[2] The man who recently sold his ticket will therefore experience more regret, because the "counterfactual world" in which he is the winner is perceived as closer for him than for the man who sold his ticket two weeks earlier. The example illustrates the bias in this type of thinking: both men had the same probability of winning had they kept their tickets, and the timing of the sale neither increased nor decreased those chances.
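The probabilistic point can be checked directly. Below is a minimal Monte Carlo sketch (the lottery size and trial count are illustrative assumptions) showing that when a ticket is sold has no effect on its chance of winning:

```python
import random

TICKETS = 1_000     # assumed lottery size, for illustration only
TRIALS = 100_000    # Monte Carlo repetitions

def win_probability(hours_before_draw_sold: int) -> float:
    """Estimate P(ticket #0 wins). The argument is deliberately unused:
    the moment of sale cannot influence a uniform random draw."""
    wins = sum(random.randrange(TICKETS) == 0 for _ in range(TRIALS))
    return wins / TRIALS

print(win_probability(hours_before_draw_sold=336))  # sold two weeks before
print(win_probability(hours_before_draw_sold=1))    # sold one hour before
# Both estimates converge on 1/TICKETS = 0.001; only the perceived
# closeness of the counterfactual world differs, not the odds.
```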

Similar results were found with plane crash survivors, who experienced greater anticipatory regret when they had engaged in the highly mutable action of switching flights at the last minute. It was reasoned that this was due to a person "anticipating counterfactual thoughts that a negative event was evoked, because it tends to make the event more vivid, and so tends to make it more subjectively likely".[3]

Application to clinical anxiety

The simulation heuristic has been shown to be a salient feature of clinical anxiety and its disorders, which are marked by heightened expectations of future negative events. A study by David Raune and Andrew MacLeod attempted to tie the cognitive mechanisms that underlie this type of judgment to the simulation heuristic.[4]

Their findings showed that anxious patients' simulation heuristic scores were correlated with their subjective probability judgments: the more reasons patients could think of why negative events would happen, relative to the number of reasons why they would not, the higher their subjective probability judgment that the events would happen to them. It was further found that anxious patients displayed increased access to such simulations compared with control patients.
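As a rough illustration of such a score, the fraction of generated reasons favoring the negative event could be computed as follows. This is a hedged sketch: the function and scoring rule are assumptions of this example, not Raune and MacLeod's actual instrument.

```python
def simulation_score(reasons_for: int, reasons_against: int) -> float:
    """Toy ratio score: share of generated reasons that favor the
    negative event occurring. Illustrative only."""
    total = reasons_for + reasons_against
    if total == 0:
        return 0.5  # no reasons either way: treat as maximal uncertainty
    return reasons_for / total

# A patient generating 6 reasons why a feared event will happen and 2 why
# it will not scores higher (0.75) than one generating 2 vs. 6 (0.25),
# mirroring the reported correlation with subjective probability.
print(simulation_score(6, 2), simulation_score(2, 6))
```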

They also found support for the hypothesis that the easier it was for anxious patients to form a visual image of an event, the greater their subjective probability that the event would happen to them. From this work they proposed that the main clinical implication of the simulation heuristic results is that, to lower elevated subjective probability in clinical anxiety, patients should be encouraged to think of more reasons why negative events will not occur than why they will occur.

Interaction with message framing

A study by Philip Broemer tested the hypothesis that the impact of differently framed messages on attitudes toward performing health behaviors is moderated by the subjective ease with which one can imagine a symptom.[5]

By drawing on the simulation heuristic, he argued that the vividness of information is reflected in the subjective ease with which people can imagine having symptoms of an illness.

His results showed that the impact of message framing on attitudes was moderated by ease of imagination, clearly supporting the congruency hypothesis across different kinds of health behavior: negatively framed messages led to more positive attitudes when recipients could easily imagine the relevant symptoms, so ease of imagination facilitates persuasion when messages emphasize potential health risks. Positive framing, by contrast, led to more positive attitudes when the symptoms were difficult to imagine.

Therefore, a message with a reassuring theme is more congruent with a recipient's state of mind when he or she cannot easily imagine the symptoms, whereas a message with an aversive theme is more congruent with a recipient's state of mind when he or she can easily imagine having the symptoms.
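The congruency pattern amounts to a crossover interaction, which can be sketched as follows. The attitude values are invented purely to illustrate the direction of the interaction; they are not Broemer's data.

```python
# Toy 2x2 sketch of the congruency hypothesis. Higher numbers stand for
# more positive attitudes; the values are illustrative assumptions.
attitude = {
    ("negative_frame", "easy_imagery"): 0.8,  # risk message + vivid symptoms
    ("negative_frame", "hard_imagery"): 0.4,
    ("positive_frame", "easy_imagery"): 0.4,
    ("positive_frame", "hard_imagery"): 0.8,  # reassuring message + pale symptoms
}

def more_persuasive_frame(imagery: str) -> str:
    """Frame predicted to yield the more positive attitude."""
    return max(("negative_frame", "positive_frame"),
               key=lambda frame: attitude[(frame, imagery)])

print(more_persuasive_frame("easy_imagery"))  # negative_frame
print(more_persuasive_frame("hard_imagery"))  # positive_frame
```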

Footnotes

  1. Kahneman, Daniel; Tversky, Amos (1982). "The simulation heuristic". In Daniel Kahneman; Paul Slovic; Amos Tversky (eds.), Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. ISBN 9780521284141.
  2. Gilovich, p. 372.
  3. Gilovich, p. 374.
  4. Raune, David; MacLeod, Andrew; Holmes, Emily A. (2005). "The simulation heuristic and visual imagery in pessimism for future negative events in anxiety". Clinical Psychology & Psychotherapy. 12 (4): 313–25. doi:10.1002/cpp.455.
  5. Broemer, Philip (2004). "Ease of imagination moderates reactions to differently framed health messages". European Journal of Social Psychology. 34 (2): 103–119. doi:10.1002/ejsp.185.

