Attribute substitution

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. [1] This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean. [2]
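
The regression claim can be made concrete with a small simulation (an illustrative sketch, not taken from the cited sources): when an impression is only imperfectly correlated with the outcome it is used to predict, the statistically optimal prediction is less extreme than the impression, so substituting the impression wholesale yields non-regressive judgments. A minimal Python sketch, assuming an impression-outcome correlation of 0.5:

    import random

    random.seed(1)
    r = 0.5       # assumed impression/outcome correlation (illustrative)
    n = 10_000

    errors_matching, errors_regressive = 0.0, 0.0
    for _ in range(n):
        impression = random.gauss(0, 1)
        # outcome shares variance with the impression; the rest is noise
        outcome = r * impression + (1 - r**2) ** 0.5 * random.gauss(0, 1)
        errors_matching += (impression - outcome) ** 2        # substitute the impression directly
        errors_regressive += (r * impression - outcome) ** 2  # shrink toward the mean

    print(errors_matching / n)    # ~1.00: non-regressive predictions overshoot
    print(errors_regressive / n)  # ~0.75: regressive predictions do better

The regressive prediction has the lower mean squared error; judgments that simply match the extremity of the impression forgo that advantage.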

The theory of attribute substitution unifies a number of separate explanations of reasoning errors in terms of cognitive heuristics. [1] In turn, the theory is subsumed by an effort-reduction framework proposed by Anuj K. Shah and Daniel M. Oppenheimer, which states that people use a variety of techniques to reduce the effort of making decisions. [3]

History

In a 1974 paper, psychologists Amos Tversky and Daniel Kahneman argued that a broad family of biases (systematic errors in judgment and decision-making) could be explained in terms of a few heuristics (information-processing shortcuts), including availability and representativeness.

In 1975, psychologist Stanley Smith Stevens proposed that the strength of a stimulus (e.g., the brightness of a light, the severity of a crime) is encoded neurally in a way that is independent of modality.[citation needed] Kahneman and Frederick built on this idea, arguing that the target attribute and the heuristic attribute could be unrelated. [2]

In a 2002 revision of the theory, Kahneman and Shane Frederick proposed attribute substitution as a process underlying these biases and other effects. [2]

Conditions

[P]eople are not accustomed to thinking hard, and are often content to trust a plausible judgment that comes to mind.

Daniel Kahneman, American Economic Review, 93 (5), December 2003, p. 1450

Kahneman and Frederick propose three conditions for attribute substitution: [2]

  1. The target attribute is relatively inaccessible. Substitution is not expected to take place in answering factual questions that can be retrieved directly from memory ("What is your birthday?") or about current experience ("Do you feel thirsty now?").
  2. An associated attribute is highly accessible. This might be because it is evaluated automatically in normal perception or because it has been primed. For example, someone who has been thinking about their love life and is then asked about their happiness might substitute how happy they are with their love life, rather than answering the question as asked.
  3. The substitution is not detected and corrected by the reflective system. For example, when asked "A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?" many subjects incorrectly answer $0.10; the correct answer is $0.05 (see the worked solution after this list). [4] An explanation in terms of attribute substitution is that, rather than work out the sum, subjects parse $1.10 into a large amount and a small amount, which is easy to do. Whether they feel that this is the right answer depends on whether they check the calculation with their reflective system.
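
The arithmetic behind the correct answer, worked out explicitly:

    Let the ball cost x. The bat then costs x + 1.00, so
        x + (x + 1.00) = 1.10
        2x = 0.10
        x = 0.05
    The ball costs $0.05 and the bat $1.05. The intuitive answer of $0.10
    would make the bat $1.10 and the total $1.20.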

Examples

This illusion works because 3D (perspective) size is substituted for 2D size (all pairs are equal in size).

Optical illusions

Attribute substitution explains the persistence of some illusions. For example, when subjects judge the size of two figures in a perspective picture, their apparent sizes can be distorted by the 3D context, making a convincing optical illusion. The theory states that the three-dimensional size of the figure (which is accessible because it is automatically computed by the visual system) is substituted for its two-dimensional size on the page. Experienced painters and photographers are less susceptible to this illusion, because the two-dimensional size is more accessible to their perception. [4]

Valuing insurance

Kahneman gives an example in which some Americans were offered insurance against their own death in a terrorist attack while on a trip to Europe, while another group were offered insurance that would cover death of any kind on the trip. The former group were willing to pay more, even though "death of any kind" includes "death in a terrorist attack". Kahneman suggests that the attribute of fear is being substituted for a calculation of the total risks of travel. [5] Fear of terrorism for these subjects was stronger than a general fear of dying on a foreign trip.
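
The inconsistency can be stated in terms of elementary probability (a sketch with invented numbers, not figures from the study): death of any kind on the trip is a superset of death in a terrorist attack, so the broader policy insures against strictly more events and its fair value can only be higher.

    # Illustrative, invented numbers: probabilities of dying on the trip.
    p_terrorism = 1e-7                    # death in a terrorist attack (hypothetical)
    p_other     = 1e-5                    # death from any other cause (hypothetical)
    p_any       = p_terrorism + p_other   # "death of any kind" is a superset

    payout = 100_000                      # hypothetical policy payout in dollars

    # Expected value of each policy: the broader policy always dominates.
    print(payout * p_terrorism)           # fair value of terrorism-only cover
    print(payout * p_any)                 # fair value of any-cause cover (strictly larger)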

Stereotypes

Stereotypes can be a source of heuristic attributes. [2] In a face-to-face conversation with a stranger, judging their intelligence is more computationally complex than judging the colour of their skin. So if the subject has a stereotype about the relative intelligence of whites, blacks, and Asians, that racial attribute might substitute for the more intangible attribute of intelligence. The pre-conscious, intuitive nature of attribute substitution explains how subjects can be influenced by the stereotype while thinking that they have made an honest, unbiased evaluation of the other person's intelligence.

Morality and fairness

Sunstein argued that attribute substitution is pervasive when people reason about moral, political, or legal matters. [6] Given a difficult, novel problem in these areas, people search for a more familiar, related problem (a "prototypical case") and apply its solution to the harder problem. According to Sunstein, the opinions of trusted political or religious authorities can serve as heuristic attributes when people are asked their own opinions on a matter. Another source of heuristic attributes is emotion: people's moral opinions on sensitive subjects like sexuality and human cloning may be driven by reactions such as disgust, rather than by reasoned principles. [7] Critics argued that Sunstein offered too little evidence for this account. [3]

The beautiful-is-familiar effect

Monin reports a series of experiments in which subjects, looking at photographs of faces, have to judge whether they have seen those faces before. It is repeatedly found that attractive faces are more likely to be mistakenly labeled as familiar. [8] Monin interprets this result in terms of attribute substitution: the heuristic attribute in this case is a "warm glow", a positive feeling towards someone that might be due either to their being familiar or to their being attractive. This interpretation has been criticised on the grounds that not all the variance in the familiarity data is accounted for by attractiveness. [3]
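
A toy simulation can make the proposed mechanism concrete (an illustrative sketch of the substitution account, with invented weights and thresholds; it is not Monin's model or data): if the familiarity judgment is read off a single "warm glow" signal that pools genuine familiarity with attractiveness, then attractive but never-seen faces cross the "seen before" threshold more often.

    import random

    random.seed(0)

    def judged_familiar(seen_before, attractiveness):
        # "Warm glow": one pooled signal substitutes for true familiarity.
        # The weight (0.6), noise (0.3) and threshold (0.7) are invented.
        glow = (1.0 if seen_before else 0.0) + 0.6 * attractiveness \
               + random.gauss(0, 0.3)
        return glow > 0.7

    # Novel faces only: false alarms by attractiveness (0 = plain, 1 = attractive).
    for attract in (0.0, 1.0):
        false_alarms = sum(judged_familiar(False, attract) for _ in range(10_000))
        print(attract, false_alarms / 10_000)   # attractive faces "remembered" more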

Evidence

The most direct evidence, according to Kahneman, [4] is a 1973 experiment that used a psychological profile of Tom W., a fictional graduate student. [9] One group of subjects had to rate Tom's similarity to a typical student in each of nine academic areas (Law, Engineering, Library Science, etc.). Another group had to rate how likely it is that Tom specialised in each area. If these ratings of likelihood are governed by probability, they should resemble the base rates, i.e., the proportion of students in each of the nine areas (which had been separately estimated by a third group). A probabilistic judgment would say that Tom is more likely to be a Humanities student than a Library Science student, because many more students study Humanities, and the additional information in the profile is vague and unreliable. Instead, the ratings of likelihood matched the ratings of similarity almost perfectly, both in this study and in a similar one in which subjects judged the likelihood of a fictional woman taking different careers. This suggests that, rather than estimating probability using base rates, subjects had substituted the more accessible attribute of similarity.
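
The normative standard that the subjects' judgments violated is Bayes' rule, which weights the evidence of similarity by each field's base rate. A minimal sketch with invented numbers (not the study's estimates), restricted to two fields for simplicity:

    # Invented illustrative numbers, not the 1973 study's estimates.
    base_rate  = {"humanities": 0.20, "library science": 0.01}
    similarity = {"humanities": 0.30, "library science": 0.90}  # fit to Tom's profile

    # Treat similarity as P(profile | field); Bayes' rule weights it by the base rate.
    posterior = {f: base_rate[f] * similarity[f] for f in base_rate}
    total = sum(posterior.values())
    posterior = {f: p / total for f, p in posterior.items()}

    print(posterior)  # humanities ~0.87: the base rate dominates the vivid profile

Judging by similarity alone amounts to dropping the base-rate factor, which is exactly the substitution the experiment demonstrates.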

See also

  Cognitive bias
  Heuristic (psychology)
  Daniel Kahneman
  Amos Tversky
  Availability heuristic
  Representativeness heuristic
  Conjunction fallacy
  Thomas Gilovich
  Simulation heuristic
  Overconfidence effect
  Familiarity heuristic
  Cognitive miser
  Shane Frederick
  Thinking, Fast and Slow
  Insensitivity to sample size
  Illusion of validity
  Intuition (psychology)
  Social heuristics

References

  1. Newell, Benjamin R.; Lagnado, David A.; Shanks, David R. (2007). Straight Choices: The Psychology of Decision Making. Routledge. pp. 71–74. ISBN 978-1-84169-588-4.
  2. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. OCLC 47364085.
  3. Shah, Anuj K.; Oppenheimer, Daniel M. (March 2008). "Heuristics Made Easy: An Effort-Reduction Framework". Psychological Bulletin. 134 (2): 207–222. doi:10.1037/0033-2909.134.2.207. PMID 18298269.
  4. Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics". American Economic Review. 93 (5): 1449–1475. doi:10.1257/000282803322655392.
  5. Kahneman, Daniel (2007). "Short Course in Thinking About Thinking". Edge.org. Edge Foundation. Retrieved 2009-06-03.
  6. Sunstein, Cass R. (2005). "Moral Heuristics". Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/S0140525X05000099. PMID 16209802.
  7. Sunstein, Cass R. (2009). "Some Effects of Moral Indignation on Law" (PDF). Vermont Law Review. 33 (3): 405–434. Archived from the original on 2011-06-10. Retrieved 2009-09-15.
  8. Monin, Benoît; Oppenheimer, Daniel M. (2005). "Correlated Averages vs. Averaged Correlations: Demonstrating the Warm Glow Heuristic Beyond Aggregation" (PDF). Social Cognition. 23 (3): 257–278. doi:10.1521/soco.2005.23.3.257. Archived from the original (PDF) on 2016-05-27. Retrieved 2009-06-01.
  9. Kahneman, Daniel; Tversky, Amos (July 1973). "On the Psychology of Prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747.
