Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute.[1] This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.[2]
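The process can be illustrated with a toy model (a sketch only; the function names, "effort" costs, and threshold below are invented for illustration and are not part of Kahneman and Frederick's formal account):

```python
# Toy model of attribute substitution. All names, costs, and the
# threshold are illustrative assumptions, not the theory itself.

def intuitive_judgment(target_cost, compute_target, compute_heuristic,
                       effort_threshold=1.0):
    """Answer a question about a target attribute.

    If computing the target attribute is too effortful, the more
    accessible heuristic attribute is evaluated instead -- without
    the judge noticing that the question answered has changed.
    """
    if target_cost <= effort_threshold:
        return compute_target()      # deliberate, reflective answer
    return compute_heuristic()       # substituted, intuitive answer

# "How happy are you with your life overall?" (hard) may be answered
# via "What is my mood right now?" (easy).
print(intuitive_judgment(
    target_cost=5.0,
    compute_target=lambda: "weighted evaluation of all life domains",
    compute_heuristic=lambda: "current mood: good",
))  # -> current mood: good
```

The point of the model is that the caller never learns which branch produced the answer, matching the claim that the substitution is pre-conscious.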
The theory of attribute substitution unifies a number of separate explanations of reasoning errors in terms of cognitive heuristics.[1] In turn, the theory is subsumed by an effort-reduction framework proposed by Anuj K. Shah and Daniel M. Oppenheimer, which states that people use a variety of techniques to reduce the effort of making decisions.[3]
In a 1974 paper, psychologists Amos Tversky and Daniel Kahneman argued that a broad family of biases (systematic errors in judgment and decision) were explainable in terms of a few heuristics (information-processing shortcuts), including availability and representativeness.
In 1975, psychologist Stanley Smith Stevens proposed that the strength of a stimulus (e.g., the brightness of a light, the severity of a crime) is encoded neurally in a way that is independent of modality.[citation needed] Kahneman and Shane Frederick later built on this idea, arguing that the target attribute and heuristic attribute can be unrelated.[2]
In a 2002 revision of the theory, Kahneman and Frederick proposed attribute substitution as a process underlying these and other effects.[2]
[P]eople are not accustomed to thinking hard, and are often content to trust a plausible judgment that comes to mind.
Daniel Kahneman, American Economic Review 93(5), December 2003, p. 1450
Kahneman and Frederick propose three conditions for attribute substitution:[2]
1. The target attribute is relatively inaccessible. Substitution is not expected for factual questions that can be retrieved directly from memory ("What is your birthday?") or that concern current experience ("Do you feel thirsty?").
2. An associated attribute is highly accessible. This can be because it is evaluated automatically in normal perception, or because it has been primed; for example, someone who has just been thinking about their love life, when asked how happy they are, may substitute how happy they are with their love life.
3. The substitution is not detected and corrected by the reflective system. For example, when asked "A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?", many people answer $0.10, parsing $1.10 into a large amount and a small amount rather than doing the arithmetic (worked below).
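As a check on the bat-and-ball example, the arithmetic takes one line. Writing $b$ for the price of the ball:

$$b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05$$

So the ball costs $0.05, not the intuitive $0.10 (a $0.10 ball would make the total $1.20). Subjects who answer $0.10 have substituted an easy parsing of the total for the calculation.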
Attribute substitution explains the persistence of some illusions. For example, when subjects judge the size of two figures in a perspective picture, their apparent sizes can be distorted by the 3D context, making a convincing optical illusion. The theory states that the three-dimensional size of the figure (which is accessible because it is automatically computed by the visual system) is substituted for its two-dimensional size on the page. Experienced painters and photographers are less susceptible to this illusion, because the two-dimensional size is more accessible to their perception.[4]
Kahneman gives an example where some Americans were offered insurance against their own death in a terrorist attack while on a trip to Europe, while another group were offered insurance that would cover death of any kind on the trip. The former group were willing to pay more, even though "death of any kind" includes "death in a terrorist attack". Kahneman suggests that the attribute of fear is being substituted for a calculation of the total risks of travel.[5] Fear of terrorism for these subjects was stronger than a general fear of dying on a foreign trip.
Stereotypes can be a source of heuristic attributes.[2] In a face-to-face conversation with a stranger, judging their intelligence is more computationally complex than judging the colour of their skin. So if the subject has a stereotype about the relative intelligence of whites, blacks, and Asians, that racial attribute might substitute for the more intangible attribute of intelligence. The pre-conscious, intuitive nature of attribute substitution explains how subjects can be influenced by the stereotype while thinking that they have made an honest, unbiased evaluation of the other person's intelligence.
Sunstein argued that attribute substitution is pervasive when people reason about moral, political, or legal matters.[6] Given a difficult, novel problem in these areas, people search for a more familiar, related problem (a "prototypical case") and apply its solution as the solution to the harder problem. According to Sunstein, the opinions of trusted political or religious authorities can serve as heuristic attributes when people are asked their own opinions on a matter. Another source of heuristic attributes is emotion: people's moral opinions on sensitive subjects like sexuality and human cloning may be driven by reactions such as disgust, rather than by reasoned principles.[7] Sunstein has been challenged for not providing enough evidence that attribute substitution, rather than other processes, is at work in these cases.[3]
Monin reports a series of experiments in which subjects, looking at photographs of faces, have to judge whether they have seen those faces before. It is repeatedly found that attractive faces are more likely to be mistakenly labeled as familiar.[8] Monin interprets this result in terms of attribute substitution. The heuristic attribute in this case is a "warm glow": a positive feeling towards someone that might be due either to their being familiar or to their being attractive. This interpretation has been criticised, because not all the variance in the familiarity data is accounted for by attractiveness.[3]
The most direct evidence, according to Kahneman,[4] is a 1973 experiment that used a psychological profile of Tom W., a fictional graduate student.[9] One group of subjects had to rate Tom's similarity to a typical student in each of nine academic areas (law, engineering, library science, etc.). Another group had to rate how likely it is that Tom specialised in each area. If these ratings of likelihood are governed by probability, then they should resemble the base rates, i.e., the proportion of students in each of the nine areas (which had been separately estimated by a third group). A probabilistic judgment would say that Tom is more likely to be a humanities student than a library science student, because many more students study humanities, and the additional information in the profile is vague and unreliable. Instead, the ratings of likelihood matched the ratings of similarity almost perfectly, both in this study and a similar one where subjects judged the likelihood of a fictional woman taking different careers. This suggests that rather than estimating probability using base rates, subjects had substituted the more accessible attribute of similarity.
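The probabilistic norm that the likelihood group violated is Bayes' rule: P(field | profile) is proportional to P(profile | field) × P(field). A minimal sketch of that calculation (the numbers below are invented for illustration and are not the 1973 study's data):

```python
# Bayes' rule: posterior(field) is proportional to
# likelihood(profile | field) * base_rate(field). A vague, unreliable
# profile gives nearly flat likelihoods, so the posterior should stay
# close to the base rates. All numbers are illustrative assumptions.

base_rates = {"humanities": 0.20, "library science": 0.03}     # P(field)
likelihoods = {"humanities": 0.010, "library science": 0.012}  # P(profile | field)

unnormalised = {f: likelihoods[f] * base_rates[f] for f in base_rates}
total = sum(unnormalised.values())
posterior = {f: p / total for f, p in unnormalised.items()}

for field, p in posterior.items():
    print(f"{field}: P(field | profile) = {p:.2f}")
# humanities: 0.85, library science: 0.15 -- the base rate dominates
# even though the profile "fits" library science slightly better,
# which is exactly the pattern the subjects' ratings failed to show.
```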
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.
A heuristic, or heuristic technique, is any approach to problem solving that employs a practical method that is not fully optimized, perfected, or rationalized, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.
Daniel Kahneman is an Israeli-American author, psychologist and economist, notable for his work on hedonic psychology and the psychology of judgment and decision-making. He is also known for his work in behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. His empirical findings challenge the assumption of human rationality prevailing in modern economic theory.
Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.
The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method, or decision. It operates on the notion that if something can be recalled, it must be important, or at least more important than alternatives that are not as readily recalled; as a result, it is inherently biased toward recently acquired information.
The representativeness heuristic is used when making judgments about the probability of an event by how representative it is, in character and essence, of a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s, defined as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because their appearance and behavior are more representative of the stereotype of a poet.
The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.
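The law being violated can be written in one line: for any events $A$ and $B$,

$$P(A \land B) = P(A)\,P(B \mid A) \le P(A), \qquad \text{since } P(B \mid A) \le 1.$$

A conjunction can therefore never be more probable than either of its conjuncts, however representative or vivid the conjoint description is; the best-known demonstration is Tversky and Kahneman's "Linda" problem, in which many subjects rate "bank teller and active feminist" as more probable than "bank teller" alone.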
Thomas Dashiff Gilovich is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness to people's most common regrets, to perceptions of people and social groups. Gilovich is a fellow of the Committee for Skeptical Inquiry.
The simulation heuristic is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as "near misses". The simulation heuristic was first theorized by psychologists Daniel Kahneman and Amos Tversky as a specialized adaptation of the availability heuristic to explain counterfactual thinking and regret. However, it is not the same as the availability heuristic. Specifically, the simulation heuristic is defined as "how perceivers tend to substitute normal antecedent events for exceptional ones in psychologically 'undoing' this specific outcome."
The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
In psychology, a heuristic is an easy-to-compute procedure or rule of thumb that people use when forming beliefs, judgments or decisions. The familiarity heuristic was developed based on the discovery of the availability heuristic by psychologists Amos Tversky and Daniel Kahneman; it occurs when the familiar is favored over novel places, people, or things. The familiarity heuristic can be applied to various situations that individuals experience in day-to-day life. When these situations appear similar to previous situations, especially if the individuals are experiencing a high cognitive load, they may regress to the state of mind in which they have felt or behaved before. This heuristic is useful in most situations and can be applied to many fields of knowledge; however, it has both benefits and drawbacks.
In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.
Shane Frederick is a tenured professor at the Yale School of Management. He earlier worked at the Massachusetts Institute of Technology. He is the creator of the cognitive reflection test (CRT), which has been found to be "predictive of the types of choices that feature prominently in tests of decision-making theories, like expected utility theory and prospect theory". People who score high on the CRT are less vulnerable to various biases and show more patience in intertemporal choice tasks.
Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.
Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.
Insensitivity to sample size is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without respect to the sample size. For example, in one study, subjects assigned the same probability to the likelihood of obtaining a mean height of above six feet [183 cm] in samples of 10, 100, and 1,000 men. In other words, variation is more likely in smaller samples, but people may not expect this.
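The statistical point is easy to verify by simulation. A sketch, assuming (purely for illustration) a population mean height of 70 inches with a standard deviation of 3 inches:

```python
import numpy as np

# Monte Carlo estimate of the probability that a sample's MEAN height
# exceeds 6 ft (72 in) for different sample sizes. The population
# parameters (mean 70 in, sd 3 in) are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma, threshold, trials = 70.0, 3.0, 72.0, 10_000

for n in (10, 100, 1000):
    sample_means = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
    print(f"n={n:4d}: P(mean > 72 in) ~ {(sample_means > threshold).mean():.4f}")
# The probability collapses as n grows (about 0.02 at n=10, near 0 at
# n=100 and beyond), because the standard error of the mean shrinks
# like sigma/sqrt(n) -- precisely the dependence on sample size that
# the subjects ignored.
```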
Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.
Intuition in the context of decision-making is defined as a "non-sequential information-processing mode." It is distinct from insight and can be contrasted with the deliberative style of decision-making. Intuition can influence judgment through either emotion or cognition, and there has been some suggestion that it may be a means of bridging the two. Individuals use intuition and more deliberative decision-making styles interchangeably, but there has been some evidence that people tend to gravitate to one or the other style more naturally. People in a good mood gravitate toward intuitive styles, while people in a bad mood tend to become more deliberative. The specific ways in which intuition actually influences decisions remain poorly understood.
Social heuristics are simple decision-making strategies that guide people's behavior and decisions in the social environment when time, information, or cognitive resources are scarce. Social environments tend to be characterised by complexity and uncertainty, and in order to simplify the decision-making process, people may use heuristics that involve ignoring some information or relying on simple rules of thumb.