Debiasing

Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Judgment and decision making are biased when they systematically deviate from the prescriptions of objective standards, such as facts, logic, and rational behavior, or from prescriptive norms. Biased judgment and decision making occur in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and to sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices, [1] and some parents continue to believe that vaccinations cause autism despite knowing that this link is based on falsified evidence. [2] At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower childhood delinquency rates, and superior planning and problem-solving abilities. [3]

Debiasing can occur within the decision maker. For example, a person may learn or adopt better strategies by which to make judgments and decisions. [2] [4] Debiasing can also occur as a result of changes in external factors, such as changing the incentives relevant to a decision or the manner in which the decision is made. [5]

There are three general approaches to debiasing judgment and decision making, and to reducing the costly errors with which biased judgment and decision making is associated: changing incentives, nudges, and training. Each approach has strengths and weaknesses. For more details, see Morewedge and colleagues (2015). [2]

General approaches

Incentives

Changing incentives can be an effective means to debias judgment and decision making. This approach is generally derived from economic theories suggesting that people act in their self-interest by seeking to maximize their utility over their lifetime. Many decision making biases may occur simply because they are more costly to eliminate than to ignore. [6] Making people more accountable for their decisions (increasing incentives), for example, can increase the extent to which they invest cognitive resources in making decisions, leading to less biased decision making when people generally have an idea of how a decision should be made. [7] However, "bias" might not be the appropriate term for these types of decision making errors. These "strategy-based" errors occur simply because the necessary effort outweighs the benefit. [6] If a person makes a suboptimal choice based on an actual bias, then incentives may exacerbate the issue. [7] An incentive in this case may simply cause the person to perform the suboptimal behavior more enthusiastically. [6]
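
These two cases can be made concrete. The following sketch, using purely hypothetical numbers, models the "strategy-based" case Arkes describes: a decision maker rationally keeps a quick, error-prone strategy when the stakes are low, and a larger incentive can make careful deliberation worth its effort cost.

```python
# Minimal sketch (hypothetical numbers) of a strategy-based error:
# deliberation is chosen only when its expected benefit exceeds its cost,
# so raising the stakes can rationally switch a person to the careful strategy.

def chooses_careful_strategy(error_rate_fast: float,
                             error_rate_careful: float,
                             stake: float,
                             effort_cost: float) -> bool:
    """Return True if careful deliberation pays for itself at this stake."""
    expected_loss_fast = error_rate_fast * stake
    expected_loss_careful = error_rate_careful * stake + effort_cost
    return expected_loss_careful < expected_loss_fast

# With a small stake, the quick heuristic is (rationally) kept...
print(chooses_careful_strategy(0.20, 0.05, stake=10, effort_cost=5))   # False
# ...but a larger incentive makes careful deliberation worthwhile.
print(chooses_careful_strategy(0.20, 0.05, stake=100, effort_cost=5))  # True
```

If the error instead stems from a genuinely biased strategy, the same incentive only increases investment in that strategy, which is why incentives can exacerbate true biases.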

Incentives can be calibrated to change preferences toward more beneficial behavior. Price cuts on healthy foods increase their consumption in school cafeterias, [8] and soda taxes appear to reduce soda consumption by the public. People are often willing to use incentives to change their behavior by means of a commitment device. Shoppers, for example, were willing to put a cash-back rebate on healthy food items at risk, forfeiting it if they did not increase the percentage of healthy foods in their shopping baskets. [9]

Incentives can backfire when they are miscalibrated or are weaker than social norms that were preventing undesirable behavior. Large incentives can also lead people to choke under pressure. [10]

Nudges

Nudges, changes in the presentation of information or in the manner by which judgments and decisions are elicited, are another means of debiasing. People may choose healthier foods if they are better able to understand their nutritional content, [11] and may choose lower-calorie meals if they are explicitly asked whether they would like to downsize their side orders. [12] Other examples of nudges include changing which option people are assigned to by default if they do not choose an alternative, placing a limit on the serving size of soda, and automatically enrolling employees in a retirement savings program.

Training

Training can effectively debias decision makers over the long term. [2] [13] [14] Training has, to date, received less attention from academics and policy makers than incentives and nudges because initial debiasing training efforts met with mixed success (see Fischhoff, 1982, in Kahneman et al. [15] ). Decision makers can be effectively debiased through training in specific domains. For example, experts can be trained to make very accurate decisions when decision making entails recognizing patterns and applying appropriate responses, as in firefighting, chess, and weather forecasting. Evidence of more general debiasing, across domains and different kinds of problems, was not found until recently. The lack of domain-general debiasing has been attributed to experts failing to recognize the underlying "deep structure" of problems presented in different formats and domains. Weather forecasters, for example, can predict rain with high accuracy, but show the same overconfidence as other people in their answers to basic trivia questions. An exception is graduate training in scientific fields that rely heavily on statistics, such as psychology. [16]

Experiments by Morewedge and colleagues (2015) found that interactive computer games and instructional videos can produce long-term debiasing at a general level. In a series of experiments, training with interactive computer games that provided players with personalized feedback, mitigating strategies, and practice reduced six cognitive biases by more than 30% immediately and by more than 20% as long as three months later. The biases reduced were anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness. [2] [13]

Training in reference class forecasting may also improve outcomes. Reference class forecasting is a method for systematically debiasing estimates and decisions, based on what Daniel Kahneman calls the outside view. As pointed out by Kahneman in Thinking, Fast and Slow (p. 252), one of the reasons reference class forecasting is effective for debiasing is that, in contrast to conventional forecasting methods, it takes into account the so-called "unknown unknowns." According to Kahneman, reference class forecasting is effective for debiasing and "has come a long way" in practical implementation since he originally proposed the idea with Amos Tversky (p. 251).
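
As a concrete illustration, the sketch below (the data, function name, and percentile choice are illustrative assumptions, not part of Kahneman's formulation) rescales an "inside view" estimate by the distribution of actual-to-estimated outcome ratios observed in a reference class of similar past projects:

```python
# A minimal sketch of reference class forecasting (the "outside view"),
# with hypothetical data: rather than trusting a project's own "inside"
# estimate, rescale it by the historical actual/estimated outcome ratios
# of a reference class of similar past projects.
import statistics

def reference_class_forecast(inside_estimate: float,
                             past_ratios: list[float],
                             percentile: float = 0.8) -> float:
    """Uplift the inside estimate by the chosen percentile of the
    historical ratio distribution, building in a margin for the
    "unknown unknowns" the inside view tends to omit."""
    ratios = sorted(past_ratios)
    idx = min(int(percentile * len(ratios)), len(ratios) - 1)
    return inside_estimate * ratios[idx]

# Hypothetical reference class: cost overrun ratios of comparable projects.
overruns = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.6, 1.8, 2.0, 2.5]
print(reference_class_forecast(1_000_000, overruns))  # 2,000,000 at the 80th percentile
print(statistics.median(overruns))                    # 1.35, the typical overrun
```

Choosing a high percentile of the historical distribution is what lets the forecast absorb overruns that the inside view, by construction, cannot anticipate.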

Related Research Articles

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

A heuristic, or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.
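
Satisficing, the decision rule this implies, can be stated in a few lines. A minimal sketch with invented option values: take options in the order they are encountered and stop at the first one that meets an aspiration level, instead of evaluating every option to find the optimum.

```python
# A minimal sketch of satisficing under bounded rationality (illustrative only).
from typing import Iterable, Optional

def satisfice(options: Iterable[float], aspiration: float) -> Optional[float]:
    """Return the first option whose value meets the aspiration level."""
    for value in options:
        if value >= aspiration:
            return value   # "good enough" -- stop searching here
    return None            # no option met the aspiration level

offers = [52, 48, 61, 70, 95]            # hypothetical values, in encounter order
print(satisfice(offers, aspiration=60))  # 61: chosen even though 95 exists
print(max(offers))                       # 95: what an unbounded optimizer would pick
```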

Behavioral economics is the study of the psychological, cognitive, emotional, cultural and social factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by classical economic theory.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, is the common tendency for people to perceive past events as having been more predictable than they were.

The representativeness heuristic is used when making judgments about the probability of an event by how closely it resembles, in character and essence, a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because the person's appearance and behavior are more representative of the stereotype of a poet than of an accountant.
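
A worked Bayes calculation, with invented numbers, shows why this judgment can go astray: the heuristic neglects base rates, and because accountants vastly outnumber poets, "accountant" can remain the more probable guess even for a very poet-like description.

```python
# Base-rate neglect behind the representativeness heuristic.
# All probabilities below are invented for illustration.

p_poet = 0.001                   # base rate of poets in the population
p_accountant = 0.04              # base rate of accountants
p_desc_given_poet = 0.50         # P(eccentric clothes, poetry book | poet)
p_desc_given_accountant = 0.02   # P(same description | accountant)

# Unnormalized posteriors: P(job | description) is proportional to
# P(description | job) * P(job)
poet_score = p_desc_given_poet * p_poet                    # 0.0005
accountant_score = p_desc_given_accountant * p_accountant  # 0.0008

print(poet_score < accountant_score)  # True: the base rate dominates
```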

Loss aversion is a psychological and economic concept which holds that, when outcomes are interpreted as gains and losses, people are more sensitive to losses than to equivalent gains. Tversky and Kahneman (1992) suggested that losses can be twice as powerful, psychologically, as gains. When defined in terms of the shape of the utility function, as in cumulative prospect theory (CPT), losses have a steeper utility than gains, so a loss is more "painful" than the satisfaction from a comparable gain. Loss aversion was first proposed by Amos Tversky and Daniel Kahneman as an important component of prospect theory, an analysis of decision under risk.
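
The asymmetry can be written down directly. Using the median parameter estimates reported by Tversky and Kahneman (1992) for the CPT value function (alpha = beta = 0.88, lambda = 2.25):

```python
# The cumulative prospect theory value function (Tversky & Kahneman, 1992),
# with their median parameter estimates: losses loom about twice as large as gains.

def cpt_value(x: float, alpha: float = 0.88, beta: float = 0.88,
              lam: float = 2.25) -> float:
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha            # concave over gains
    return -lam * ((-x) ** beta)     # steeper (loss-averse) over losses

print(cpt_value(100))   # ~57.5: the felt value of a 100-unit gain
print(cpt_value(-100))  # ~-129.5: the same-sized loss hurts far more
```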

The peak–end rule is a psychological heuristic in which people judge an experience largely based on how they felt at its peak and at its end, rather than based on the total sum or average of every moment of the experience. The effect occurs regardless of whether the experience is pleasant or unpleasant. Under the heuristic, information other than the peak and end of the experience is not lost, but it is not used; this includes the net pleasantness or unpleasantness of the experience and how long it lasted. The peak–end rule is thereby a specific form of the more general extension neglect and duration neglect.
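
As a computation the rule is simple. A minimal sketch, with a hypothetical sequence of moment-by-moment ratings:

```python
# Peak-end summary of an experience: average the most intense moment and the
# final moment, ignoring duration and all other moments. Data are hypothetical.

def peak_end_score(moment_ratings: list[float]) -> float:
    """Remembered quality under the peak-end rule."""
    return (max(moment_ratings) + moment_ratings[-1]) / 2

experience = [2, 3, 8, 4, 1]              # moment-by-moment pleasantness ratings
print(peak_end_score(experience))         # 4.5 = (peak 8 + end 1) / 2
print(sum(experience) / len(experience))  # 3.6: the overall average the rule ignores
```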

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.
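
The violated law is easy to verify numerically: because P(A and B) = P(A) · P(B | A) and P(B | A) ≤ 1, a conjunction can never be more probable than either conjunct. The snippet below uses invented numbers in the spirit of Tversky and Kahneman's well-known "Linda" problem:

```python
# The conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A).
# Probabilities are invented for illustration.

p_bank_teller = 0.05             # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(feminist | bank teller)

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller
assert p_teller_and_feminist <= p_bank_teller  # holds for any valid probabilities
print(round(p_teller_and_feminist, 3))  # 0.015: the conjunction is less probable
```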

The anchoring effect is a psychological phenomenon in which an individual's judgments or decisions are influenced by a reference point or "anchor", which can be completely irrelevant. Both numeric and non-numeric anchoring have been reported in research. In numeric anchoring, once the value of the anchor is set, subsequent arguments, estimates, and so on made by an individual may change from what they would have otherwise been without the anchor. For example, an individual may be more likely to purchase a car if it is placed alongside a more expensive model. Prices discussed in negotiations that are lower than the anchor may seem reasonable, perhaps even cheap, to the buyer, even if they are still higher than the car's actual market value. In another example, when estimating the orbit of Mars, one might start with the Earth's orbit and adjust upward until reaching a value that seems reasonable.
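
Numeric anchoring is often described as anchoring and insufficient adjustment. The sketch below is a stylized illustration of that account, not an empirically fitted model: estimates move from the anchor toward the true value but stop short, leaving them biased toward the anchor.

```python
# Stylized anchor-and-adjust model (illustrative only): the estimate covers
# only part of the gap between the anchor and the true value.

def anchored_estimate(anchor: float, true_value: float,
                      adjustment: float = 0.6) -> float:
    """Adjust from the anchor toward the truth, stopping short of it."""
    return anchor + adjustment * (true_value - anchor)

true_price = 20_000
print(anchored_estimate(anchor=30_000, true_value=true_price))  # 24,000: biased high
print(anchored_estimate(anchor=10_000, true_value=true_price))  # 16,000: biased low
```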

In psychology and behavioral economics, the endowment effect is the finding that people are more likely to retain an object they own than acquire that same object when they do not own it. The endowment theory can be defined as "an application of prospect theory positing that loss aversion associated with ownership explains observed exchange asymmetries."

Affective forecasting is the prediction of one's affect in the future. As a process that influences preferences, decisions, and behavior, affective forecasting is studied by both psychologists and economists, with broad applications.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
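
Overprecision, the third sense, has a direct operational test: elicit 90% confidence intervals and count how often they actually contain the true value. A minimal sketch with hypothetical data; a well-calibrated judge would be right about 90% of the time:

```python
# Calibration check for overprecision. Each tuple is a stated 90% confidence
# interval and the true value; the data are hypothetical.

intervals_and_truths = [
    (10, 20, 25),
    (100, 150, 170),
    (5, 8, 6),
    (40, 60, 75),
    (1, 3, 2),
]

hits = sum(low <= truth <= high for low, high, truth in intervals_and_truths)
hit_rate = hits / len(intervals_and_truths)
print(hit_rate)  # 0.4: far below the stated 90% -- the signature of overprecision
```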

Salience is that property by which some thing stands out. Salient events are an attentional mechanism by which organisms learn and survive; those organisms can focus their limited perceptual and cognitive resources on the pertinent subset of the sensory data available to them.

Optimism bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism.

Choice architecture is the design of different ways in which choices can be presented to decision makers, and the impact of that presentation on decision making. For example, the number of choices presented, the manner in which attributes are described, and the presence of a default can each influence consumer choice.

Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Duration neglect is the psychological observation that people's judgments of the unpleasantness of painful experiences depend very little on the duration of those experiences. Multiple experiments have found that these judgments tend to be affected by two factors: the peak and how quickly the pain diminishes. If it diminishes more slowly, the experience is judged to be less painful. Hence, the term "peak–end rule" describes this process of evaluation.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

References

1. Mullainathan, Sendhil (January 3, 2015). "Racial Bias, Even When We Have Good Intentions". The New York Times. Retrieved July 25, 2016.
2. Morewedge, C. K.; Yoon, H.; Scopelliti, I.; Symborski, C. W.; Korris, J. H.; Kassam, K. S. (13 August 2015). "Debiasing Decisions: Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2 (1): 129–140. doi:10.1177/2372732215600886. S2CID 4848978.
3. Parker, Andrew M.; Fischhoff, Baruch (January 2005). "Decision-making competence: External validation through an individual-differences approach". Journal of Behavioral Decision Making. 18 (1): 1–27. doi:10.1002/bdm.481.
4. Larrick, Richard (2004). "Debiasing". Blackwell Handbook of Judgment and Decision Making (1st ed.). Malden, MA: Blackwell. p. 316. ISBN 978-1-4051-0746-4.
5. Thaler, Richard H.; Sunstein, Cass R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness (Revised and expanded ed.). New Haven, CT: Yale University Press. ISBN 9780300122237.
6. Arkes, Hal R. (1991). "Costs and benefits of judgment errors: Implications for debiasing". Psychological Bulletin. 110 (3): 486–498. doi:10.1037/0033-2909.110.3.486.
7. Lerner, Jennifer S.; Tetlock, Philip E. (1999). "Accounting for the effects of accountability". Psychological Bulletin. 125 (2): 255–275. doi:10.1037/0033-2909.125.2.255. PMID 10087938.
8. French, SA (2003). "Pricing effects on food choices". The Journal of Nutrition. 133 (3): 841S–843S. doi:10.1093/jn/133.3.841S. PMID 12612165.
9. Schwartz, J.; Mochon, D.; Wyper, L.; Maroba, J.; Patel, D.; Ariely, D. (3 January 2014). "Healthier by Precommitment". Psychological Science. 25 (2): 538–546. doi:10.1177/0956797613510950. PMID 24390824. S2CID 5113311.
10. Ariely, Dan; Gneezy, Uri; Loewenstein, George; Mazar, Nina (April 2009). "Large Stakes and Big Mistakes" (PDF). Review of Economic Studies. 76 (2): 451–469. doi:10.1111/j.1467-937X.2009.00534.x. Archived from the original on 2016-03-13.
11. Trudel, Remi; Murray, Kyle B.; Kim, Soyoung; Chen, Shuo (2015). "The impact of traffic light color-coding on food health perceptions and choice". Journal of Experimental Psychology: Applied. 21 (3): 255–275. doi:10.1037/xap0000049. PMID 26121372.
12. Schwartz, J.; Riis, J.; Elbel, B.; Ariely, D. (8 February 2012). "Inviting Consumers To Downsize Fast-Food Portions Significantly Reduces Calorie Consumption". Health Affairs. 31 (2): 399–407. doi:10.1377/hlthaff.2011.0224. PMID 22323171.
13. Morewedge, Carey K. (2015-10-13). "How a Video Game Helped People Make Better Decisions". Harvard Business Review. Retrieved 2015-10-17.
14. Dhami, Mandeep (2013). Judgment and Decision Making as a Skill: Learning, Development and Evolution. Cambridge University Press. ISBN 9781107676527.
15. Fischhoff, Baruch (1982). "Debiasing". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. ISBN 9780521284141.
16. Nisbett, R. E.; Fong, G. T.; Lehman, D. R.; Cheng, P. W. (1987-10-30). "Teaching reasoning". Science. 238 (4827): 625–631. Bibcode:1987Sci...238..625N. doi:10.1126/science.3672116. ISSN 0036-8075. PMID 3672116.
17. Simmons, Joseph P.; LeBoeuf, Robyn A.; Nelson, Leif D. (2010). "The effect of accuracy motivation on anchoring and adjustment: Do people adjust from provided anchors?". Journal of Personality and Social Psychology. 99 (6): 917–932. doi:10.1037/a0021540. PMID 21114351.
18. Hirt, Edward R.; Markman, Keith D. (1995). "Multiple explanation: A consider-an-alternative strategy for debiasing judgments". Journal of Personality and Social Psychology. 69 (6): 1069–1086. doi:10.1037/0022-3514.69.6.1069.
19. Hershfield, Hal E.; Goldstein, Daniel G.; Sharpe, William F.; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L.; Bailenson, Jeremy N. (2011-11-01). "Increasing Saving Behavior Through Age-Progressed Renderings of the Future Self". Journal of Marketing Research. 48 (SPL): S23–S37. doi:10.1509/jmkr.48.SPL.S23. ISSN 0022-2437. PMC 3949005. PMID 24634544.