Outcome bias

The outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. Specifically, the outcome effect occurs when the same "behavior produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance." [1]

While similar to hindsight bias, the two phenomena are markedly different: hindsight bias involves a distortion of memory that favors the actor, whereas outcome bias focuses exclusively on weighting the outcome more heavily than other pieces of information when deciding whether a past decision was correct.

Overview

People will often judge a past decision by its ultimate outcome rather than by the quality of the decision at the time it was made, given what was known at that time. This is an error because no decision-maker ever knows whether a calculated risk will turn out for the best. The actual outcome of a decision is often determined by chance, with some risks working out and others not. Individuals whose judgments are influenced by outcome bias are, in effect, holding decision-makers responsible for events beyond their control.

Baron and Hershey (1988) tested this by presenting subjects with hypothetical situations. [2] In one example, a surgeon had to decide whether or not to perform a risky operation on a patient, where the surgery had a known probability of success. Subjects were told either a good or a bad outcome (the patient living or dying) and asked to rate the quality of the surgeon's pre-operation decision. Those given bad outcomes rated the decision as worse than those given good outcomes. "The ends justify the means" is an often-used aphorism expressing the outcome effect when the outcome is desirable.
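
To make the structure of such a scenario concrete, the following sketch simulates a fixed decision whose realized outcome is determined purely by chance. It is a minimal illustration, not code from the study: the 92% success probability and the outcome labels are hypothetical.

    import random

    # Hypothetical numbers for illustration only; the vignettes in Baron
    # and Hershey (1988) used their own framings and probabilities.
    P_SUCCESS = 0.92  # ex-ante probability that the surgery succeeds

    def run_surgery(rng: random.Random) -> str:
        """The surgeon's decision is identical on every run; only chance
        determines which outcome the subjects are shown."""
        return "patient lives" if rng.random() < P_SUCCESS else "patient dies"

    rng = random.Random(0)
    print([run_surgery(rng) for _ in range(10)])
    # Every run reflects the same ex-ante decision quality, yet an
    # outcome-biased rater would score the unlucky runs as worse decisions.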

This mistake occurs when information that became available only later is incorporated into the evaluation of a past decision. To avoid the influence of outcome bias, one should evaluate a decision while ignoring information collected after the fact, focusing instead on what the right answer was given what was known at the time the decision was made.
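
A minimal sketch of such an outcome-blind evaluation, under the assumption that the decision can be scored by its expected utility given only the probabilities and payoffs known at decision time (the function name and the numeric values are illustrative, not drawn from the literature):

    def evaluate_decision(p_success: float,
                          utility_success: float,
                          utility_failure: float) -> float:
        """Score a decision by its expected utility at decision time.

        The function deliberately takes no 'realized outcome' argument,
        so the evaluation cannot be contaminated by information that
        only became available after the fact.
        """
        return p_success * utility_success + (1 - p_success) * utility_failure

    # Illustrative values only: the score is identical whether the patient
    # later lived or died, because the outcome never enters the computation.
    print(evaluate_decision(p_success=0.92, utility_success=1.0, utility_failure=-10.0))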

Outside of psychological experiments, outcome bias has been found to be substantially present in real-world situations. A study of how coaches and journalists evaluate football players found that a player's performance over a whole match is judged to be substantially better if a shot that hit a goal post happened to bounce in (a lucky goal) rather than out (an unlucky miss). [3] Another study found that professional basketball coaches are "more likely to revise their strategy after a loss than a win... even when a loss was expected and even when failure is due to factors beyond the team's control." [4]

The outcome bias is closely related to the philosophical concept of moral luck, since in both cases the evaluation of actions is influenced by factors that are not logically justifiable. [5]

Related Research Articles

Cognitive bias

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

In common usage, evaluation is a systematic determination and assessment of a subject's merit, worth, and significance, using criteria governed by a set of standards. It can assist an organization, program, design, project, or other intervention or initiative in assessing an aim, a realisable concept or proposal, or any alternative, in order to aid decision-making, or to establish the degree of achievement or value with regard to the aim, objectives, and results of any action that has been completed.

Decision-making

In psychology, decision-making is regarded as the cognitive process resulting in the selection of a belief or a course of action among several possible alternative options. It could be either rational or irrational. The decision-making process is a reasoning process based on assumptions of values, preferences and beliefs of the decision-maker. Every decision-making process produces a final choice, which may or may not prompt action.

Moral reasoning is the study of how people think about right and wrong and how they acquire and apply moral rules. It is a subdiscipline of moral psychology that overlaps with moral philosophy, and is the foundation of descriptive ethics.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, is the common tendency for people to perceive past events as having been more predictable than they were.

In psychology, an attribution bias or attributional error is a cognitive bias referring to the systematic errors people make when they evaluate, or try to find reasons for, their own and others' behaviors. Such biases produce systematic deviations from norm or rationality in judgment, often leading to perceptual distortions, inaccurate assessments, or illogical interpretations of events and behaviors.

The group attribution error refers to people's tendency to believe either

  1. the characteristics of an individual group member are reflective of the group as a whole, or
  2. a group's decision outcome must reflect the preferences of individual group members, even when external information is available suggesting otherwise.

The illusion of control is the tendency for people to overestimate their ability to control events. It was named by U.S. psychologist Ellen Langer and is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority and optimism bias, the illusion of control is one of the positive illusions.

The halo effect is the tendency for positive or negative impressions of a person, company, country, brand, or product in one area to positively or negatively influence one's opinion or feelings in other areas. It is "the name given to the phenomenon whereby evaluators tend to be influenced by their previous judgments of performance or personality." The halo effect is a cognitive bias that can prevent someone from fairly assessing a person, product, or brand because of an unfounded belief about what is good or bad. A halo effect may not be a bias if there is good reason to think the attributes are correlated: for instance, if a student has a good grade in one subject, it would not be an error to think they are likely to have good grades in other subjects.

The historian's fallacy is an informal fallacy that occurs when one assumes that decision makers of the past viewed events from the same perspective, and had the same information, as those subsequently analyzing the decision. It is not to be confused with presentism, a similar but distinct mode of historical analysis in which present-day ideas are projected into the past. The idea was first articulated by British literary critic Matthew Arnold in 1880 and later named and defined by American historian David Hackett Fischer in 1970.

In the psychology of affective forecasting, the impact bias, a form of which is the durability bias, is the tendency for people to overestimate the length or the intensity of future emotional states.

Affective forecasting is the prediction of one's affect in the future. As a process that influences preferences, decisions, and behavior, affective forecasting is studied by both psychologists and economists, with broad applications.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Omission bias is the tendency to prefer omission (inaction) over commission (action) and to judge harm resulting from an action more negatively than harm resulting from inaction. It can occur due to a number of processes, including psychological inertia, the perception of transaction costs, and the perception that commissions are more causal than omissions. In socio-political terms, Article 2 of the Universal Declaration of Human Rights establishes that basic human rights are to be assessed "without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status", criteria that are often subject to one or another form of omission bias. It is controversial whether omission bias is a cognitive bias or is often rational. The bias is often showcased through the trolley problem and has also been described as an explanation for the endowment effect and status quo bias.

In psychology, a heuristic is an easy-to-compute procedure or rule of thumb that people use when forming beliefs, judgments, or decisions. The familiarity heuristic was developed following the discovery of the availability heuristic by psychologists Amos Tversky and Daniel Kahneman; it occurs when the familiar is favored over novel places, people, or things. The familiarity heuristic can be applied to various situations that individuals experience in day-to-day life. When these situations appear similar to previous ones, especially if the individuals are experiencing a high cognitive load, they may regress to the state of mind in which they have felt or behaved before. The heuristic is useful in most situations and applies to many fields of knowledge, though it has both benefits and drawbacks.

One way of thinking holds that the mental process of decision-making is rational: a formal process based on optimizing utility. Rational thinking and decision-making do not leave much room for emotions; in fact, emotions are often considered irrational occurrences that may distort reasoning.

Choice architecture is the design of different ways in which choices can be presented to decision makers, and the impact of that presentation on decision-making.

Naïve cynicism

Naïve cynicism is a philosophy of mind, cognitive bias and form of psychological egoism that occurs when people naïvely expect more egocentric bias in others than actually is the case.

The curse of knowledge is a cognitive bias that occurs when an individual who is communicating with others assumes that those others share the same background and understanding, even when the relevant information is available only to the individual. Some authors also call this bias the curse of expertise.

References

  1. Gino, Francesca; Moore, Don A.; Bazerman, Max H. (2009). "No Harm, No Foul: The Outcome Bias in Ethical Judgments" (PDF). Harvard Business School Working Paper, No. 08-080. SSRN 1099464. Archived from the original (PDF) on 2014-06-30. Retrieved 2013-04-04.
  2. Baron, Jonathan; Hershey, John C. (1988). "Outcome bias in decision evaluation" (PDF). Journal of Personality and Social Psychology. 54 (4): 569–579. doi:10.1037/0022-3514.54.4.569. PMID 3367280. S2CID 26091312.
  3. Gauriot, Romain; Page, Lionel (2019). "Fooled by Performance Randomness: Overrewarding Luck" (PDF). The Review of Economics and Statistics. 101 (4): 658–666. doi:10.1162/rest_a_00783. S2CID 151803360.
  4. Lefgren, Lars; Platt, Brennan; Price, Joseph (2015). "Sticking with What (Barely) Worked: A Test of Outcome Bias". Management Science. 61 (5): 1121–1136. doi:10.1287/mnsc.2014.1966. JSTOR 24550666.
  5. Max, Raphael; Uhl, Matthias (2023). "Moral luck in investment contexts: We consciously find unprofitable investments less moral". PLOS ONE. 18 (1): e0278677. doi:10.1371/journal.pone.0278677. PMC 9844880. PMID 36649364.