Omission bias

Omission bias is the tendency to prefer omission (inaction) over commission (action), and to judge harm caused by an action more negatively than equivalent harm caused by a failure to act. [1] [2] [3] It can arise from several processes, including psychological inertia, [4] the perception of transaction costs, and the perception that commissions are more causal than omissions. [5] In socio-political terms, Article 2 of the Universal Declaration of Human Rights states that basic human rights apply "without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status"; distinctions of these kinds are often subject to one or another form of omission bias. It is controversial whether omission bias is a cognitive bias or an often rational preference. [4] [6] The bias is often illustrated through the trolley problem and has also been offered as an explanation for the endowment effect and status quo bias. [2] [7]
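One way to make the judgment asymmetry concrete is a toy model in which the same harm receives a harsher judgment when it results from an action. The sketch below is purely illustrative; the commission weight is an assumed parameter, not an empirical estimate.

```python
# A toy formalization of the omission-commission asymmetry: identical harm is
# judged more severely when it results from an action. The commission weight
# is a hypothetical parameter chosen for illustration.

def judged_severity(harm: float, is_commission: bool,
                    commission_weight: float = 1.5) -> float:
    """Judged moral severity of an outcome; commissions are weighted more heavily."""
    return harm * (commission_weight if is_commission else 1.0)

harm = 10.0
print(judged_severity(harm, is_commission=True))   # 15.0: harm caused by acting
print(judged_severity(harm, is_commission=False))  # 10.0: identical harm from inaction
```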

Examples and applications

Spranca, Minsk, and Baron extended the omission bias to judgments of the morality of choices. In one scenario, John, a tennis player, faces a tough opponent the next day in a decisive match, and he knows his opponent is allergic to a particular food. Subjects were presented with two conditions: John recommends the allergen-containing food to hurt his opponent's performance, or the opponent orders the allergenic food himself and John says nothing. A majority of subjects judged John's action of recommending the allergenic food to be more immoral than his inaction of not informing the opponent of the allergen. [8]

The effect has also been observed in real-world athletics: an analysis of NBA statistics found that referees called 50 percent fewer fouls in the final moments of close games, consistent with a reluctance to make a game-deciding call (a commission) rather than allow play to continue (an omission). [9]

Another real-world example is parents who decide not to vaccinate their children because of the small chance the vaccine itself could cause death, even when the probability of death from the vaccine is far lower than the probability of death from the disease it prevents. [10]
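The reasoning can be made concrete with a short expected-value comparison. The sketch below uses illustrative risk figures in the spirit of Ritov and Baron's hypothetical vaccination scenario; the specific numbers are assumptions, not data from the cited study.

```python
# A minimal sketch of the expected-value comparison behind the vaccination
# example. The risk figures are illustrative assumptions, not data from the
# cited study.

P_DEATH_DISEASE = 10 / 10_000  # assumed risk of dying from the disease if unvaccinated
P_DEATH_VACCINE = 5 / 10_000   # assumed risk of dying from the vaccine itself

def expected_deaths(vaccinate: bool, population: int = 10_000) -> float:
    """Expected number of deaths in the population under each choice."""
    risk = P_DEATH_VACCINE if vaccinate else P_DEATH_DISEASE
    return risk * population

print(f"Vaccinate (commission):      {expected_deaths(True):.0f} expected deaths per 10,000")
print(f"Do not vaccinate (omission): {expected_deaths(False):.0f} expected deaths per 10,000")

# Vaccinating halves the expected deaths here, yet omission bias predicts that
# some parents will still choose inaction, because a death caused by the
# vaccine (commission) is judged worse than a death from the disease (omission).
```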

Related Research Articles

Bias is a disproportionate weight in favor of or against an idea or thing, usually in a way that is inaccurate, closed-minded, prejudicial, or unfair. Biases can be innate or learned. People may develop biases for or against an individual, a group, or a belief. In science and engineering, a bias is a systematic error. Statistical bias results from an unfair sampling of a population, or from an estimation process that does not give accurate results on average.

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated entirely, but it can be managed, for example, through education and training in critical thinking skills.

In social psychology, fundamental attribution error, also known as correspondence bias or attribution effect, is a cognitive attribution bias where observers underemphasize situational and environmental factors for the behavior of an actor while overemphasizing dispositional or personality factors. In other words, observers tend to overattribute the behaviors of others to their personality and underattribute them to the situation or context. Although personality traits and predispositions are considered to be observable facts in psychology, the fundamental attribution error is an error because it misinterprets their effects.

In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

Status quo bias is an emotional bias; a preference for the maintenance of one's current or previous state of affairs, or a preference not to undertake any action to change this current or previous state. The current baseline is taken as a reference point, and any change from that baseline is perceived as a loss or gain. When compared with other alternatives, this current baseline or default option is perceived and evaluated favorably.

Moral psychology is a field of study in both philosophy and psychology. Historically, the term "moral psychology" was used relatively narrowly to refer to the study of moral development. Moral psychology eventually came to refer more broadly to various topics at the intersection of ethics, psychology, and philosophy of mind. Some of the main topics of the field are moral judgment, moral reasoning, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character, altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.

Integrative complexity is a research psychometric that refers to the degree to which thinking and reasoning involve the recognition and integration of multiple perspectives and possibilities and their interrelated contingencies.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Jonathan Baron is an American psychologist. He is a professor emeritus of psychology at the University of Pennsylvania, specializing in the science of decision-making.

Regret is the emotion of wishing one had made a different decision in the past, because the consequences of the decision one did make were unfavorable.

In ethics and social sciences, value denotes the degree of importance of some thing or action, with the aim of determining which actions are best to do or what way is best to live, or to describe the significance of different actions. Value systems are prospective and prescriptive beliefs; they affect the ethical behavior of a person or are the basis of their intentional activities. Primary values tend to be strongly held, while secondary values are more open to change. What makes an action valuable may in turn depend on the ethical values of the objects it increases, decreases, or alters. An object with "ethic value" may be termed an "ethic or philosophic good".

The outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. Specifically, the outcome effect occurs when the same "behavior produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance."

The ambiguity effect is a cognitive tendency where decision making is affected by a lack of information, or "ambiguity". The effect implies that people tend to select options for which the probability of a favorable outcome is known, over an option for which the probability of a favorable outcome is unknown. The effect was first described by Daniel Ellsberg in 1961.

Jonathan David Haidt (born 1963) is an American social psychologist and author. He is the Thomas Cooley Professor of Ethical Leadership at the New York University Stern School of Business. His main areas of study are the psychology of morality and moral emotions.

Shared information bias is the tendency for group members to spend more time and energy discussing information that all members are already familiar with, and less time and energy discussing information that only some members are aware of. Harmful consequences related to poor decision-making can arise when the group does not have access to unshared information in order to make a well-informed decision.

Neal Roese holds the SC Johnson Chair in Global Marketing at the Kellogg School of Management at Northwestern University. Trained as a social psychologist, he is best known for his work on judgment and decision making, counterfactual thinking, and regret.

Psychological inertia is the tendency to maintain the status quo unless compelled by a psychological motive to intervene or reject this.

Action bias is the psychological phenomenon in which people tend to favor action over inaction, even when there is no indication that acting will lead to a better result. It is an automatic response, similar to a reflex or an impulse, and is not based on rational thinking. One of the first appearances of the term "action bias" in scientific journals was in a 2000 paper by Patt and Zeckhauser titled "Action Bias and Environmental Decisions", which expounded its relevance in politics.

References

  1. Yeung, Siu Kit; Yay, Tijen; Feldman, Gilad (9 September 2021). "Action and Inaction in Moral Judgments and Decisions: Meta-Analysis of Omission Bias Omission–Commission Asymmetries". Personality and Social Psychology Bulletin. 48 (10): 1499–1515. doi:10.1177/01461672211042315. PMID 34496694.
  2. Ritov, Ilana; Baron, Jonathan (February 1992). "Status-quo and omission biases". Journal of Risk and Uncertainty. 5 (1). doi:10.1007/BF00208786.
  3. Baron, Jonathan; Ritov, Ilana (September 1994). "Reference Points and Omission Bias". Organizational Behavior and Human Decision Processes. 59 (3): 475–498. doi:10.1006/obhd.1994.1070.
  4. Gal, David (July 2006). "A Psychological Law of Inertia and the Illusion of Loss Aversion" (PDF). Judgment and Decision Making. 1: 23–32. doi:10.1017/S1930297500000322.
  5. Yeung, Siu Kit; Yay, Tijen; Feldman, Gilad (9 September 2021). "Action and Inaction in Moral Judgments and Decisions: Meta-Analysis of Omission Bias Omission–Commission Asymmetries". Personality and Social Psychology Bulletin. 48 (10): 1499–1515. doi:10.1177/01461672211042315. PMID 34496694.
  6. Howard-Snyder, Frances (2011). "Doing vs. Allowing Harm". The Stanford Encyclopedia of Philosophy.
  7. Gal, David; Rucker, Derek D.; Shavitt, Sharon (July 2018). "The Loss of Loss Aversion: Will It Loom Larger Than Its Gain?". Journal of Consumer Psychology. 28 (3): 497–516. doi:10.1002/jcpy.1047.
  8. Spranca, Mark; Minsk, Elisa; Baron, Jonathan (1991). "Omission and commission in judgment and choice". Journal of Experimental Social Psychology. 27 (1): 76–105. doi:10.1016/0022-1031(91)90011-T.
  9. Moskowitz, Tobias; Wertheim, L. Jon (2011). Scorecasting: The Hidden Influences Behind How Sports Are Played and Games Are Won. Crown Publishing Group. p. 24. ISBN 978-0-307-59181-4.
  10. Ritov, Ilana; Baron, Jonathan (October 1990). "Reluctance to vaccinate: Omission bias and ambiguity". Journal of Behavioral Decision Making. 3 (4): 263–277. doi:10.1002/bdm.3960030404.
