Omission bias


Omission bias is the phenomenon in which people prefer omission (inaction) over commission (action) and tend to judge harm resulting from commission more negatively than harm resulting from omission.[1][2][3] It can occur due to a number of processes, including psychological inertia,[4] the perception of transaction costs, and the perception that commissions are more causal than omissions.[5]


In sociopolitical terms, Article 2 of the Universal Declaration of Human Rights establishes that basic human rights apply "without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status", criteria that are often subject to one form of omission bias or another. It is controversial whether omission bias is a genuine cognitive bias or is often rational.[4][6] The bias is often illustrated through the trolley problem and has also been described as an explanation for the endowment effect and status quo bias.[2][7]


Examples and applications

Taoism may gnomically promote inaction: "If you follow the Way you shall do less each day. You shall do less and less until you do nothing at all. And if you do nothing at all, there is nothing that is left undone."[8]

Spranca, Minsk, and Baron extended the omission bias to judgments of the morality of choices. In one scenario, John, a tennis player, faces a tough opponent the next day in a decisive match. John knows his opponent is allergic to a particular food. Subjects were presented with two conditions: John recommends the allergen-containing food to hurt his opponent's performance, or the opponent orders the allergenic food himself and John says nothing. A majority of subjects judged John's action of recommending the allergenic food as more immoral than John's inaction of not informing the opponent about the allergen.[9]

The effect has also been observed in real-world athletic arenas: NBA statistics showed that referees called 50 percent fewer fouls in the final moments of close games.[10]

An additional real-world example is parents who decide not to vaccinate their children because of the potential for the vaccine to cause death, even when that risk is much smaller than the risk of death from the disease the vaccine prevents.[11]
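In expected-value terms, the asymmetry can be made explicit. As a minimal sketch (the symbols here are illustrative and not drawn from the cited study), let $p_v$ denote the probability that the vaccine itself causes death and $p_d$ the probability of death from the disease if the child is not vaccinated. A decision maker weighing outcomes alone should vaccinate whenever

$$p_v < p_d$$

yet omission bias predicts refusals even when $p_v \ll p_d$, because a death brought about by vaccinating (commission) is weighted more heavily than a death brought about by not vaccinating (omission).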


Related Research Articles

Bias is a disproportionate weight in favor of or against an idea or thing, usually in a way that is inaccurate, closed-minded, prejudicial, or unfair. Biases can be innate or learned. People may develop biases for or against an individual, a group, or a belief. In science and engineering, a bias is a systematic error. Statistical bias results from an unfair sampling of a population, or from an estimation process that does not give accurate results on average.


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.

In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

A status quo bias or default bias is a cognitive bias which results from a preference for the maintenance of one's existing state of affairs. The current baseline is taken as a reference point, and any change from that baseline is perceived as a loss or gain. Individuals tend to perceive and evaluate this current baseline or default option positively relative to the alternatives.

Moral psychology is the study of human thought and behavior in ethical contexts. Historically, the term "moral psychology" was used relatively narrowly to refer to the study of moral development. The field is interdisciplinary, drawing on both philosophy and psychology, and eventually came to refer more broadly to various topics at the intersection of ethics, psychology, and philosophy of mind. Some of the main topics of the field are moral judgment, moral reasoning, moral satisficing, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character, altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.

Integrative complexity is a research psychometric that refers to the degree to which thinking and reasoning involve the recognition and integration of multiple perspectives and possibilities and their interrelated contingencies.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Jonathan Baron is an American psychologist. He is a professor emeritus of psychology at the University of Pennsylvania, known for his work in the science of decision-making.


Regret is the emotion of wishing one had made a different decision in the past, because the consequences of the decision one did make were unfavorable.

In ethics and social sciences, value denotes the degree of importance of some thing or action, with the aim of determining which actions are best to do or what way is best to live, or of describing the significance of different actions. Value systems are proscriptive and prescriptive beliefs; they affect the ethical behavior of a person or are the basis of their intentional activities. Primary values tend to be strongly held, while secondary values are more open to change. What makes an action valuable may in turn depend on the ethical values of the objects it increases, decreases, or alters. An object with "ethic value" may be termed an "ethic or philosophic good".

The ambiguity effect is a cognitive tendency where decision making is affected by a lack of information, or "ambiguity". The effect implies that people tend to select options for which the probability of a favorable outcome is known, over an option for which the probability of a favorable outcome is unknown. The effect was first described by Daniel Ellsberg in 1961.

The negativity bias, also known as the negativity effect, is a cognitive bias whereby things of a more negative nature have a greater effect on one's psychological state and processes than neutral or positive things, even when they are of equal intensity. In other words, something very positive will generally have less of an impact on a person's behavior and cognition than something equally emotional but negative. The negativity bias has been investigated within many different domains, including the formation of impressions and general evaluations; attention, learning, and memory; and decision-making and risk considerations.

Motivated reasoning is a cognitive and social response in which individuals, consciously or subconsciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and to reject new information that contradicts them.


In decision making and psychology, decision fatigue refers to the deteriorating quality of decisions made by an individual after a long session of decision making. It is now understood as one of the causes of irrational trade-offs in decision making. Decision fatigue may also lead to consumers making poor choices with their purchases.

In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.

Neal Roese holds the SC Johnson Chair in Global Marketing at the Kellogg School of Management at Northwestern University. Trained as a social psychologist, he is best known for his work on judgment and decision making, counterfactual thinking, and regret.

Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information obtained without automation, even if it is correct. It stems from the social psychology literature, which found a bias in human-human interaction: people assign more positive evaluations to decisions made by humans than to a neutral object. The same type of positivity bias has been found for human-automation interaction, where automated decisions are rated more positively than neutral ones. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids designed to reduce possible human error. Errors of automation bias tend to occur when decision-making depends on computers or other automated aids and the human is in a monitoring role but able to make decisions. Examples of automation bias range from urgent matters like flying a plane on automatic pilot to mundane matters such as the use of spell-checking programs.

Action bias is the psychological phenomenon in which people tend to favor action over inaction, even when there is no indication that acting will lead to a better result. It is an automatic response, similar to a reflex or an impulse, and is not based on rational thinking. One of the first appearances of the term "action bias" in scientific journals was in a 2000 paper by Patt and Zeckhauser titled "Action Bias and Environmental Decisions", which expounded its relevance in politics.

References

  1. Yeung, Siu Kit; Yay, Tijen; Feldman, Gilad (9 September 2021). "Action and Inaction in Moral Judgments and Decisions: Meta-Analysis of Omission Bias Omission-Commission Asymmetries". Personality and Social Psychology Bulletin. 48 (10): 1499–1515. doi:10.1177/01461672211042315. PMID 34496694. S2CID 237453626.
  2. Ritov, Ilana; Baron, Jonathan (February 1992). "Status-quo and omission biases". Journal of Risk and Uncertainty. 5 (1). doi:10.1007/BF00208786. S2CID 143857417.
  3. Baron, Jonathan; Ritov, Ilana (September 1994). "Reference Points and Omission Bias". Organizational Behavior and Human Decision Processes. 59 (3): 475–498. doi:10.1006/obhd.1994.1070.
  4. Gal, David (July 2006). "A Psychological Law of Inertia and the Illusion of Loss Aversion" (PDF). Judgment and Decision Making. 1: 23–32. doi:10.1017/S1930297500000322.
  5. Yeung, Siu Kit; Yay, Tijen; Feldman, Gilad (9 September 2021). "Action and Inaction in Moral Judgments and Decisions: Meta-Analysis of Omission Bias Omission-Commission Asymmetries". Personality and Social Psychology Bulletin. 48 (10): 1499–1515. doi:10.1177/01461672211042315. PMID 34496694. S2CID 237453626.
  6. Howard-Snyder, Frances (2011). "Doing vs. Allowing Harm". The Stanford Encyclopedia of Philosophy.
  7. Gal, David; Rucker, Derek D.; Shavitt, Sharon (July 2018). "The Loss of Loss Aversion: Will It Loom Larger Than Its Gain?". Journal of Consumer Psychology. 28 (3): 497–516. doi:10.1002/jcpy.1047. S2CID 148956334.
  8. Tao Te Ching, chapter 48.
  9. Spranca, Mark; Minsk, Elisa; Baron, Jonathan (1991). "Omission and commission in judgment and choice". Journal of Experimental Social Psychology. 27 (1): 76–105. CiteSeerX 10.1.1.137.9435. doi:10.1016/0022-1031(91)90011-T.
  10. Moskowitz, Tobias; Wertheim, L. Jon (2011). Scorecasting: The Hidden Influences Behind How Sports Are Played and Games Are Won. Crown Publishing Group. p. 24. ISBN 978-0-307-59181-4.
  11. Ritov, Ilana; Baron, Jonathan (October 1990). "Reluctance to vaccinate: Omission bias and ambiguity". Journal of Behavioral Decision Making. 3 (4): 263–277. doi:10.1002/bdm.3960030404.
