The neglect of probability, a type of cognitive bias, is the tendency to disregard probability when making a decision under uncertainty and is one simple way in which people regularly violate the normative rules for decision making. Small risks are typically either neglected entirely or hugely overrated. The continuum between the extremes is ignored. The term probability neglect was coined by Cass Sunstein. [1]
There are many related ways in which people violate the normative rules of decision making with regard to probability, including the hindsight bias, the neglect of prior base rates, and the gambler's fallacy. This bias is different, however: rather than using probability incorrectly, the actor disregards it altogether.
"We have no intuitive grasp of risk and thus distinguish poorly among different threats," Dobelli has written. "The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us." [2]
In a 1972 experiment, participants were divided into two groups, with the former being told they would receive a mild electric shock and the latter told that there was a 50 percent chance they would receive such a shock. When the subjects' physical anxiety was measured, there was no difference between the two groups. This lack of difference remained even when the second group's chance of being shocked was lowered to 20 percent, then 10 percent, then 5 percent. The conclusion: "we respond to the expected magnitude of an event...but not to its likelihood. In other words: We lack an intuitive grasp of probability." [2]
Baron (2000) suggests that the bias manifests itself among adults especially when it comes to difficult choices, such as medical decisions. This bias could make actors drastically violate expected-utility theory in their decision making, especially when a decision must be made in which one possible outcome has a much lower or higher utility but a small probability of occurring (e.g. in medical or gambling situations). In this aspect, the neglect of probability bias is similar to the neglect of prior base rates effect.
Cass Sunstein has cited the history of Love Canal in upstate New York, which became world-famous in the late 1970s owing to widely publicized public concerns about abandoned waste that was supposedly causing medical problems in the area. In response to these concerns, the U.S. federal government set in motion "an aggressive program for cleaning up abandoned hazardous waste sites, without examining the probability that illness would actually occur," and legislation was passed that did not reflect serious study of the actual degree of danger. Furthermore, when controlled studies were publicized showing little evidence that the waste represented a menace to public health, the anxiety of local residents did not diminish. [3]
One University of Chicago study showed that people are as afraid of a 1% chance as of a 99% chance of contamination by poisonous chemicals. [2] In another example of near-total neglect of probability, Rottenstreich and Hsee (2001) found that the typical subject was willing to pay $10 to avoid a 99% chance of a painful electric shock, and $7 to avoid a 1% chance of the same shock. They suggest that probability is more likely to be neglected when the outcomes are emotion-arousing.
In 2013, Tom Cagley noted that neglect of probability is "common in IT organizations that are planning and estimating projects or in risk management." He pointed out that there are available techniques, such as the Monte Carlo analysis, to study probability, but too often "the continuum of probability is ignored." [4]
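The Monte Carlo approach Cagley refers to can be sketched in a few lines. The task names and triangular duration ranges below are hypothetical, chosen only to show how simulation yields a continuum of probability rather than a single-point estimate:

```python
import random

# Hypothetical three-task project; durations are (min, most likely, max) in days.
tasks = {
    "design": (5, 10, 20),
    "build": (10, 15, 30),
    "test": (3, 5, 12),
}

def simulate_totals(trials=100_000, seed=42):
    """Sample each task from a triangular distribution and sum the draws."""
    rng = random.Random(seed)
    totals = [
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
        for _ in range(trials)
    ]
    totals.sort()
    return totals

totals = simulate_totals()
# Instead of one point estimate, report points along the continuum of probability.
p50 = totals[len(totals) // 2]
p90 = totals[int(len(totals) * 0.9)]
print(f"50% confidence: {p50:.1f} days, 90% confidence: {p90:.1f} days")
```

The spread between the 50th and 90th percentiles is exactly the information that is lost when the continuum of probability is ignored.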
In 2016, Rolf Dobelli presented a choice between two games of chance. In one, you have a one in 100 million chance of winning $10 million; in the other, a one in 10,000 chance of winning $10,000. Although it is more reasonable to choose the second game, most people choose the first, which, Dobelli suggests, is why lottery jackpots keep growing. [2]
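The arithmetic behind Dobelli's example is straightforward: the second game's expected value is ten times the first's, yet the larger jackpot dominates intuition.

```python
# Expected value of each gamble in Dobelli's example.
ev_first = (1 / 100_000_000) * 10_000_000   # one in 100 million chance of $10 million
ev_second = (1 / 10_000) * 10_000           # one in 10,000 chance of $10,000
print(ev_first, ev_second)   # roughly $0.10 versus $1.00 per play
```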
Dobelli has described the United States Food Additives Amendment of 1958 as a "classic example" of neglect of probability. The law – which prohibited carcinogenic substances in food, no matter how low the probability that they would in fact result in cancer – led to the substitution of those substances by ingredients that, while not causing cancer, stood a far greater chance of causing some sort of medical harm. [2]
In 2001, there was widespread panic around the U.S. over shark attacks, even though there was no evidence to show any increase in their occurrence. Legislation was actually enacted to address the issue. [3] Neglect of probability also figures in the purchase of lottery tickets. [4] Cass Sunstein has pointed out that terrorism is effective partly because of probability neglect. [3] "Terrorists show a working knowledge of probability neglect," he wrote in 2003, "producing public fear that might greatly exceed the discounted harm." [5]
The neglect of probability bias is especially pronounced among children. In a 1993 study, Baron, Granato, Spranca, and Teubal presented children with the following situation:
Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn't... Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?
Jonathan Baron (2000) notes that subject X responded in the following manner:
It is clear that subject X completely disregards the probability of an accident happening versus the probability of getting hurt by the seat belt in making the decision. A normative model for this decision would advise the use of expected-utility theory to decide which option would likely maximize utility. This would involve weighing the changes in utility in each option by the probability that each option will occur, something that subject X ignores.
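The weighing step that subject X skips can be made explicit. All probabilities and utilities below are made-up illustrative values, not figures from the study; they only demonstrate how the normative calculation multiplies each outcome's utility by its probability:

```python
# Hypothetical numbers (not from Baron et al.) illustrating expected-utility weighing.
p_crash = 1e-4                 # assumed probability of an accident on a given trip
p_saved_given_crash = 0.5      # assumed chance the belt prevents serious injury, given a crash
p_trapped_given_crash = 1e-3   # assumed chance the belt traps the occupant, given a crash

u_saved = 100                  # utility gained if the belt prevents serious injury
u_trapped = -150               # utility lost in the rare fire/lake scenario

# Expected utility of wearing the belt, relative to a baseline of zero for not wearing it.
eu_wear = p_crash * (p_saved_given_crash * u_saved
                     + p_trapped_given_crash * u_trapped)
eu_not_wear = 0.0

print(eu_wear > eu_not_wear)   # wearing the belt maximizes expected utility
```

Because the trapped outcome is vastly less probable than the saved outcome, its large negative utility barely moves the total, which is precisely the weighing that subject X fails to perform.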
Another subject responded to the same question:
Again, the subject disregards the probability in making the decision by treating each possible outcome as equal in his reasoning.
Cass Sunstein has noted that for a long time after 9/11, many people refused to fly because they felt a heightened sense of fear or peril, even though, statistically, most of them "were not at significantly more risk after the attacks than they were before." Indeed, those who chose to drive long distances instead of flying thereby put themselves at an increased risk, given that driving is the less safe form of transportation. [3]
In a 2001 paper, Sunstein addressed the question of how the law should respond to the neglect of probability. He emphasized that it is important for government to "create institutions designed to ensure that genuine risks, rather than tiny ones, receive the most concern". While government policies in regard to potential dangers should focus on statistics and probabilities, government efforts to raise public awareness of these dangers should emphasize worst-case scenarios in order to be maximally effective. Moreover, while it seems advisable for government to "attempt to educate and inform people, rather than capitulating to unwarranted public fear", that fear will remain a real phenomenon and may thus cause serious problems, for example leading citizens to undertake "wasteful and excessive private precautions". In such cases, certain kinds of government regulation may be justified not because they address serious dangers but because they reduce fear. At the same time, government should "treat its citizens with respect" and "not treat them as objects to be channeled in government's preferred directions", so focusing on worst-case scenarios that feed on irrational fears would amount to "unacceptable manipulation". [3] In a 2003 article, however, Sunstein concluded that "As a normative matter, government should reduce even unjustified fear, if the benefits of the response can be shown to outweigh the costs." [5]
In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to future rational decision-making, people in everyday life often take previous expenditures in situations, such as repairing a car or house, into their future decisions regarding those properties.
Behavioral economics is the study of the psychological, cognitive, emotional, cultural and social factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by classical economic theory.
Prospect theory is a theory of behavioral economics, judgment and decision making that was developed by Daniel Kahneman and Amos Tversky in 1979. The theory was cited in the decision to award Kahneman the 2002 Nobel Memorial Prize in Economics.
Decision theory is a branch of applied probability theory and analytic philosophy concerned with the theory of making decisions based on assigning probabilities to various factors and assigning numerical consequences to the outcome.
The St. Petersburg paradox or St. Petersburg lottery is a paradox involving the game of flipping a coin where the expected payoff of the lottery game is infinite but nevertheless seems to be worth only a very small amount to the participants. The St. Petersburg paradox is a situation where a naïve decision criterion that takes only the expected value into account predicts a course of action that presumably no actual person would be willing to take. Several resolutions to the paradox have been proposed, including the impossible amount of money a casino would need to continue the game indefinitely.
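The divergence, and the effect of a finite bankroll, can be checked directly. The sketch below assumes one common formulation of the game, in which the payoff is 2^k dollars when the first head lands on flip k:

```python
# St. Petersburg game: a fair coin is flipped until the first head appears;
# if the first head appears on flip k, the payoff is 2**k dollars.
# Each term of the expected-value sum is (1/2**k) * 2**k = 1, so the sum diverges.

def expected_value(bankroll):
    """Expected payoff when the casino can pay out at most `bankroll` dollars."""
    ev, k = 0.0, 1
    while 2 ** k <= bankroll:
        ev += (0.5 ** k) * (2 ** k)  # each uncapped round contributes exactly $1
        k += 1
    # Every remaining (longer) run pays out only the capped bankroll.
    ev += (0.5 ** (k - 1)) * bankroll
    return ev

# Even a billion-dollar bankroll yields a modest expected value (about $31),
# consistent with the intuition that the game is worth only a small amount.
print(expected_value(10 ** 9))
```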
The expected utility hypothesis is a foundational assumption in mathematical economics concerning decision making under uncertainty. It postulates that rational agents maximize utility, meaning the subjective desirability of their actions. Rational choice theory, a cornerstone of microeconomics, builds on this postulate to model aggregate social behavior.
A status quo bias is a cognitive bias which results from a preference for the maintenance of one's existing state of affairs. The current baseline is taken as a reference point, and any change from that baseline is perceived as a loss or gain. When weighed against alternatives, this current baseline or default option is evaluated by individuals as a positive.
Cass Robert Sunstein is an American legal scholar known for his work in constitutional law, administrative law, environmental law, and behavioral economics. He is also The New York Times best-selling author of The World According to Star Wars (2016) and Nudge (2008). He was the administrator of the White House Office of Information and Regulatory Affairs in the Obama administration from 2009 to 2012.
Libertarian paternalism is the idea that it is both possible and legitimate for private and public institutions to affect behavior while also respecting freedom of choice, as well as the implementation of that idea. The term was coined by behavioral economist Richard Thaler and legal scholar Cass Sunstein in a 2003 article in the American Economic Review. The authors further elaborated upon their ideas in a more in-depth article published in the University of Chicago Law Review that same year. They propose that libertarian paternalism is paternalism in the sense that "it tries to influence choices in a way that will make choosers better off, as judged by themselves"; note, however, that paternalism in the strict sense requires a restriction of choice. It is libertarian in the sense that it aims to ensure that "people should be free to opt out of specified arrangements if they choose to do so". The possibility to opt out is said to "preserve freedom of choice". Thaler and Sunstein published Nudge, a book-length defense of this political doctrine, in 2008.
The outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. Specifically, the outcome effect occurs when the same "behavior produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance."
Zero-risk bias is a tendency to prefer the complete elimination of risk in a sub-part over alternatives with greater overall risk reduction. It often manifests in cases where decision makers address problems concerning health, safety, and the environment. Its effect on decision making has been observed in surveys presenting hypothetical scenarios.
Choice architecture is the design of different ways in which choices can be presented to decision makers, and the impact of that presentation on decision-making.
Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.
In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. One international standard definition of risk is the "effect of uncertainty on objectives".
Nudge: Improving Decisions about Health, Wealth, and Happiness is a book written by University of Chicago economist and Nobel Laureate Richard H. Thaler and Harvard Law School Professor Cass R. Sunstein, first published in 2008. In 2021, a revised edition was released, subtitled The Final Edition.
Heuristics are the mental shortcuts humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.
Risk aversion is a preference for a sure outcome over a gamble with higher or equal expected value. Conversely, rejection of a sure thing in favor of a gamble of lower or equal expected value is known as risk-seeking behavior.
The discipline of forensic epidemiology (FE) is a hybrid of principles and practices common to both forensic medicine and epidemiology. FE is directed at filling the gap between clinical judgment and epidemiologic data for determinations of causality in civil lawsuits and criminal prosecution and defense.
Noise: A Flaw in Human Judgment is a nonfiction book by professors Daniel Kahneman, Olivier Sibony and Cass Sunstein. It was first published on May 18, 2021. The book concerns 'noise' in human judgment and decision-making. The authors define noise in human judgment as "undesirable variability in judgments of the same problem" and focus on the statistical properties and psychological perspectives of the issue.
Action bias is the psychological phenomenon whereby people tend to favor action over inaction, even when there is no indication that doing so would lead to a better result. It is an automatic response, similar to a reflex or an impulse, and is not based on rational thinking. One of the first appearances of the term "action bias" in scientific journals was in a 2000 paper by Patt and Zeckhauser titled "Action Bias and Environmental Decisions", where its relevance in politics was expounded.