Neglect of probability

The neglect of probability, a type of cognitive bias, is the tendency to disregard probability when making a decision under uncertainty and is one simple way in which people regularly violate the normative rules for decision making. Small risks are typically either neglected entirely or hugely overrated. The continuum between the extremes is ignored. The term probability neglect was coined by Cass Sunstein. [1]

There are many related ways in which people violate the normative rules of decision making with regard to probability, including the hindsight bias, the neglect of prior base rates effect, and the gambler's fallacy. Neglect of probability differs, however, in that the actor does not use probability incorrectly but disregards it altogether.

"We have no intuitive grasp of risk and thus distinguish poorly among different threats," Dobelli has written. "The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us." [2]

Studies

Adults

In a 1972 experiment, participants were divided into two groups, with the former being told they would receive a mild electric shock and the latter told that there was a 50 percent chance they would receive such a shock. When the subjects' physical anxiety was measured, there was no difference between the two groups. This lack of difference remained even when the second group's chance of being shocked was lowered to 20 percent, then 10, then 5. The conclusion: "we respond to the expected magnitude of an event...but not to its likelihood. In other words: We lack an intuitive grasp of probability." [2]

Baron (2000) suggests that among adults the bias manifests itself especially in difficult choices, such as medical decisions. It can lead decision makers to violate expected-utility theory drastically, particularly when one possible outcome has a much lower or higher utility than the others but only a small probability of occurring (e.g. in medical or gambling situations). In this respect, the neglect of probability bias is similar to the neglect of prior base rates effect.

Cass Sunstein has cited the history of Love Canal in upstate New York, which became world-famous in the late 1970s owing to widely publicized public concerns about abandoned waste that was supposedly causing medical problems in the area. In response to these concerns, the U.S. federal government set in motion "an aggressive program for cleaning up abandoned hazardous waste sites, without examining the probability that illness would actually occur," and legislation was passed that did not reflect serious study of the actual degree of danger. Furthermore, when controlled studies were publicized showing little evidence that the waste represented a menace to public health, the anxiety of local residents did not diminish. [3]

One University of Chicago study showed that people are as afraid of a 1% chance as of a 99% chance of contamination by poisonous chemicals. [2] In another example of near-total neglect of probability, Rottenstreich and Hsee (2001) found that the typical subject was willing to pay $10 to avoid a 99% chance of a painful electric shock, and $7 to avoid a 1% chance of the same shock. They suggest that probability is more likely to be neglected when the outcomes are emotion-arousing.
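
Read against a simple expected-cost benchmark, these payments show how little the stated probability mattered. The sketch below is only an illustration: the dollar value assigned to avoiding a certain shock is an assumed placeholder, not a figure reported in the study.

```python
# Illustrative comparison of the observed payments with an expected-cost benchmark.
# The monetized "cost" of the shock is an assumed placeholder.
shock_cost = 10.0  # assume avoiding a certain shock is worth about $10

for p, observed_payment in [(0.99, 10.0), (0.01, 7.0)]:
    expected_cost = p * shock_cost  # benchmark: scale the cost by its probability
    print(f"p = {p:.2f}: benchmark ~ ${expected_cost:.2f}, "
          f"observed willingness to pay = ${observed_payment:.2f}")

# p = 0.99: benchmark ~ $9.90, observed willingness to pay = $10.00
# p = 0.01: benchmark ~ $0.10, observed willingness to pay = $7.00
```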

In 2013, Tom Cagley noted that neglect of probability is "common in IT organizations that are planning and estimating projects or in risk management." He pointed out that techniques such as Monte Carlo analysis are available for studying probability, but too often "the continuum of probability is ignored." [4]
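
As a rough sketch of the kind of Monte Carlo analysis Cagley refers to, the simulation below estimates the probability that a small project misses its deadline; the task names, duration estimates, and triangular distributions are illustrative assumptions, not taken from his article.

```python
import random

# Illustrative three-point task estimates (optimistic, most likely, pessimistic), in days.
tasks = {
    "design": (3, 5, 10),
    "build":  (8, 12, 25),
    "test":   (4, 6, 15),
}
deadline = 25  # days
trials = 100_000

overruns = 0
for _ in range(trials):
    total = sum(random.triangular(low, high, mode) for low, mode, high in tasks.values())
    if total > deadline:
        overruns += 1

print(f"Estimated probability of missing the {deadline}-day deadline: {overruns / trials:.1%}")
```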

In 2016, Dobelli presented a choice between two games of chance: in one, you have a one in 100 million chance of winning $10 million; in the other, a one in 10,000 chance of winning $10,000. Choosing the second game is more reasonable, since its expected value is ten times higher, yet most people choose the first. This preference for enormous but wildly improbable prizes is one reason lottery jackpots keep growing. [2]
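
The expected values behind this comparison can be worked out directly; the short calculation below is an illustration added here, not part of Dobelli's text.

```python
# Expected value of each of the two games of chance.
games = {
    "Game 1: $10,000,000 at odds of 1 in 100,000,000": (10_000_000, 1 / 100_000_000),
    "Game 2: $10,000 at odds of 1 in 10,000": (10_000, 1 / 10_000),
}

for name, (prize, probability) in games.items():
    print(f"{name}: expected value = ${prize * probability:.2f}")

# Game 1: expected value = $0.10
# Game 2: expected value = $1.00
```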

Dobelli has described the U.S. Food Act of 1958 as a "classic example" of neglect of probability. The law – which prohibited carcinogenic substances in food, no matter how low the probability that they would in fact result in cancer – led to the substitution of those substances by ingredients that, while not causing cancer, stood a far greater chance of causing some sort of medical harm. [2]

In 2001, there was widespread panic around the U.S. over shark attacks, even though there was no evidence to show any increase in their occurrence. Legislation was actually enacted to address the issue. [3] Neglect of probability also figures in the purchase of lottery tickets. [4] Cass Sunstein has pointed out that terrorism is effective partly because of probability neglect. [3] "Terrorists show a working knowledge of probability neglect," he wrote in 2003, "producing public fear that might greatly exceed the discounted harm." [5]

Children

Neglect of probability is especially pronounced among children. In a 1993 study, Baron, Granato, Spranca, and Teubal presented children with the following situation:

Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn't... Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

Jonathan Baron (2000) notes that subject X responded in the following manner:

A: Well, in that case I don't think you should wear a seat belt.
Q (interviewer): How do you know when that's gonna happen?
A: Like, just hope it doesn't!
Q: So, should you or shouldn't you wear seat belts?
A: Well, tell-you-the-truth we should wear seat belts.
Q: How come?
A: Just in case of an accident. You won't get hurt as much as you will if you didn't wear a seat belt.
Q: OK, well what about these kinds of things, when people get trapped?
A: I don't think you should, in that case.

Subject X clearly disregards the probability of an accident occurring relative to the probability of being harmed by the seat belt itself. A normative model for this decision would apply expected-utility theory: weight the change in utility associated with each possible outcome by the probability of that outcome, and choose the option with the higher expected utility. This weighting is precisely what subject X ignores.
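
A minimal sketch of that normative calculation follows. Every probability and utility in it is an invented placeholder for illustration; the study reports no such values.

```python
# Illustrative expected-utility comparison for the seat-belt decision.
# All probabilities and utilities are made-up placeholders.
p_accident = 0.01         # chance of an accident on a given trip
p_belt_traps = 0.001      # chance, given an accident, that the belt itself causes harm

u_unhurt = 0              # baseline utility
u_injured_no_belt = -100  # typical unbelted injury
u_injured_belt = -20      # typical belted injury
u_trapped_by_belt = -100  # rare case where the belt prevents escape

eu_belt = (1 - p_accident) * u_unhurt + p_accident * (
    (1 - p_belt_traps) * u_injured_belt + p_belt_traps * u_trapped_by_belt
)
eu_no_belt = (1 - p_accident) * u_unhurt + p_accident * u_injured_no_belt

print(f"Expected utility with belt:    {eu_belt:.3f}")
print(f"Expected utility without belt: {eu_no_belt:.3f}")
# With these numbers wearing the belt wins; neglecting the probabilities,
# as subject X does, treats the rare "trapped by the belt" outcome as decisive.
```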

Another subject responded to the same question:

A: If you have a long trip, you wear seat belts half way.
Q: Which is more likely?
A: That you'll go flyin' through the windshield.
Q: Doesn't that mean you should wear them all the time?
A: No, it doesn't mean that.
Q: How do you know if you're gonna have one kind of accident or the other?
A: You don't know. You just hope and pray that you don't.

Again, the subject disregards probability in making the decision, treating the possible outcomes as equally likely in his reasoning.

Practical consequences

Cass Sunstein has noted that for a long time after 9/11, many people refused to fly because they felt a heightened sense of fear or peril, even though, statistically, most of them "were not at significantly more risk after the attacks than they were before." Indeed, those who chose to drive long distances instead of flying thereby put themselves at an increased risk, given that driving is the less safe form of transportation. [3]

In a 2001 paper, Sunstein addressed the question of how the law should respond to the neglect of probability. He emphasized that it is important for government to "create institutions designed to ensure that genuine risks, rather than tiny ones, receive the most concern". While government policies in regard to potential dangers should focus on statistics and probabilities, government efforts to raise public awareness of these dangers should emphasize worst-case scenarios in order to be maximally effective. Moreover, while it seems advisable for government to "attempt to educate and inform people, rather than capitulating to unwarranted public fear", that fear will remain a real phenomenon and may thus cause serious problems, for example leading citizens to undertake "wasteful and excessive private precautions". In such cases, certain kinds of government regulation may be justified not because they address serious dangers but because they reduce fear. At the same time, government should "treat its citizens with respect" and "not treat them as objects to be channeled in government's preferred directions", so focusing on worst-case scenarios that feed on irrational fears would amount to "unacceptable manipulation". [3] In a 2003 article, however, Sunstein concluded that "As a normative matter, government should reduce even unjustified fear, if the benefits of the response can be shown to outweigh the costs." [5]

Related Research Articles

The precautionary principle is a broad epistemological, philosophical and legal approach to innovations with potential for causing harm when extensive scientific knowledge on the matter is lacking. It emphasizes caution, pausing and review before leaping into new innovations that may prove disastrous. Critics argue that it is vague, self-cancelling, unscientific and an obstacle to progress.

Rationality is the quality of being guided by or based on reasons. In this regard, a person acts rationally if they have a good reason for what they do or a belief is rational if it is based on strong evidence. This quality can apply to an ability, as in rational animal, to a psychological process, like reasoning, to mental states, such as beliefs and intentions, or to persons who possess these other forms of rationality. A thing that lacks rationality is either arational, if it is outside the domain of rational evaluation, or irrational, if it belongs to this domain but does not fulfill its standards.

<span class="mw-page-title-main">Sunk cost</span> Cost that has already been incurred and cannot be recovered

In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to future rational decision-making, people in everyday life often take previous expenditures in situations, such as repairing a car or house, into their future decisions regarding those properties.

<span class="mw-page-title-main">Behavioral economics</span> Academic discipline

Behavioral economics studies the effects of psychological, cognitive, emotional, cultural and social factors on the decisions of individuals or institutions, such as how those decisions vary from those implied by classical economic theory.

<span class="mw-page-title-main">Prospect theory</span> Theory of behavioral economics and behavioral finance

Prospect theory is a theory of behavioral economics and behavioral finance that was developed by Daniel Kahneman and Amos Tversky in 1979. The theory was cited in the decision to award Kahneman the 2002 Nobel Memorial Prize in Economics.

Decision theory is a branch of applied probability theory and analytic philosophy concerned with the theory of making decisions based on assigning probabilities to various factors and assigning numerical consequences to the outcome.

The expected utility hypothesis is a popular concept in economics that serves as a reference guide for decisions when the payoff is uncertain. The theory recommends which option rational individuals should choose in a complex situation, based on their risk appetite and preferences.

<span class="mw-page-title-main">Cass Sunstein</span> American legal scholar, writer, blogger

Cass Robert Sunstein is an American legal scholar known for his studies of constitutional law, administrative law, environmental law, and behavioral economics. He is also The New York Times best-selling author of The World According to Star Wars (2016) and Nudge (2008). He was the Administrator of the White House Office of Information and Regulatory Affairs in the Obama administration from 2009 to 2012.

<span class="mw-page-title-main">Ellsberg paradox</span> Paradox in decision theory

In decision theory, the Ellsberg paradox is a paradox in which people's decisions are inconsistent with subjective expected utility theory. Daniel Ellsberg popularized the paradox in his 1961 paper, “Risk, Ambiguity, and the Savage Axioms”. John Maynard Keynes published a version of the paradox in 1921. It is generally taken to be evidence of ambiguity aversion, in which a person tends to prefer choices with quantifiable risks over those with unknown, incalculable risks.

The Allais paradox is a choice problem designed by Maurice Allais (1953) to show an inconsistency of actual observed choices with the predictions of expected utility theory.

Libertarian paternalism is the idea that it is both possible and legitimate for private and public institutions to affect behavior while also respecting freedom of choice, as well as the implementation of that idea. The term was coined by behavioral economist Richard Thaler and legal scholar Cass Sunstein in a 2003 article in the American Economic Review. The authors further elaborated upon their ideas in a more in-depth article published in the University of Chicago Law Review that same year. They propose that libertarian paternalism is paternalism in the sense that "it tries to influence choices in a way that will make choosers better off, as judged by themselves", even though the concept of paternalism in the strict sense requires a restriction of choice. It is libertarian in the sense that it aims to ensure that "people should be free to opt out of specified arrangements if they choose to do so". The possibility to opt out is said to "preserve freedom of choice". Thaler and Sunstein published Nudge, a book-length defense of this political doctrine, in 2008.

Zero-risk bias is a tendency to prefer the complete elimination of risk in a sub-part over alternatives with greater overall risk reduction. It often manifests in cases where decision makers address problems concerning health, safety, and the environment. Its effect on decision making has been observed in surveys presenting hypothetical scenarios.

Choice architecture is the design of different ways in which choices can be presented to decision makers, and the impact of that presentation on decision-making.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. The international standard definition of risk for common understanding in different applications is “effect of uncertainty on objectives”.

In decision theory, the von Neumann–Morgenstern (VNM) utility theorem shows that, under certain axioms of rational behavior, a decision-maker faced with risky (probabilistic) outcomes of different choices will behave as if he or she is maximizing the expected value of some function defined over the potential outcomes at some specified point in the future. This function is known as the von Neumann–Morgenstern utility function. The theorem is the basis for expected utility theory.

Heuristics is the process by which humans use mental short cuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Risk aversion is a preference for a sure outcome over a gamble with higher or equal expected value. Conversely, the rejection of a sure thing in favor of a gamble of lower or equal expected value is known as risk-seeking behavior.

<span class="mw-page-title-main">Forensic epidemiology</span>

The discipline of forensic epidemiology (FE) is a hybrid of principles and practices common to both forensic medicine and epidemiology. FE is directed at filling the gap between clinical judgment and epidemiologic data for determinations of causality in civil lawsuits and criminal prosecution and defense.

Noise: A Flaw in Human Judgment – 2021 book by Daniel Kahneman, Olivier Sibony, and Cass Sunstein

Noise: A Flaw in Human Judgment is a nonfiction book by professors Daniel Kahneman, Olivier Sibony and Cass Sunstein. It was first published on May 18, 2021. The book concerns 'noise' in human judgment and decision-making. The authors define noise in human judgment as "undesirable variability in judgments of the same problem" and focus on the statistical properties and psychological perspectives of the issue.

References

  1. Kahneman, D. (2011). Thinking, Fast and Slow. Allen Lane, p. 143 f.
  2. "Why You'll Soon Be Playing Mega Trillions". Meaning Ring. 2016-03-28. Retrieved 29 April 2017.
  3. Sunstein, Cass (November 2001). "Probability Neglect: Emotions, Worst Cases, and Law". SSRN 292149.
  4. Cagley, Tom (2013-07-08). "Cognitive Bias". TCagley. Retrieved 29 April 2017.
  5. Sunstein, Cass (March 2003). "Terrorism and Probability Neglect". Journal of Risk and Uncertainty. 26 (2): 121–136. doi:10.1023/A:1024111006336. S2CID 189929493.
