An availability cascade is a self-reinforcing cycle that explains the development of certain kinds of collective beliefs. A novel idea or insight, usually one that seems to explain a complex process in a simple or straightforward manner, gains rapid currency in the popular discourse by its very simplicity and by its apparent insightfulness. Its rising popularity triggers a chain reaction within the social network: individuals adopt the new insight because other people within the network have adopted it, and on its face it seems plausible. The reason for this increased use and popularity of the new idea involves both the availability of the previously obscure term or idea, and the need of individuals using the term or idea to appear to be current with the stated beliefs and ideas of others, regardless of whether they in fact fully believe in the idea that they are expressing. Their need for social acceptance, and the apparent sophistication of the new insight, overwhelm their critical thinking.
The idea of the availability cascade was first developed by Timur Kuran and Cass Sunstein as a variation of information cascades mediated by the availability heuristic, with the addition of reputational cascades. [1] The availability cascade concept has been highly influential in finance theory and regulatory research, particularly with respect to assessing and regulating risk.
Availability cascades occur in a society via public discourse (e.g. the public sphere and the news media) or over social networks—sets of linked actors in one or more of several roles. These actors process incoming information to form their private beliefs according to various rules, both rational and semi-rational. The semi-rational rules include the heuristics, in particular the availability heuristic. The actors then behave and express their public beliefs according to self-interest, which might cause their publicly expressed beliefs to deviate from their privately held beliefs.
Kuran and Sunstein emphasize the role of availability entrepreneurs, agents willing to invest resources into promoting a belief in order to derive some personal benefit. Other availability entrepreneurs with opposing interests may wage availability counter-campaigns. Other key roles include journalists and politicians, both of which are subject to economic and reputational pressures, the former in competition in the media, the latter for political status. As resources (e.g. attention and money) are limited, beliefs compete with one another in the "availability market". A given incident and subsequent availability campaign may succeed in raising the availability of one issue at the expense of other issues. [1]
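The underlying mechanism, in which each actor's public choice is shaped by the public choices already visible, can be illustrated with a toy sequential-choice simulation in the spirit of classic information cascade models. This is a hypothetical sketch, not Kuran and Sunstein's formal model; the function name and all parameters are illustrative:

```python
import random

def simulate_information_cascade(n_agents=50, signal_accuracy=0.6,
                                 claim_is_true=False, rng=None):
    """Toy sequential-choice model of an information cascade: each agent
    sees a noisy private signal about a claim plus the public choices of
    everyone before them, and adopts whichever side the naively counted
    combined evidence favors."""
    rng = rng or random.Random(0)
    choices = []
    for _ in range(n_agents):
        # Private signal: points to the truth with probability `signal_accuracy`.
        signal = claim_is_true if rng.random() < signal_accuracy else not claim_is_true
        adopters = sum(choices)
        rejecters = len(choices) - adopters
        lead = adopters - rejecters
        if lead > 1:
            choice = True       # cascade: the public record swamps the private signal
        elif lead < -1:
            choice = False      # cascade in the other direction
        else:
            choice = signal     # evidence still balanced, the private signal decides
        choices.append(choice)
    return choices

choices = simulate_information_cascade()
print(f"{sum(choices)} of {len(choices)} agents publicly adopt the (false) claim")
```

Under this naive evidence count, two early adopters in a row are enough to lock every later agent into adoption regardless of their private signals, which is how a false belief can sweep a population of individually sceptical actors.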
Dual process theory posits that human reasoning is divided into two systems, often called System 1 and System 2. System 1 is automatic and unconscious; other terms used for it include the implicit system, the experiential system, the associative system, and the heuristic system. System 2 is evolutionarily recent and specific to humans, performing slower, more sequential thinking. It is also known as the explicit system, the rule-based system, the rational system, or the analytic system. In The Happiness Hypothesis, Jonathan Haidt refers to System 1 and System 2 as the elephant and the rider: while human beings incorporate reason into their beliefs, whether via direct use of facts and logic or their application as a test to hypotheses formed by other means, it is the elephant that is really in charge.
Heuristics are simple, efficient rules which people often use to form judgments and make decisions. They are mental shortcuts that replace a complex problem with a simpler one. These rules work well under most circumstances, but they can lead to systematic deviations from logic, probability or rational choice theory. The resulting errors are called "cognitive biases" and many different types have been documented. These have been shown to affect people's choices in situations like valuing a house or deciding the outcome of a legal case. Heuristics usually govern automatic, intuitive judgments but can also be used as deliberate mental strategies when working from limited information. While seemingly irrational, the cognitive biases may be interpreted as the result of bounded rationality, with human beings making decisions while economizing time and effort.
Kuran and Sunstein describe the availability heuristic as more fundamental than the other heuristics: besides being important in its own right, it enables and amplifies the others, including framing, representativeness, anchoring, and reference points. [1]
Even educated human beings are notoriously poor at thinking statistically. [2] The availability heuristic, first identified by Daniel Kahneman and Amos Tversky, is a mental shortcut that occurs when people judge the probability of events by how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." Availability can be influenced by the emotional power of examples and by their perceived frequency; while personal, first-hand incidents are more available than those that happened to others, availability can be skewed by the media. In his book Thinking, Fast and Slow, Kahneman cites the examples of celebrity divorces and airplane crashes; both are more often reported by the media, and thus tend to be exaggerated in perceived frequency. [3]
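The media skew Kahneman describes can be made concrete with a small simulation. This is an illustrative sketch with made-up numbers: recall is modelled as sampling remembered events in proportion to their actual frequency multiplied by a media coverage weight:

```python
import random

def perceived_frequency(actual_counts, media_weight, n_recalled=1000, rng=None):
    """Toy availability-heuristic model: a person estimates how common each
    event type is by sampling from memory, where the chance of recalling an
    event is its actual count multiplied by its media coverage weight."""
    rng = rng or random.Random(1)
    causes = list(actual_counts)
    recall_weights = [actual_counts[c] * media_weight[c] for c in causes]
    recalled = rng.choices(causes, weights=recall_weights, k=n_recalled)
    return {c: recalled.count(c) / n_recalled for c in causes}

# Illustrative numbers only: plane crashes are far rarer but far more covered.
actual = {"car crash": 1000, "plane crash": 10}
coverage = {"car crash": 1, "plane crash": 500}
estimate = perceived_frequency(actual, coverage)
print(estimate)  # plane crashes dominate the *perceived* frequency
```

Although plane crashes make up about 1% of the events in this toy world, heavy coverage makes them the majority of what is recalled, so the judged frequency inverts the actual one.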
An important class of judgments is those concerning risk: the expectation of harm to result from a given threat, a function of the threat's likelihood and impact. Changes in perceived risk result in risk compensation—correspondingly more or less mitigation, including precautionary measures and support for regulation. Kuran and Sunstein offer three examples of availability cascades—Love Canal, the Alar scare, and TWA Flight 800—in which a spreading public panic led to growing calls for increasingly expensive government action to deal with risks that turned out later to be grossly exaggerated. [1] Others have used the term "culture of fear" to refer to the habitual achieving of goals via such fear appeals, notably in the case of the threat of terrorism.
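Expressed as a formula, risk here is expected harm: the probability of the threat occurring times its impact. A minimal sketch with invented numbers shows how a vivid but rare threat can carry less expected harm than a mundane common one:

```python
def expected_harm(annual_probability, impact):
    """Risk as expected harm: the threat's annual likelihood times its
    impact (here, harm per occurrence in arbitrary units)."""
    return annual_probability * impact

# Illustrative, made-up numbers.
threats = {
    "vivid rare threat": expected_harm(annual_probability=0.001, impact=200),
    "mundane common threat": expected_harm(annual_probability=0.5, impact=2),
}
for name, harm in sorted(threats.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {harm:.2f} expected harm per year")
```

Availability cascades distort the likelihood term of this calculation: the panic raises the perceived probability, so the perceived risk, far above the objective figure.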
In the early years of the HIV/AIDS epidemic, many believed that the disease received less attention than warranted, in part due to the stigma attached to its sufferers. Since that time advocates—availability entrepreneurs including LGBT activists and the conservative Surgeon General of the United States C. Everett Koop—have succeeded in raising awareness to achieve significant funding. Similarly, awareness and funding for breast cancer and prostate cancer are high, thanks in part to the high availability of these diseases in the public consciousness. Other prevalent diseases competing for funding but lacking the availability of HIV/AIDS or cancer include lupus, sickle-cell anemia, and tuberculosis. [4]
The MMR vaccine controversy was an example of an unwarranted health scare. It was triggered by the publication in 1998 of a paper in the medical journal The Lancet which presented apparent evidence that autism spectrum disorders could be caused by the MMR vaccine, an immunization against measles, mumps and rubella. [5] In 2004, investigations by Sunday Times journalist Brian Deer revealed that the lead author of the article, Andrew Wakefield, had multiple undeclared conflicts of interest, [6] had manipulated evidence, [7] and had broken other ethical codes. The Lancet paper was partially retracted in 2004 and fully retracted in 2010, and Wakefield was found guilty of professional misconduct. The scientific consensus is that no evidence links the vaccine to the development of autism, and that the vaccine's benefits greatly outweigh its risks. The claims in Wakefield's 1998 The Lancet article were widely reported; [8] vaccination rates in the UK and Ireland dropped sharply, [9] which was followed by significantly increased incidence of measles and mumps, resulting in deaths and severe and permanent injuries. [10] Reaction to vaccine controversies has contributed to a significant increase in preventable diseases including measles [11] and pertussis (whooping cough), which in 2011 experienced its worst outbreak in 70 years as a result of reduced vaccination rates. [12] Concerns about immunization safety often follow a pattern: some investigators suggest that a medical condition is an adverse effect of vaccination; a premature announcement is made of the alleged adverse effect; the initial study is not reproduced by other groups; and finally, it takes several years to regain public confidence in the vaccine. [13]
Extreme weather events provide opportunities to raise the availability of global warming. In the United States, the mass media devoted little coverage to global warming until the drought of 1988 and the testimony of James E. Hansen to the United States Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming. [14] The global warming controversy has attracted availability entrepreneurs on both sides, e.g. the authors of the book Merchants of Doubt, who argue that a scientific consensus was reached long ago, and climatologist Patrick Michaels, who provides the denialist viewpoint.
The media inclination to sensationalism results in a tendency to devote disproportionate coverage to sympathetic victims (e.g. missing white woman syndrome), terrifying assailants (e.g. the media coverage of the Virginia Tech massacre), and incidents with multiple victims. Although half the victims of gun violence in the United States are black, generally young urban black males, [15] media coverage and public awareness spike after suburban school shootings, as do calls for stricter gun control laws.
International adoption scandals receive disproportionate attention in the countries of adoptees' origins. As the incidents involve abuse of children, they easily spark media attention, and availability entrepreneurs (e.g. populist politicians) fan the flames of xenophobia, without making statistical comparisons of adoptee abuse in the source and target nations, or of the likelihood of abuse vs. other risks. [16]
Poisoned candy myths are urban legends claiming that malevolent individuals hide poison, drugs, or sharp objects such as razor blades, needles, or broken glass in candy and distribute it in order to harm random children, especially during Halloween trick-or-treating. Several events fostered the candy tampering myth. The first took place in 1964, when an annoyed Long Island, New York housewife started giving out packages of inedible objects to children who she believed were too old to be trick-or-treating. The packages contained items such as steel wool, dog biscuits, and ant buttons (which were clearly labeled with the word "poison"). Although nobody was injured, she was prosecuted and pleaded guilty to endangering children. The same year saw reports of lye-filled bubble gum being handed out in Detroit and rat poison being given in Philadelphia. [17]
The second milestone in the spread of the candy-tampering myths was an article published in The New York Times in 1970. It claimed that "Those Halloween goodies that children collect this weekend on their rounds of ‘trick or treating’ may bring them more horror than happiness", and provided specific examples of potential tampering. [18]
In 2008, candy was found with metal shavings and metal blades embedded in it. The candy was Pokémon Valentine's Day lollipops purchased from a Dollar General store in Polk County, Florida. The candy was determined to have been manufactured in China and had not been tampered with inside the United States. The lollipops were pulled from the shelves after a mother reported a blade in her child's lollipop and after several more lollipops with metal shavings in them were confiscated from a local elementary school. [19] Also in 2008, some cold medicine was discovered in cases of Smarties that were handed out to children in Ontario. [20]
Over the years, various experts have tried to debunk the various candy tampering stories. Among this group is Joel Best, a University of Delaware sociologist who specializes in investigating candy tampering legends. In his studies, and in his book Threatened Children: Rhetoric and Concern about Child-Victims, he searched newspapers from 1958 onward for reports of candy tampering. [21] He found fewer than 90 instances that might have qualified as actual candy tampering. Best has also found five child deaths that local authorities initially attributed to homicidal strangers, but none of those attributions was sustained by investigation. [22]
Despite the falsity of these claims, the news media promoted the story continuously throughout the 1980s, with local news stations featuring frequent coverage. During this time, cases of poisoning were repeatedly reported based on unsubstantiated claims or before a full investigation could be completed, and were often never followed up. This one-sided coverage contributed to the overall panic and caused rival media outlets to issue reports of candy tampering as well. By 1985, the media had driven the hysteria about candy poisonings to such a point that an ABC News/The Washington Post poll found that 60% of parents feared that their children would be injured or killed by Halloween candy sabotage.
The phenomenon of media feeding frenzies is driven by a combination of the psychology described by the availability cascade model and the financial imperatives of media organizations to retain their funding.
There are two schools of thought on how to cope with risks raised by availability cascades: technocratic and democratic. The technocratic approach, championed by Kuran and Sunstein, emphasizes assessing, prioritizing, and mitigating risks according to objective risk measures (e.g. expected costs, expected disability-adjusted life years (DALY)). The technocratic approach considers availability cascades to be phenomena of mass irrationality that can distort or hijack public policy, misallocating resources or imposing regulatory burdens whose costs exceed the expected costs of the risks they mitigate.
The democratic approach, championed by Paul Slovic, respects risk preferences as revealed by the availability market. For example, though lightning strikes kill far more people each year than shark attacks, if people genuinely consider death by shark worse than death by lightning, a disproportionate share of resources should be devoted to averting shark attacks.
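The difference between the two approaches can be sketched as two budget allocation rules. The figures are rough order-of-magnitude illustrations (lightning does kill far more people per year than sharks), and the dread multiplier is a hypothetical stand-in for the public's revealed preference:

```python
def allocate(budget, weights):
    """Split a fixed budget in proportion to each risk's weight."""
    total = sum(weights.values())
    return {risk: budget * w / total for risk, w in weights.items()}

# Rough annual US death tolls, order of magnitude only.
annual_deaths = {"lightning": 20, "shark attack": 1}
# Hypothetical dread multiplier: the public judges a shark death far worse.
dread = {"lightning": 1, "shark attack": 50}

# Technocratic: weight by objective expected deaths alone.
technocratic = allocate(100.0, annual_deaths)
# Democratic: weight by deaths scaled by the public's dread of each risk.
democratic = allocate(100.0, {k: annual_deaths[k] * dread[k] for k in annual_deaths})
print(technocratic)  # most of the budget goes to lightning
print(democratic)    # most of the budget goes to sharks
```

The same data thus yield opposite priorities depending on whether the dread term is treated as a bias to be corrected (technocratic) or a preference to be honored (democratic).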
Kuran and Sunstein recommend that availability cascades be recognized, and institutional safeguards be implemented in all branches of government. They recommend expanded product defamation laws, analogous to personal libel laws, to discourage availability entrepreneurs from knowingly spreading false and damaging reports about a product. They recommend that the legislative branch create a Risk Regulation Committee to assess risks in a broader context and perform cost-benefit analyses of risks and regulations, avoiding hasty responses pandering to public opinion. They recommend that the executive branch use peer review to open agency proposals to scrutiny by informed outsiders. They also recommend the creation of a Risk Information Center with a Risk Information Web Site to provide the public with objective risk measures. [1] In the United States, the Centers for Disease Control and Prevention [23] and the Federal Bureau of Investigation [24] maintain web sites that provide objective statistics on the causes of death and violent crime.