Availability cascade

An availability cascade is a self-reinforcing cycle that explains the development of certain kinds of collective beliefs. A novel idea or insight, usually one that seems to explain a complex process in a simple or straightforward manner, gains rapid currency in popular discourse by its very simplicity and apparent insightfulness. Its rising popularity triggers a chain reaction within the social network: individuals adopt the new insight because other people within the network have adopted it and because, on its face, it seems plausible. This increased use and popularity stem both from the availability of the previously obscure term or idea and from the need of individuals using it to appear current with the stated beliefs of others, regardless of whether they in fact fully believe the idea they are expressing. Their need for social acceptance, and the apparent sophistication of the new insight, overwhelm their critical thinking.

The idea of the availability cascade was first developed by Timur Kuran and Cass Sunstein as a variation of information cascades mediated by the availability heuristic, with the addition of reputational cascades. [1] The availability cascade concept has been highly influential in finance theory and regulatory research, particularly with respect to assessing and regulating risk.

Cascade elements

Availability cascades occur in a society via public discourse (e.g. the public sphere and the news media) or over social networks—sets of linked actors in one or more of several roles. These actors process incoming information to form their private beliefs according to various rules, both rational and semi-rational. The semi-rational rules include the heuristics, in particular the availability heuristic. The actors then behave and express their public beliefs according to self-interest, which might cause their publicly expressed beliefs to deviate from their privately held beliefs.

Kuran and Sunstein emphasize the role of availability entrepreneurs, agents willing to invest resources into promoting a belief in order to derive some personal benefit. Other availability entrepreneurs with opposing interests may wage availability counter-campaigns. Other key roles include journalists and politicians, both of whom are subject to economic and reputational pressures, the former competing in the media market, the latter for political status. As resources (e.g. attention and money) are limited, beliefs compete with one another in the "availability market". A given incident and subsequent availability campaign may succeed in raising the availability of one issue at the expense of other issues. [1]
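
The dynamic described above can be sketched as a toy threshold model (an illustrative simplification, not Kuran and Sunstein's formal model; all parameters are invented): each actor publicly expresses the belief once the share of actors already expressing it meets that actor's personal threshold, with a few availability entrepreneurs seeding the cascade.

```python
import random

def simulate_cascade(n_actors=100, n_entrepreneurs=5, max_threshold=0.6,
                     steps=50, seed=1):
    """Toy threshold model of an availability cascade.

    Each actor has a personal threshold: the share of publicly expressed
    support needed before the actor also expresses the belief (a low
    threshold models a strong need for social acceptance). A handful of
    "availability entrepreneurs" express the belief unconditionally.
    """
    random.seed(seed)
    thresholds = [random.uniform(0.0, max_threshold) for _ in range(n_actors)]
    expressing = [i < n_entrepreneurs for i in range(n_actors)]
    history = [sum(expressing)]
    for _ in range(steps):
        share = sum(expressing) / n_actors  # current "availability" of the belief
        expressing = [e or thresholds[i] <= share
                      for i, e in enumerate(expressing)]
        history.append(sum(expressing))
        if history[-1] == history[-2]:  # no new adopters: cascade has stabilized
            break
    return history

# Adoption typically snowballs from the small seed toward universal expression.
print(simulate_cascade())
```

Because every newly expressed belief lowers the bar for the next actor, a small seed can tip the whole population, which is the self-reinforcing cycle the concept describes.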

Belief formation

Dual process theory posits that human reasoning is divided into two systems, often called System 1 and System 2. System 1 is automatic and unconscious; other terms used for it include the implicit system, the experiential system, the associative system, and the heuristic system. System 2 is evolutionarily recent and specific to humans, performing slower, more sequential thinking. It is also known as the explicit system, the rule-based system, the rational system, or the analytic system. In The Happiness Hypothesis, Jonathan Haidt refers to System 1 and System 2 as the elephant and the rider: while human beings incorporate reason into their beliefs, whether by using facts and logic directly or by applying them as a test to hypotheses formed by other means, it is the elephant that is really in charge.

Cognitive biases

Heuristics are simple, efficient rules that people often use to form judgments and make decisions. They are mental shortcuts that replace a complex problem with a simpler one. These rules work well under most circumstances, but they can lead to systematic deviations from logic, probability, or rational choice theory. The resulting errors are called "cognitive biases", and many different types have been documented. These have been shown to affect people's choices in situations like valuing a house or deciding the outcome of a legal case. Heuristics usually govern automatic, intuitive judgments but can also be used as deliberate mental strategies when working from limited information. While seemingly irrational, the cognitive biases may be interpreted as the result of bounded rationality, with human beings making decisions while economizing time and effort.

Kuran and Sunstein describe the availability heuristic as more fundamental than the other heuristics: besides being important in its own right, it enables and amplifies the others, including framing, representativeness, anchoring, and reference points. [1]

Availability heuristic

Even educated human beings are notoriously poor at thinking statistically. [2] The availability heuristic, first identified by Daniel Kahneman and Amos Tversky, is a mental shortcut that occurs when people judge the probability of events by how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." Availability can be influenced by the emotional power of examples and by their perceived frequency; while personal, first-hand incidents are more available than those that happened to others, availability can be skewed by the media. In his book Thinking, Fast and Slow, Kahneman cites the examples of celebrity divorces and airplane crashes; both are more often reported by the media, and thus tend to be exaggerated in perceived frequency. [3]
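
The media-skew effect Kahneman describes can be illustrated numerically (the counts and coverage rates below are invented for illustration): if judged frequency tracks the number of reports that come to mind rather than the number of underlying events, heavily covered events are grossly inflated.

```python
# Invented, illustrative figures: annual event counts and the share of
# events that receive media coverage.
true_counts = {"road deaths": 40_000, "plane crash deaths": 350}
coverage_rate = {"road deaths": 0.001, "plane crash deaths": 0.9}

# Reports "available" to memory, versus the true relative frequencies.
reports = {k: true_counts[k] * coverage_rate[k] for k in true_counts}
for event in true_counts:
    perceived = reports[event] / sum(reports.values())
    actual = true_counts[event] / sum(true_counts.values())
    print(f"{event}: perceived {perceived:.0%}, actual {actual:.0%}")
# → road deaths: perceived 11%, actual 99%
# → plane crash deaths: perceived 89%, actual 1%
```

With these numbers, an availability-based judgment ranks plane crashes as the dominant risk even though road deaths outnumber them by two orders of magnitude.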

Examples

An important class of judgments concerns risk: the expected harm from a given threat, a function of the threat's likelihood and impact. Changes in perceived risk result in risk compensation—correspondingly more or less mitigation, including precautionary measures and support for regulation. Kuran and Sunstein offer three examples of availability cascades—Love Canal, the Alar scare, and TWA Flight 800—in which a spreading public panic led to growing calls for increasingly expensive government action to deal with risks that later turned out to be grossly exaggerated. [1] Others have used the term "culture of fear" to refer to the habitual use of such fear appeals to achieve goals, notably in the case of the threat of terrorism.

Disease threats

In the early years of the HIV/AIDS epidemic, many believed that the disease received less attention than warranted, in part due to the stigma attached to its sufferers. Since that time, advocates—availability entrepreneurs including LGBT activists and the conservative Surgeon General of the United States C. Everett Koop—have succeeded in raising awareness and securing significant funding. Similarly, awareness and funding for breast cancer and prostate cancer are high, thanks in part to the availability of these diseases. Other prevalent diseases competing for funding but lacking the availability of HIV/AIDS or cancer include lupus, sickle-cell anemia, and tuberculosis. [4]

Vaccination scares

The MMR vaccine controversy was an example of an unwarranted health scare. It was triggered by the publication in 1998 of a paper in the medical journal The Lancet which presented apparent evidence that autism spectrum disorders could be caused by the MMR vaccine, an immunization against measles, mumps and rubella. [5] In 2004, investigations by Sunday Times journalist Brian Deer revealed that the lead author of the article, Andrew Wakefield, had multiple undeclared conflicts of interest, [6] had manipulated evidence, [7] and had broken other ethical codes. The Lancet paper was partially retracted in 2004 and fully retracted in 2010, and Wakefield was found guilty of professional misconduct. The scientific consensus is that no evidence links the vaccine to the development of autism, and that the vaccine's benefits greatly outweigh its risks. The claims in Wakefield's 1998 The Lancet article were widely reported; [8] vaccination rates in the UK and Ireland dropped sharply, [9] which was followed by significantly increased incidence of measles and mumps, resulting in deaths and severe and permanent injuries. [10] Reaction to vaccine controversies has contributed to a significant increase in preventable diseases including measles [11] and pertussis (whooping cough), which in 2011 experienced its worst outbreak in 70 years as a result of reduced vaccination rates. [12] Concerns about immunization safety often follow a pattern: some investigators suggest that a medical condition is an adverse effect of vaccination; a premature announcement is made of the alleged adverse effect; the initial study is not reproduced by other groups; and finally, it takes several years to regain public confidence in the vaccine. [13]

Global warming

Extreme weather events provide opportunities to raise the availability of global warming. In the United States, the mass media devoted little coverage to global warming until the drought of 1988 and the testimony of James E. Hansen before the United States Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming. [14] The global warming controversy has attracted availability entrepreneurs on both sides: for example, the authors of Merchants of Doubt argue that scientific consensus was reached long ago, while climatologist Patrick Michaels provides the denialist viewpoint.

Gun violence

The media's inclination toward sensationalism results in a tendency to devote disproportionate coverage to sympathetic victims (e.g. missing white woman syndrome), terrifying assailants (e.g. the media coverage of the Virginia Tech massacre), and incidents with multiple victims. Although half the victims of gun violence in the United States are black, generally young urban black males, [15] media coverage and public awareness spike after suburban school shootings, as do calls for stricter gun control laws.

International adoption scandals

International adoption scandals receive disproportionate attention in the countries of adoptees' origins. As the incidents involve abuse of children, they easily spark media attention, and availability entrepreneurs (e.g. populist politicians) fan the flames of xenophobia, without making statistical comparisons of adoptee abuse in the source and target nations, or of the likelihood of abuse vs. other risks. [16]

Poisoned candy myths

Poisoned candy myths are urban legends that malevolent individuals could hide poison or drugs, or sharp objects such as razor blades, needles, or broken glass in candy and distribute the candy in order to harm random children, especially during Halloween trick-or-treating. Several events fostered the candy tampering myth. The first took place in 1964, when an annoyed Long Island, New York housewife started giving out packages of inedible objects to children who she believed were too old to be trick-or-treating. The packages contained items such as steel wool, dog biscuits, and ant buttons (which were clearly labeled with the word "poison"). Although nobody was injured, she was prosecuted and pleaded guilty to endangering children. The same year saw reports of lye-filled bubble gum being handed out in Detroit and rat poison being given in Philadelphia. [17]

The second milestone in the spread of the candy-tampering myths was an article published in The New York Times in 1970. It claimed that "Those Halloween goodies that children collect this weekend on their rounds of ‘trick or treating’ may bring them more horror than happiness", and provided specific examples of potential tampering. [18]

In 2008, candy was found with metal shavings and metal blades embedded in it. The candy was Pokémon Valentine's Day lollipops purchased from a Dollar General store in Polk County, Florida. The candy was determined to have been manufactured in China and not tampered with within the United States. The lollipops were pulled from the shelves after a mother reported a blade in her child's lollipop and after several more lollipops with metal shavings in them were confiscated from a local elementary school. [19] Also in 2008, some cold medicine was discovered in cases of Smarties that were handed out to children in Ontario. [20]

Over the years, various experts have tried to debunk the various candy tampering stories. Among this group is Joel Best, a University of Delaware sociologist who specializes in investigating candy tampering legends. In his studies, and in his book Threatened Children: Rhetoric and Concern about Child-Victims, he searched newspapers from 1958 onward for reports of candy tampering. [21] Of these stories, fewer than 90 instances might have qualified as actual candy tampering. Best has found five child deaths that local authorities initially attributed to homicidal strangers, but none of those attributions was sustained by investigation. [22]

Despite the falsity of these claims, the news media promoted the story continuously throughout the 1980s, with local news stations featuring frequent coverage. During this time, cases of poisoning were repeatedly reported based on unsubstantiated claims, or before a full investigation could be completed, and were often never followed up. This one-sided coverage contributed to the overall panic and caused rival media outlets to issue reports of candy tampering as well. By 1985, the media had driven the hysteria about candy poisonings to such a point that an ABC News/The Washington Post poll found that 60% of parents feared their children would be injured or killed because of Halloween candy sabotage.

Media feeding frenzy

The phenomenon of media feeding frenzies is driven by a combination of the psychology described by the availability cascade model and the financial imperatives of media organizations to retain their funding.

Policy implications

Technocracy vs. democracy

There are two schools of thought on how to cope with risks raised by availability cascades: technocratic and democratic. The technocratic approach, championed by Kuran and Sunstein, emphasizes assessing, prioritizing, and mitigating risks according to objective risk measures (e.g. expected costs, expected disability-adjusted life years (DALY)). The technocratic approach considers availability cascades to be phenomena of mass irrationality that can distort or hijack public policy, misallocating resources or imposing regulatory burdens whose costs exceed the expected costs of the risks they mitigate.

The democratic approach, championed by Paul Slovic, respects risk preferences as revealed by the availability market. For example, though lightning strikes kill far more people each year than shark attacks, if people genuinely consider death by shark worse than death by lightning, a disproportionate share of resources should be devoted to averting shark attacks.
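
The contrast between the two approaches can be made concrete with a small worked sketch (all figures are illustrative assumptions, not official statistics): the technocratic approach ranks hazards by expected deaths, while the democratic approach weights each death by the public's revealed dread of that cause.

```python
# Illustrative, invented figures: rough annual death counts and a
# hypothetical "dread" multiplier expressing how much worse the public
# considers each kind of death.
annual_deaths = {"lightning": 25.0, "shark attack": 1.0}
dread_weight = {"lightning": 1.0, "shark attack": 40.0}

# Technocratic ranking: objective expected harm.
technocratic_priority = max(annual_deaths, key=annual_deaths.get)

# Democratic ranking: expected harm weighted by revealed preferences.
dread_adjusted = {k: annual_deaths[k] * dread_weight[k] for k in annual_deaths}
democratic_priority = max(dread_adjusted, key=dread_adjusted.get)

print(technocratic_priority)  # → lightning
print(democratic_priority)    # → shark attack
```

With these assumed weights the two approaches reverse their priorities, which is exactly the disagreement between Kuran and Sunstein's position and Slovic's.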

Institutional safeguards

Kuran and Sunstein recommend that availability cascades be recognized, and institutional safeguards be implemented in all branches of government. They recommend expanded product defamation laws, analogous to personal libel laws, to discourage availability entrepreneurs from knowingly spreading false and damaging reports about a product. They recommend that the legislative branch create a Risk Regulation Committee to assess risks in a broader context and perform cost-benefit analyses of risks and regulations, avoiding hasty responses pandering to public opinion. They recommend that the executive branch use peer review to open agency proposals to scrutiny by informed outsiders. They also recommend the creation of a Risk Information Center with a Risk Information Web Site to provide the public with objective risk measures. [1] In the United States, the Centers for Disease Control and Prevention [23] and the Federal Bureau of Investigation [24] maintain web sites that provide objective statistics on the causes of death and violent crime.

References

  1. Kuran, Timur, and Sunstein, Cass, Availability Cascades and Risk Regulation, Stanford Law Review, Vol. 51, No. 4 (1999).
  2. Lehrer, Jonah (2012-06-12). "Why Smart People Are Stupid". The New Yorker.
  3. Daniel Kahneman (2011-10-25). Thinking, Fast and Slow. Macmillan. ISBN   978-1-4299-6935-2 . Retrieved 2013-02-12.
  4. Brower, Vicki (2005). "The squeaky wheel gets the grease". EMBO Reports. 6 (11): 1014–1017. doi:10.1038/sj.embor.7400564. PMC   1371042 . PMID   16264425.
  5. Wakefield A, Murch S, Anthony A, et al. (1998). "Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children". Lancet. 351 (9103): 637–41. doi:10.1016/S0140-6736(97)11096-0. PMID   9500320. S2CID   439791 . Retrieved 2007-09-05. (Retracted, see doi:10.1016/S0140-6736(10)60175-4, PMID   20137807)
  6. The Sunday Times 2004:
  7. Deer B (8 February 2009). "MMR doctor Andrew Wakefield fixed data on autism". The Sunday Times. London. Retrieved 2009-02-09.
  8. Goldacre B (30 August 2008). "The MMR hoax". The Guardian. London. Archived from the original on 6 February 2015. Retrieved 2008-08-30. Alt URL
  9. McIntyre P, Leask J (2008). "Improving uptake of MMR vaccine". BMJ. 336 (7647): 729–30. doi:10.1136/bmj.39503.508484.80. PMC   2287215 . PMID   18309963.
  10. Pepys MB (2007). "Science and serendipity". Clin Med. 7 (6): 562–78. doi:10.7861/clinmedicine.7-6-562. PMC   4954362 . PMID   18193704.
  11. Boston Children's Hospital Archived March 9, 2013, at the Wayback Machine
  12. Heffter, Emily (June 2, 2011). "State leads nation in kids who aren't getting vaccines". Seattle Times . Archived from the original on January 16, 2013. Retrieved September 28, 2012.
  13. Bonhoeffer J, Heininger U (2007). "Adverse events following immunization: perception and evidence". Current Opinion in Infectious Diseases. 20 (3): 237–46. doi:10.1097/QCO.0b013e32811ebfb0. PMID   17471032. S2CID   40669829.
  14. McCright, A.M.; Dunlap R.E. (2000). "Challenging global warming as a social problem: An analysis of the conservative movement's counter-claims" (PDF). Social Problems. 47 (4): 499–522. doi:10.1525/sp.2000.47.4.03x0305s. JSTOR   3097132. See p. 500.
  15. "Expanded Homicide Data". Uniform Crime Reports. 2010.
  16. Montgomery, Mark; Powell, Irene (2018-03-01). "International adoptions have dropped 72 percent since 2005 – here's why". The Conversation. Retrieved 2020-01-22.
  17. "Deadly 'Tricks' Given Children in 3 States". The Milwaukee Journal. United Press International. November 2, 1964. p. A18.
  18. Klemesrud, Judy (October 28, 1970). "Those Treats May Be Tricks". The New York Times. p. 56.
  19. "Metal-Filled Lollipops Seized By Deputies At Elementary School - Orlando News Story - WKMG Orlando". Local6.com. February 14, 2008. Archived from the original on April 20, 2008. Retrieved July 16, 2009.
  20. "Cold medication discovered in Halloween candy". CBC. November 7, 2008. Retrieved November 8, 2008.
  21. Best, Joel (1993). Threatened children : rhetoric and concern about child-victims. Chicago: University of Chicago Press. ISBN   0226044262.
  22. Best, Joel; Gerald T. Horiuchi (1985). "The Razor Blade in the Apple: The Social Construction of Urban Legends". Social Problems. 32 (5): 488–99. doi:10.2307/800777. JSTOR   800777.
  23. "CDC Web-based Injury Statistics Query and Reporting System". 9 February 2023.
  24. "FBI Uniform Crime Reports".
