In public policy, the outrage factor is public opposition to a policy that is not based on knowledge of its technical details. The term "outrage factor" originates from Peter Sandman's 1993 book, Responding to Community Outrage: Strategies for Effective Risk Communication.[1][2][3]
"Outrage factors" are the emotional factors that influence perception of risk. The risks that are considered involuntary, industrial and unfair are often given more weight than factors that are thought of as voluntary, natural and fair.
Sandman gives the formula:[4]
Risk = Hazard + Outrage
The following outrage factors are listed in Covello and Sandman's 2001 article, Risk Communication: Evolution and Revolution:
Factor | Risks considered to… | Are less acceptable than… |
Voluntariness [5] | Be involuntary or imposed | Risks from voluntary activities |
Controllability [5] | Be under the control of others | Risks under individual control |
Familiarity [5] | Be unfamiliar | Risks associated with familiar activities |
Fairness [5] | Be unfair or involve unfair processes | Risks from fair activities |
Benefits [5] | Have unclear, questionable, or diffused personal or economic benefits | Risks from activities with clear benefits |
Catastrophic potential [5] | Have the potential to cause a significant number of deaths and injuries at once | Risks from activities that cause deaths and injuries at random or over a long period of time |
Understanding [5] | Be poorly understood | Well understood or self-explanatory risks |
Uncertainty [5] | Be relatively unknown or are highly uncertain | Risks from activities that appear to be relatively well known to science |
Delayed effects [5] | Have delayed effects | Risks from activities that have immediate effects |
Effects on children [5] | Put children specifically at risk | Risks that appear to primarily affect adults |
Effects on future generations [5] | Pose a threat to future generations | Risks from activities that do not |
Victim identity [5] | Produce identifiable victims | Risks that produce statistical victims |
Dread [5] | Evoke fear, terror, or anxiety | Risks from activities that don’t arouse such feelings and emotions |
Trust [5] | Be associated with individuals, institutions, or organizations lacking in trust and credibility | Risks from activities associated with those that are trustworthy and credible |
Media attention [5] | Receive considerable media coverage | Risks from activities that receive little coverage |
Accident history [5] | Have a history of major accidents or frequent minor accidents | Risks from activities with little to no such history |
Reversibility [5] | Have potentially irreversible adverse effects | Risks from activities considered to have reversible adverse effects |
Personal stake [5] | Place people or their families personally and directly at risk | Risks from activities that pose no direct or personal threat |
Ethical/moral nature [5] | Be ethically objectionable or morally wrong | Risks from ethically neutral activities |
Human vs. natural origin [5] | Be generated by human action, failure, or incompetence | Risks believed to be caused by nature or “Acts of God” |
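Sandman's formula and the factor table above are qualitative, but a rough sense of how they might be operationalized can be sketched in code. The Python snippet below is a minimal toy model, assuming a hazard score on an arbitrary 0-to-1 scale and treating outrage as the fraction of the listed factors judged present; the factor names, scales, and equal weighting are assumptions made here for illustration only and are not part of Sandman's or Covello's published methodology.

```python
# Illustrative toy model inspired by Sandman's "Risk = Hazard + Outrage"
# and the outrage factors listed above. Factor names, weights, and scales
# are assumptions for illustration, not part of the published framework.

OUTRAGE_FACTORS = [
    "involuntary", "uncontrollable", "unfamiliar", "unfair",
    "unclear_benefits", "catastrophic_potential", "poorly_understood",
    "uncertain", "delayed_effects", "affects_children",
    "affects_future_generations", "identifiable_victims", "dreaded",
    "untrusted_source", "heavy_media_coverage", "accident_history",
    "irreversible", "personal_stake", "morally_objectionable", "human_made",
]

def outrage_score(present_factors: set[str]) -> float:
    """Fraction of the listed outrage factors judged present (0.0 to 1.0)."""
    return len(present_factors & set(OUTRAGE_FACTORS)) / len(OUTRAGE_FACTORS)

def perceived_risk(hazard: float, present_factors: set[str]) -> float:
    """Toy reading of 'Risk = Hazard + Outrage', with both terms on a 0-1 scale."""
    return hazard + outrage_score(present_factors)

# A technically modest hazard that is involuntary, human-made, and puts
# children at risk can score higher than a larger hazard with no outrage.
print(perceived_risk(0.2, {"involuntary", "human_made", "affects_children"}))
print(perceived_risk(0.4, set()))
```

Under these assumptions, a technically small hazard that triggers many outrage factors can register as a larger perceived risk than a technically larger hazard that triggers none, which is the point of Sandman's distinction between hazard and outrage.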
While policy analysis by institutional stakeholders typically focuses on risk-benefit and cost-benefit analysis, popular risk perception is not informed by the same concerns. Successful implementation of a policy that relies on public support and cooperation must therefore address the outrage factor when informing the public about the policy.[6]
In an interview with New York Times journalist and Freakonomics author Stephen J. Dubner, Sandman emphasized that "the most important truth in risk communication is the exceedingly low correlation between whether a risk is dangerous, and whether it's upsetting".[4]
The relevance of public outrage has been acknowledged in discussions of various policy debates.