Risk perception

Figure: Factors of risk perceptions. Adapted from Godovykh et al. (2021).

Risk perception is the subjective judgment that people make about the characteristics and severity of a risk. [1] [2] [3] Risk perceptions often differ from statistical assessments of risk because they are affected by a wide range of affective (emotions, feelings, moods, etc.), cognitive (gravity of events, media coverage, risk-mitigating measures, etc.), contextual (framing of risk information, availability of alternative information sources, etc.), and individual (personality traits, previous experience, age, etc.) factors. [3] Several theories have been proposed to explain why different people make different estimates of the dangerousness of risks. [4] [5] Three major families of theory have been developed: psychological approaches (heuristics and cognition), anthropological/sociological approaches (cultural theory), and interdisciplinary approaches (the social amplification of risk framework).


Early theories

The study of risk perception arose out of the observation that experts and lay people often disagreed about how risky various technologies and natural hazards were.

The mid-1960s saw the rapid rise of nuclear technologies and the promise of clean and safe energy. However, public perception shifted against this new technology. Fears of both long-term damage to the environment and immediate disasters creating radioactive wastelands turned the public against nuclear power. The scientific and governmental communities asked why public perception ran against the use of nuclear energy when all the scientific experts were declaring how safe it really was. The problem, as the experts perceived it, was a difference between scientific facts and an exaggerated public perception of the dangers. [6]

A key early paper was written in 1969 by Chauncey Starr. [7] Starr used a revealed preference approach to find out what risks are considered acceptable by society. He assumed that society had reached equilibrium in its judgment of risks, so whatever risk levels actually existed in society were acceptable. His major finding was that people will accept risks 1,000 times greater if they are voluntary (e.g. driving a car) than if they are involuntary (e.g. a nuclear disaster).
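
Starr's finding lends itself to a back-of-the-envelope calculation. The sketch below is only a toy rendering of the revealed-preference idea, not Starr's actual analysis; the 1,000× ratio comes from his paper, while the example fatality rate is a hypothetical placeholder:

```python
# Toy illustration of Starr's revealed-preference finding (1969):
# society tolerates roughly 1,000x higher risk when exposure is
# voluntary. The example rate below is a hypothetical placeholder.

VOLUNTARY_TOLERANCE_RATIO = 1_000

def tolerated_involuntary_risk(accepted_voluntary_risk: float) -> float:
    """Given a risk level accepted for a voluntary activity (e.g.
    fatalities per person-hour of exposure), estimate the much lower
    level tolerated when the same risk is imposed involuntarily."""
    return accepted_voluntary_risk / VOLUNTARY_TOLERANCE_RATIO

# If a voluntary activity with ~1e-6 fatalities per person-hour is
# accepted, the implied tolerance for an involuntary source is ~1e-9.
print(tolerated_involuntary_risk(1e-6))  # -> 1e-09
```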

This early approach assumed that individuals behave rationally, weighing information before making a decision, and that exaggerated fears stem from inadequate or incorrect information. Implied in this assumption is that additional information can help people understand true risk and hence lessen their sense of danger. [6] Although researchers in the engineering school pioneered research in risk perception by adapting theories from economics, the approach has proved of little use in practical settings: numerous studies have rejected the belief that additional information alone will shift perceptions. [8]

Psychological approach

The psychological approach began with research into how people process information. These early works maintained that people use cognitive heuristics in sorting and simplifying information, leading to biases in comprehension. Later work built on this foundation and became the psychometric paradigm. This approach identifies numerous factors that influence individual perceptions of risk, including dread, novelty, and stigma, among others. [9]

Research also shows that risk perceptions are influenced by the emotional state of the perceiver. [10] The valence theory of risk perception differentiates only between positive emotions, such as happiness and optimism, and negative ones, such as fear and anger. According to valence theory, positive emotions lead to optimistic risk perceptions, whereas negative emotions induce a more pessimistic view of risk. [11]

Research also has found that, whereas risk and benefit tend to be positively correlated across hazardous activities in the world, they are negatively correlated in people's minds and judgements. [12]
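
This inverse relationship in judgments can be pictured as a toy model in which a single feeling of favorability pushes both estimates in opposite directions. The linear form, the 0-10 scales, and the constants below are illustrative assumptions, not a fitted psychological model:

```python
# Toy model of the inverse risk-benefit relationship in people's
# judgments: one affect score in [-1, 1] (negative = dislike,
# positive = like) pushes perceived benefit up and perceived risk
# down. The linear form and constants are illustrative assumptions.

def perceived_judgments(affect: float) -> tuple[float, float]:
    """Map an affect score to (perceived_risk, perceived_benefit) on 0-10 scales."""
    perceived_benefit = 5.0 + 4.0 * affect  # liked activities judged more beneficial
    perceived_risk = 5.0 - 4.0 * affect     # ...and, simultaneously, less risky
    return perceived_risk, perceived_benefit

for activity, affect in [("liked activity", 0.8), ("disliked activity", -0.7)]:
    risk, benefit = perceived_judgments(affect)
    print(f"{activity}: perceived risk={risk:.1f}, perceived benefit={benefit:.1f}")
```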

Heuristics and biases

The earliest psychometric research was done by psychologists Daniel Kahneman and Amos Tversky, who performed a series of gambling experiments to see how people evaluated probabilities. Their major finding was that people use a number of heuristics to evaluate information. These heuristics are usually useful shortcuts for thinking, but they may lead to inaccurate judgments in some situations – in which case they become cognitive biases.

Another key finding was that experts are not necessarily any better at estimating probabilities than lay people. Experts were often overconfident in the exactness of their estimates, and put too much stock in small samples of data. [13]

Cognitive psychology

The majority of the public expresses greater concern for problems that appear to have an immediate effect on everyday life, such as hazardous waste or pesticide use, than for long-term problems that may affect future generations, such as climate change or population growth. [14] People rely heavily on the scientific community to assess the threat of environmental problems because they usually do not directly experience the effects of phenomena such as climate change. Most people's exposure to climate change has been impersonal: a virtual experience through documentaries and news media covering what may seem like a "remote" area of the world. [15] This impersonal exposure, coupled with a wait-and-see attitude, means that people fail to grasp the importance of changing environmentally destructive behaviors even when experts provide detailed and clear accounts of the risks posed by climate change. [16]

Psychometric paradigm

Research within the psychometric paradigm turned to focus on the roles of affect, emotion, and stigma in influencing risk perception. Melissa Finucane and Paul Slovic have been among the key researchers here. These researchers first challenged Starr's article by examining expressed preference – how much risk people say they are willing to accept. They found that, contrary to Starr's basic assumption, people generally saw most risks in society as being unacceptably high. They also found that the gap between voluntary and involuntary risks was not nearly as great as Starr claimed.

Slovic and colleagues found that perceived risk is quantifiable and predictable. People tend to view current risk levels as unacceptably high for most activities. [17] All else being equal, the greater people perceived a benefit to be, the greater their tolerance for the associated risk. [13] If a person derived pleasure from using a product, they tended to judge its benefits as high and its risks as low; if the activity was disliked, the judgments were the opposite. [18] Research in psychometrics has shown that risk perception is highly dependent on intuition, experiential thinking, and emotions.

Psychometric research identified a broad domain of characteristics that may be condensed into three higher-order factors: 1) the degree to which a risk is understood, 2) the degree to which it evokes a feeling of dread, and 3) the number of people exposed to the risk. A dread risk elicits visceral feelings of terror and is perceived as uncontrollable, catastrophic, and inequitably distributed. An unknown risk is new and unknown to science. The more a person dreads an activity, the higher its perceived risk and the more that person wants the risk reduced. [13]
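
Condensing many rated characteristics into a few higher-order factors is, in essence, a dimensionality-reduction exercise. The sketch below conveys the flavor of that analysis with a simple principal-component projection; the hazards, characteristics, and ratings are invented placeholders, not data from the psychometric studies:

```python
# Sketch of the psychometric paradigm's factor reduction: hazards are
# rated on several characteristics, and the ratings are condensed into
# a few higher-order factors (interpreted in the literature as "dread"
# and "unknown" dimensions). All ratings below are invented.
import numpy as np

hazards = ["nuclear power", "bicycles", "pesticides", "X-rays"]
# Columns: controllability, dread, catastrophic potential, novelty, observability
ratings = np.array([
    [2.0, 6.5, 6.8, 5.0, 2.5],  # nuclear power: high dread, poorly understood
    [6.0, 1.5, 1.2, 1.0, 6.5],  # bicycles: low dread, well understood
    [3.0, 4.5, 4.0, 4.5, 2.0],
    [5.0, 3.0, 2.5, 3.5, 3.0],
])

centered = ratings - ratings.mean(axis=0)     # center each characteristic
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T                  # project onto the top two components

for hazard, (f1, f2) in zip(hazards, scores):
    print(f"{hazard:>13}: factor1={f1:+.2f}, factor2={f2:+.2f}")
```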

Anthropology/sociology approach

The anthropology/sociology approach posits risk perceptions as produced by and supporting social institutions. [19] In this view, perceptions are socially constructed by institutions, cultural values, and ways of life.

Cultural theory

One line of the Cultural Theory of risk is based on the work of anthropologist Mary Douglas and political scientist Aaron Wildavsky, first published in 1982. [20] In cultural theory, Douglas and Wildavsky outline four "ways of life" in a grid/group arrangement. Each way of life corresponds to a specific social structure and a particular outlook on risk. Grid categorizes the degree to which people are constrained and circumscribed in their social role: the tighter the social constraints, the less room individuals have to negotiate. Group refers to the extent to which individuals are bound by feelings of belonging or solidarity: the greater the bonds, the less individual choice is subject to personal control. [21] The four ways of life are hierarchical, individualist, egalitarian, and fatalist, as sketched below.
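
Because grid and group jointly define a 2×2 typology, the mapping can be written as a small classifier. Treating the two dimensions as binary is an illustrative simplification; cultural theory treats them as continuous:

```python
# The grid/group typology of Douglas and Wildavsky rendered as a 2x2
# classification. Binarizing the dimensions is a simplification; the
# theory treats grid and group as continuous.

def way_of_life(high_grid: bool, high_group: bool) -> str:
    """Map grid (role constraint) and group (solidarity) to a way of life."""
    if high_grid and high_group:
        return "Hierarchical"   # strong role constraints, strong group bonds
    if high_group:
        return "Egalitarian"    # weak role constraints, strong group bonds
    if high_grid:
        return "Fatalist"       # strong role constraints, weak group bonds
    return "Individualist"      # weak role constraints, weak group bonds

print(way_of_life(high_grid=True, high_group=False))  # -> Fatalist
```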

Risk perception researchers have not widely accepted this version of cultural theory. Even Douglas says that the theory is controversial: it poses the danger of moving outside the favored paradigm of individual rational choice, with which many researchers are comfortable. [22]

On the other hand, writers who draw upon a broader cultural-theory perspective have argued that risk-perception analysis helps to understand the public response to terrorism in a way that goes far beyond 'rational choice'. As John Handmer and Paul James write:

In the area of embodied risk, people are not as fearful of themselves as perhaps they should be on the issues of illicit drug use, unsafe sex and so on. Yet with the compounding of both more abstract and more embodied risk this package appears to have met its goal to generate support for government policy. Fear of 'outsiders' and of a non-specific, invisible and uncontrollable threat was a powerful motivator in shaping perception. [23]

National Culture and Risk Survey

The First National Culture and Risk Survey of cultural cognition found that a person's worldview along the two social and cultural dimensions of "hierarchy-egalitarianism" and "individualism-solidarism" was predictive of their response to risk. [24]

Interdisciplinary approach

Social amplification of risk framework

The Social Amplification of Risk Framework (SARF) combines research in psychology, sociology, anthropology, and communications theory. SARF outlines how communications of risk events pass from the sender through intermediate stations to a receiver, and in the process serve to amplify or attenuate perceptions of risk. All links in the communication chain (individuals, groups, media, etc.) contain filters through which information is sorted and understood.

The framework attempts to explain the process by which risks are amplified, receiving public attention, or attenuated, receiving less public attention. The framework may be used to compare responses from different groups in a single event, or analyze the same risk issue in multiple events. In a single risk event, some groups may amplify their perception of risks while other groups may attenuate, or decrease, their perceptions of risk.

The main thesis of SARF states that risk events interact with individual psychological, social and other cultural factors in ways that either increase or decrease public perceptions of risk. Behaviors of individuals and groups then generate secondary social or economic impacts while also increasing or decreasing the physical risk itself. [25]

The ripple effects caused by the amplification of risk include enduring mental perceptions, impacts on business sales, changes in residential property values, changes in training and education, and social disorder. These secondary changes are themselves perceived and reacted to by individuals and groups, resulting in third-order impacts. As each higher-order impact is reacted to, it may ripple to other parties and locations. Traditional risk analyses neglect these ripple effects and thus greatly underestimate the adverse effects of certain risk events. Public distortion of risk signals provides a corrective mechanism by which society arrives at a fuller assessment of a risk and its impacts, including effects not traditionally factored into risk analysis. [26]
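
The chain of amplifying or attenuating stations can be pictured as a multiplicative toy model in which each station applies a filter gain to the incoming risk signal. SARF itself is a qualitative framework, so the stations and gain values below are hypothetical illustrations only:

```python
# Toy rendering of SARF's amplification/attenuation chain: a risk
# signal passes through stations (media, interest groups, peer
# networks), each scaling it by a filter gain. Gains > 1 amplify,
# gains < 1 attenuate. All stations and values are hypothetical.

def transmit(signal: float, station_gains: list[float]) -> float:
    """Pass a risk signal through a chain of stations, each scaling it."""
    for gain in station_gains:
        signal *= gain
    return signal

initial_signal = 1.0
amplifying_chain = [2.5, 1.8, 1.2]   # e.g. sensational coverage, activist framing, rumor
attenuating_chain = [0.6, 0.7, 0.9]  # e.g. sparse coverage, expert reassurance, low salience

print(transmit(initial_signal, amplifying_chain))   # -> 5.4 (amplified)
print(transmit(initial_signal, attenuating_chain))  # -> ~0.378 (attenuated)
```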

See also

Controversy
Cognitive bias
Heuristic
Availability heuristic
False consensus effect
Paul Slovic
Simulation heuristic
Cultural theory of risk
Affect heuristic
Overconfidence effect
Cultural cognition
Optimism bias
Negativity bias
Social perception
Psychology of reasoning
Risk
Intuition
Compassion fade

References

Notes

  1. Slovic, Paul (2016). "Understanding Perceived Risk: 1978–2015". Environment: Science and Policy for Sustainable Development. 58 (1): 25–29. doi:10.1080/00139157.2016.1112169. ISSN 0013-9157. S2CID 155250644.
  2. Brewer, Noel T.; Weinstein, Neil D.; Cuite, Cara L.; Herrington, James E. (April 2004). "Risk perceptions and their relation to risk behavior". Annals of Behavioral Medicine. 27 (2): 125–130. doi:10.1207/s15324796abm2702_7. ISSN 0883-6612. PMID 15026296. S2CID 3676750.
  3. Godovykh, Maksim; Pizam, Abraham; Bahja, Frida (2021). "Antecedents and outcomes of health risk perceptions in tourism, following the COVID-19 pandemic". Tourism Review. 76 (4): 737–748. doi:10.1108/TR-06-2020-0257. ISSN 1660-5373.
  4. Wildavsky, Aaron; Dake, Karl (1990). "Theories of Risk Perception: Who Fears What and Why?". Daedalus. 119 (4): 41–60. ISSN 0011-5266. JSTOR 20025337.
  5. Paek, Hye-Jin; Hove, Thomas (2017). "Risk Perceptions and Risk Characteristics". Oxford Research Encyclopedia of Communication. doi:10.1093/acrefore/9780190228613.013.283. ISBN 978-0-19-022861-3. Retrieved 22 February 2021.
  6. Douglas, Mary (1985). Risk Acceptability According to the Social Sciences. Russell Sage Foundation.
  7. Starr, C. (1969). "Social Benefits versus Technological Risks". Science. 165 (3899): 1232–1238. doi:10.1126/science.165.3899.1232. PMID 5803536.
  8. Freudenburg, William R. (1993). "Risk and Recreancy: Weber, the Division of Labor, and the Rationality of Risk Perceptions". Social Forces. 71 (4): 909–932. doi:10.1093/sf/71.4.909.
  9. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–1131. doi:10.1126/science.185.4157.1124. PMID 17835457. S2CID 6196452.
  10. Bodenhausen, G. V. (1993). "Emotions, arousal, and stereotypic judgments: A heuristic model of affect and stereotyping". In D. M. Mackie & D. L. Hamilton (eds.), Affect, Cognition, and Stereotyping: Interactive Processes in Group Perception (pp. 13–37). San Diego, CA: Academic Press.
  11. Lerner, J. S.; Keltner, D. (2000). "Beyond valence: Toward a model of emotion-specific influences on judgment and choice". Cognition and Emotion. 14 (4): 473–493. CiteSeerX 10.1.1.318.6023. doi:10.1080/026999300402763. S2CID 397458.
  12. Slovic, Paul (December 2006). "Risk Perception and Affect". Current Directions in Psychological Science. 15 (6): 322–325. doi:10.1111/j.1467-8721.2006.00461.x. S2CID 2721326.
  13. Slovic, Paul; Fischhoff, Baruch; Lichtenstein, Sarah (1982). "Why Study Risk Perception?". Risk Analysis. 2 (2): 83–93. doi:10.1111/j.1539-6924.1982.tb01369.x.
  14. Slimak & Dietz (2006), cited in Koger, Susan M.; Winter, Deborah Du Nann (2010). The Psychology of Environmental Problems: Psychology for Sustainability. 3rd ed. New York: Psychology Press. pp. 216–217.
  15. Swim, Janet; Clayton, Susan; Doherty, Thomas; Gifford, Robert; Howard, George; Reser, Joseph; Stern, Paul; Weber, Elke (2010). Psychology & Global Climate Change. American Psychological Association. Retrieved 10 December 2010. <http://www.apa.org/science/about/publications/climate-change-booklet.pdf>.
  16. Sterman (2008), cited in Koger, Susan M.; Winter, Deborah Du Nann (2010). The Psychology of Environmental Problems: Psychology for Sustainability. 3rd ed. New York: Psychology Press. p. 219.
  17. Slovic, Paul, ed. (2000). The Perception of Risk. Earthscan, Virginia.
  18. Gregory, Robin; Mendelsohn, Robert (1993). "Perceived Risk, Dread, and Benefits". Risk Analysis. 13 (3): 259–264. doi:10.1111/j.1539-6924.1993.tb01077.x.
  19. Wildavsky, Aaron; Dake, Karl (1990). "Theories of Risk Perception: Who Fears What and Why?". Daedalus. 119 (4): 41–60.
  20. Douglas, Mary; Wildavsky, Aaron (1982). Risk and Culture. University of California Press.
  21. Thompson, Michael; Ellis, Richard; Wildavsky, Aaron (1990). Cultural Theory. Boulder, Colorado: Westview Press.
  22. Douglas, Mary (1992). Risk and Blame: Essays in Cultural Theory. New York: Routledge.
  23. Handmer, John; James, Paul (2005). "Trust Us, and Be Scared: The Changing Nature of Contemporary Risk". Global Society. 21 (1): 119–130.
  24. "First National Risk & Culture Study". The Cultural Cognition Project at Yale Law School. Retrieved 21 July 2012.
  25. Kasperson, Roger E.; Renn, Ortwin; Slovic, Paul; Brown, Halina; Emel, Jacque; Goble, Robert; Kasperson, Jeanne; Ratick, Samuel (1988). "The Social Amplification of Risk: A Conceptual Framework" (PDF). Risk Analysis. 8 (2): 177–187. doi:10.1111/j.1539-6924.1988.tb01168.x.
  26. Kasperson, Jeanne X.; Kasperson, Roger E. (2005). The Social Contours of Risk. Volume I: Publics, Risk Communication and the Social Amplification of Risk. Earthscan, Virginia.