Proportionality bias

The proportionality bias, also known as major event/major cause heuristic, is the tendency to assume that big events have big causes. It is a type of cognitive bias and plays an important role in people's tendency to accept conspiracy theories. [1] [2] Academic psychologist Rob Brotherton summarises it as "When something big happens, we tend to assume that something big must have caused it". [3]

Underlying mechanisms

It has been suggested that proportionality bias is a cognitive response to the natural human tendency to search for patterns (Leman, 2007), [4] as well as a proclivity to seek out causality. Human beings take in data from the outside world at an extremely high rate, and in response develop schemas (cognitive frameworks that aid us in interpreting and organising information) which support quick information processing and judgement making. Schemas compensate for the limited nature of cognitive load, the amount of information that working memory can hold at one time (Sweller, 2010). [5] Equating the magnitude of events and causes reduces the mental processing required, easing the strain on cognitive load. Our intuition follows the notion that things should be balanced, maintaining a sense of order and logic, and such a notion satisfies key psychological motives, such as the need for certainty, control, and understanding.

This bias may also be driven by an innate desire to be self-determining (Heider, 1958), [6] and a corresponding reluctance to accept our lack of agency in many everyday situations. It can be associated with a rejection of ultimate powerlessness to affect events in favour of agency and control: if events and causes can be understood systematically and patterns can be identified, an individual may be better able to alter future events. People often 'balance' cause-effect relationships by overweighting the cause, or by dismissing or diminishing the effect.

In reality, randomness plays a bigger role in the events of our lives than we would like to admit. This rejection of 'the science of probability' is the very thing that makes us so susceptible to cognitive biases such as the proportionality bias.
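The outsized role of randomness can be illustrated with a short simulation (a purely illustrative sketch, not drawn from any of the cited studies): in a long sequence of fair coin flips, strikingly long streaks of identical outcomes appear without any 'big' cause behind them.

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical consecutive outcomes."""
    best = current = 1
    for previous, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == previous else 1
        best = max(best, current)
    return best

random.seed(42)  # fixed seed so the simulation is reproducible
flips = [random.randint(0, 1) for _ in range(1000)]
# For 1000 fair flips the longest streak is typically around 9-11,
# yet intuition expects a 'big cause' behind any streak that long.
print(longest_run(flips))
```

Streaks like these arise from nothing but chance, which is exactly the kind of small-cause, large-pattern relationship that the proportionality bias leads us to discount.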

Conspiracy theories

Proportionality bias has been heavily linked to the development and persistence of conspiracy theories, which offer a direct behavioural example of the balancing of cause-effect relationships: the belief that a major event was actually caused by a 'higher power'. For example, many believe that Princess Diana died as the result of a conspiracy within the government and British Royal Family, arguably because it is uncomfortable to accept that the death of such a prominent public figure could be caused by something as seemingly mundane as a car crash.

Leman and Cinnirella's (2007) study [1] saw 64 participants read one of four variations of a vignette in which the assassination of a fictional president was either attempted or successful; participants then rated the likelihood that eight related statements were accurate. Individuals were found to be more likely to endorse a conspiracy when the consequences of the event were major (and the president died). Such findings support the notion that conspiracy theories are an effect of proportionality bias.

Cultural differences

Further research has suggested that susceptibility to proportionality bias is not consistent, but rather is affected by cultural norms and practices (Spina et al., 2010). [7] In general, the prevalence of proportionality bias in Western cultures has been heavily documented. Shultz and Ravinsky (1977) [8] demonstrated that Canadian school children attributed a loud noise to a heavy lever rather than a delicate one, suggesting that the tendency to balance cause and effect develops at a young age. More support comes from Ebel-Lam et al. (2010). [9] Canadian participants read a scenario depicting either a high-magnitude or moderate-magnitude outcome of a plane fault, then estimated the likelihood of high- and moderate-magnitude causes; they tended to match the magnitude of causes to the magnitude of events.

However, these studies only provide research support for cultures similar to Canada's. In contrast, Asian folk wisdom holds that 'one tiny insect may be enough to destroy a nation'. Such an aphorism reflects a key difference in reasoning between cultures: East Asian cultures have been suggested to reason more holistically, versus North America's typically analytical reasoning (Nisbett et al., 2001). [10] Furthermore, research suggests that East Asians attend to situational factors and context more than North Americans do. Masuda and Nisbett's (2001) [11] study saw American and Japanese participants engage in photo and object recall tests. When asked to recall objects from an underwater scene, Americans were more likely to describe a focal element first, such as a large fish in the centre, whereas Japanese participants mentioned more contextual elements, such as seaweed, first. Even more strikingly, Japanese participants' ability to recall focal objects from the scene was impaired when the background was altered, whereas the alteration had little to no effect on American participants' recall, suggesting distinct cultural variation in attentional processes and reasoning. Morris and Peng's (1994) [12] study further suggested that Americans tend to make more dispositional attributions (inferring that an event is due to personal factors), whereas Asian cultures show a proclivity for situational (context-based) attributions.

In general, Eastern cultures may hold more complex causal theories, and consider a larger breadth of causal factors, than Western cultures (Choi et al., 2003). [13] Choi's 10-item measure of holistic tendency found Korean participants to reason more holistically. The existence of such cultural variation supports the proposed underlying mechanisms of proportionality bias: schemas are developed from external stimuli and life experiences, so cultural variance highlights the impact of different environments on mental frameworks, and subsequently on proportionality bias.

Practical implications

The mechanisms related to proportionality bias affect behaviour in everyday life. A common example is the excessive shaking of dice in one's hands before rolling, as if the extra expenditure of effort will result in a better outcome. Similarly, the bias may help explain behaviour surrounding debt, which has increased roughly tenfold in America over the past 50 years: [7] a lack of attention towards routine minor expenses in preference for major expenses may explain these statistics. In contrast, Eastern cultures may underestimate the importance of major health-related behaviours in the belief that multiple minor ones will compensate (e.g. combating lung damage caused by smoking by improving general health, rather than quitting smoking). [7]

Research on proportionality bias is vital for combating the negative impact it has the potential to cause, both individually and socially. Working with software, for example, may help train individuals out of the habit of magnitude matching: in coding, syntax is central to a program, and one misplaced character may result in system malfunction. As global interconnectedness increases, understanding this bias and its cultural variance is hugely important, as differences in underlying mechanisms, and subsequent behaviour, increase the possibility of misunderstanding and miscommunication, which may in turn lead to conflict.
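The software example can be made concrete. In the hypothetical snippet below, two functions differ by three missing characters, yet that tiny cause changes the result of every call; repeated exposure to bugs like this works directly against the intuition that the size of an effect must match the size of its cause.

```python
def sum_to(n):
    """Sum of the integers 1..n (inclusive)."""
    return sum(range(1, n + 1))

def sum_to_buggy(n):
    """Identical code minus three characters ('+ 1'): it silently drops n."""
    return sum(range(1, n))

print(sum_to(100))        # 5050
print(sum_to_buggy(100))  # 4950: a one-token slip, a wrong answer every time
```

A debugger hunting for the cause of the wrong totals has to accept that the culprit is a trivial omission, not a failure proportionate to the symptom.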

Notes

  1. Leman PJ, Cinnirella M (2007). "A major event has a major cause: Evidence for the role of heuristics in reasoning about conspiracy theories". Social Psychological Review. 9 (2): 18–28. doi:10.53841/bpsspr.2007.9.2.18. S2CID 245126866.
  2. Buckley, Thea (2015). "Why Do Some People Believe in Conspiracy Theories?". Scientific American Mind. 26 (4): 72. doi:10.1038/scientificamericanmind0715-72a . Retrieved 26 July 2020.
  3. "Account for Proportionality Bias: Big Events Must Have Big Causes".
  4. J. Leman, Patrick; Cinnirella, Marco (October 2007). "A major event has a major cause: Evidence for the role of heuristics in reasoning about conspiracy theories". Social Psychological Review. 9 (2): 18–28. doi:10.53841/bpsspr.2007.9.2.18. ISSN   1369-7862. S2CID   245126866.
  5. Sweller, John (2010-04-26), "Cognitive Load Theory: Recent Theoretical Advances", Cognitive Load Theory, Cambridge University Press, pp. 29–47, doi:10.1017/cbo9780511844744.004, ISBN   9780521677585 , retrieved 2023-03-28
  6. Heider, F. (2013-05-13). The Psychology of Interpersonal Relations. doi:10.4324/9780203781159. ISBN   9781134922185.
  7. Spina, Roy R.; Ji, Li-Jun; Tieyuan Guo; Zhiyong Zhang; Ye Li; Fabrigar, Leandre (May 2010). "Cultural Differences in the Representativeness Heuristic: Expecting a Correspondence in Magnitude Between Cause and Effect". Personality and Social Psychology Bulletin. 36 (5): 583–597. doi:10.1177/0146167210368278. ISSN 0146-1672. PMID 20467046. S2CID 27040557.
  8. Shultz, Thomas R.; Ravinsky, Frances B. (December 1977). "Similarity as a Principle of Causal Inference". Child Development. 48 (4): 1552. doi:10.2307/1128518. ISSN   0009-3920. JSTOR   1128518.
  9. Ebel-Lam, Anna P.; Fabrigar, Leandre R.; MacDonald, Tara K.; Jones, Sarah (2010-11-18). "Balancing Causes and Consequences: The Magnitude-Matching Principle in Explanations for Complex Social Events". Basic and Applied Social Psychology. 32 (4): 348–359. doi:10.1080/01973533.2010.519245. ISSN   0197-3533. S2CID   145554825.
  10. Nisbett, Richard E.; Peng, Kaiping; Choi, Incheol; Norenzayan, Ara (2008-05-05), "Culture and Systems of Thought: Holistic versus Analytic Cognition", Reasoning, Cambridge University Press, pp. 956–985, doi:10.1017/cbo9780511814273.050, ISBN   9780521848152 , retrieved 2023-03-28
  11. Masuda, Takahiko; Nisbett, Richard E. (2001). "Attending holistically versus analytically: Comparing the context sensitivity of Japanese and Americans". Journal of Personality and Social Psychology. 81 (5): 922–934. doi:10.1037/0022-3514.81.5.922. ISSN   1939-1315. PMID   11708567.
  12. Morris, Michael W.; Peng, Kaiping (December 1994). "Culture and cause: American and Chinese attributions for social and physical events". Journal of Personality and Social Psychology. 67 (6): 949–971. doi:10.1037/0022-3514.67.6.949. ISSN   1939-1315.
  13. Choi, Incheol; Dalal, Reeshad; Kim-Prieto, Chu; Park, Hyekyung (2003). "Culture and judgement of causal relevance". Journal of Personality and Social Psychology. 84 (1): 46–59. doi:10.1037/0022-3514.84.1.46. ISSN   1939-1315. PMID   12518970.
