Controversy

Controversy is a state of prolonged public dispute or debate, usually concerning a matter of conflicting opinion or point of view. The word derives from the Latin controversia, a composite of controversus – "turned in an opposite direction".

Legal

In the theory of law, a controversy differs from a legal case; while legal cases include all suits, criminal as well as civil, a controversy is a purely civil proceeding.

For example, the Case or Controversy Clause of Article Three of the United States Constitution (Section 2, Clause 1) states that "the judicial Power shall extend ... to Controversies to which the United States shall be a Party". This clause has been deemed to impose a requirement that United States federal courts may not hear cases that do not pose an actual controversy – that is, an actual dispute between adverse parties which is capable of being resolved by the court. In addition to setting out the scope of the jurisdiction of the federal judiciary, it also prohibits courts from issuing advisory opinions and from hearing cases that are either unripe, meaning that the controversy has not yet arisen, or moot, meaning that the controversy has already been resolved.

Benford's law of controversy

Benford's law of controversy, as expressed by the astrophysicist and science fiction author Gregory Benford in 1980, states: "Passion is inversely proportional to the amount of real information available." [1] [2] In other words, it claims that the less factual information is available on a topic, the more controversy can arise around that topic – and the more facts are available, the less controversy can arise. Thus, for example, controversies in physics would be limited to subject areas where experiments cannot be carried out yet, whereas controversies would be inherent to politics, where communities must frequently decide on courses of action based on insufficient information.
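
Written as an informal proportionality (an editorial paraphrase, not Benford's own notation), the claim is simply:

$$\text{passion} \;\propto\; \frac{1}{\text{available real information}}$$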

Psychological bases

Controversies are frequently thought to be the result of a lack of knowledge on the part of the disputants – as implied by Benford's law of controversy, which speaks only of lack of information ("passion is inversely proportional to the amount of real information available"). For example, in analyses of the political controversy over anthropogenic climate change, which is exceptionally virulent in the United States, it has been proposed that those who oppose the scientific consensus do so because they do not have enough information about the topic. [3] [4] A study of 1540 US adults [5] found instead that levels of scientific literacy correlated with the strength of opinion on climate change, but not with which side of the debate respondents stood on.

The puzzling phenomenon of two individuals being able to reach different conclusions after being exposed to the same facts has frequently been explained (particularly by Daniel Kahneman) by reference to bounded rationality – in other words, most judgments are made using fast-acting heuristics [6] [7] that work well in everyday situations but are not well suited to decision-making about complex subjects such as climate change. Anchoring has been identified as particularly relevant in climate change controversies, [8] as individuals are more inclined to believe in climate change if the outside temperature is higher, if they have been primed to think about heat, and if they are primed with higher temperatures when thinking about future temperature increases from climate change.

In other controversies – such as that around the HPV vaccine – the same evidence seemed to license inference to radically different conclusions. [9] Kahan et al. [10] explained this by the cognitive biases of biased assimilation [11] and a credibility heuristic. [12]

Similar effects on reasoning are also seen in non-scientific controversies, for example in the gun control debate in the United States. [13] As with other controversies, it has been suggested that exposure to empirical facts would be sufficient to resolve the debate once and for all. [14] [15] In computer simulations of cultural communities, beliefs were found to polarize within isolated sub-groups, based on the mistaken belief that the community had unhindered access to the ground truth. [13] Such confidence in the group's ability to find the ground truth is explicable through the success of inferences based on the wisdom of the crowd. [16] However, when there is no access to the ground truth, as there was not in this model, the method fails.
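
The following toy simulation is an editorial sketch of this kind of dynamic, not the actual model of Braman, Grimmelmann and Kahan; [13] the group biases, noise levels, and the simple within-group averaging rule are illustrative assumptions. Agents start from noisy, group-biased private signals about a hidden quantity and update only by averaging with members of their own isolated sub-group, so each group converges confidently on its own consensus rather than on the truth.

import random
from statistics import mean

random.seed(1)

GROUND_TRUTH = 0.0   # hidden state of the world (unknown to the agents)
N_PER_GROUP = 50
N_STEPS = 200

def make_group(bias, signal_noise):
    # Each agent's starting belief is a private signal: truth + group bias + noise.
    return [GROUND_TRUTH + bias + random.gauss(0, signal_noise)
            for _ in range(N_PER_GROUP)]

def gossip(beliefs, rate=0.1):
    # One round of within-group averaging: each agent moves part of the way
    # toward a randomly chosen member of the same (isolated) sub-group.
    for i in range(len(beliefs)):
        j = random.randrange(len(beliefs))
        beliefs[i] += rate * (beliefs[j] - beliefs[i])

# Two isolated sub-groups whose private signals are biased in opposite
# directions, i.e. neither has unhindered access to the ground truth.
group_a = make_group(bias=+1.0, signal_noise=0.5)
group_b = make_group(bias=-1.0, signal_noise=0.5)

for _ in range(N_STEPS):
    gossip(group_a)
    gossip(group_b)

print(f"group A consensus: {mean(group_a):+.2f}")   # ends up near +1.0
print(f"group B consensus: {mean(group_b):+.2f}")   # ends up near -1.0
print(f"ground truth:      {GROUND_TRUTH:+.2f}")
# Within-group averaging produces strong agreement inside each group, but the
# two consensuses polarize around their respective biases rather than the truth.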

Bayesian decision theory allows these failures of rationality to be described as part of a statistically optimized system for decision making. Experiments and computational models in multisensory integration have shown that sensory input from different senses is integrated in a statistically optimal way. [17] In addition, the kind of inference used to attribute multiple sensory inputs to a single source appears to be a Bayesian inference about the causal origin of the sensory stimuli. [18] As such, it appears neurobiologically plausible that the brain implements decision-making procedures that are close to optimal for Bayesian inference.
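
In the visual–haptic cue-combination setting studied by Ernst and Banks, [17] "statistically optimal" has a concrete form. For two independent, roughly Gaussian estimates of the same quantity – a visual estimate $\hat{s}_V$ with variance $\sigma_V^2$ and a haptic estimate $\hat{s}_H$ with variance $\sigma_H^2$ – the maximum-likelihood combined estimate weights each cue by its relative reliability:

$$
\hat{s}_{VH} = w_V\,\hat{s}_V + w_H\,\hat{s}_H,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad
w_H = \frac{1/\sigma_H^2}{1/\sigma_V^2 + 1/\sigma_H^2},
\qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}.
$$

Because $\sigma_{VH}^2$ is smaller than either single-cue variance, the combined estimate is more reliable than either sense alone, which is the sense in which the integration is optimal.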

Brocas and Carrillo propose a model for making decisions based on noisy sensory inputs: [19] beliefs about the state of the world are modified by Bayesian updating, and decisions are then made once a belief passes a threshold. They show that this model, when optimized for single-step decision making, produces belief anchoring and polarization of opinions – exactly as described in the global warming controversy context – in that, despite identical evidence being presented, pre-existing beliefs (or the evidence presented first) have an overwhelming effect on the beliefs formed. In addition, the preferences of the agent (the particular rewards that they value) also shape the beliefs formed – this explains the biased assimilation (also known as confirmation bias) shown above. This model allows the production of controversy to be seen as a consequence of a decision maker optimized for single-step decision making, rather than as a result of the limited reasoning of Kahneman's bounded rationality.
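
As a rough illustration of the mechanism described above (an editorial sketch, not the Brocas–Carrillo model itself; the likelihoods, threshold, and function names are illustrative assumptions), two agents can update a belief in a binary hypothesis from the same stream of noisy evidence via Bayes' rule, but stop updating once their belief crosses a decision threshold. The agent with the stronger prior commits to it almost immediately, so identical evidence yields opposite conclusions:

# Toy illustration: Bayesian updating of a belief in a binary hypothesis H,
# with a decision threshold that halts updating once the agent is "sure enough".

def update(prior_h, observation, p_obs_given_h=0.6, p_obs_given_not_h=0.4):
    # One step of Bayes' rule; observation is True if the noisy evidence favours H.
    if observation:
        like_h, like_not_h = p_obs_given_h, p_obs_given_not_h
    else:
        like_h, like_not_h = 1 - p_obs_given_h, 1 - p_obs_given_not_h
    return like_h * prior_h / (like_h * prior_h + like_not_h * (1 - prior_h))

def decide(prior_h, evidence, threshold=0.9):
    # Update until the belief crosses the threshold (for or against H), then stop:
    # the decision is made and any later evidence is ignored.
    belief = prior_h
    for obs in evidence:
        if belief >= threshold or belief <= 1 - threshold:
            break
        belief = update(belief, obs)
    return belief

# Identical, mixed evidence stream shown to both agents.
evidence = [True, True, False, False, True, False, False, False, False, False]

print(round(decide(prior_h=0.8, evidence=evidence), 2))  # ~0.90: commits to H after two observations, ignoring the rest
print(round(decide(prior_h=0.2, evidence=evidence), 2))  # ~0.10: ends up committed against H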

References

  1. "EFF Quotes Collection 19.6". Electronic Frontier Foundation. 2001-04-09.
  2. "Quotations: Computer Laws". SysProg. Archived from the original on 2008-08-22. Retrieved 2007-03-10.
  3. Ungar, S. (2000). "Knowledge, ignorance and the popular culture: climate change versus the ozone hole". Public Understanding of Science. 9 (3): 297–312. doi:10.1088/0963-6625/9/3/306. S2CID   7089937.
  4. Pidgeon, N.; B. Fischhoff (2011). "The role of social and decision sciences in communicating uncertain climate risks". Nature Climate Change. 1 (1): 35–41. Bibcode:2011NatCC...1...35P. doi:10.1038/nclimate1080. S2CID   85362091.
  5. Kahan, Dan M.; Maggie Wittlin; Ellen Peters; Paul Slovic; Lisa Larrimore Ouellette; Donald Braman; Gregory N. Mandel (2011). "The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change". hdl: 1794/22097 . SSRN   1871503 .Cite journal requires |journal= (help)
  6. Kahneman, Daniel (2003-12-01). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). The American Economic Review. 93 (5): 1449–1475. CiteSeerX   10.1.1.194.6554 . doi:10.1257/000282803322655392. ISSN   0002-8282. JSTOR   3132137. Archived from the original (PDF) on 2018-02-19. Retrieved 2017-10-24.
  7. Tversky, A.; D. Kahneman (1974). "Judgment under uncertainty: Heuristics and biases". Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID   17835457. S2CID   143452957.
  8. Joireman, Jeff; Heather Barnes Truelove; Blythe Duell (December 2010). "Effect of outdoor temperature, heat primes and anchoring on belief in global warming". Journal of Environmental Psychology. 30 (4): 358–367. doi:10.1016/j.jenvp.2010.03.004. ISSN   0272-4944.
  9. Saul, Stephanie; Andrew Pollack (2007-02-17). "Furor on Rush to Require Cervical Cancer Vaccine". The New York Times. ISSN   0362-4331 . Retrieved 2011-11-26.
  10. Kahan, Dan M.; Donald Braman; Geoffrey L. Cohen; Paul Slovic; John Gastil (2008-07-15). "Who Fears the HPV Vaccine, Who Doesn't, and Why? An Experimental Study of the Mechanisms of Cultural Cognition". SSRN   1160654 .Cite journal requires |journal= (help)
  11. Lord, Charles G.; Lee Ross; Mark R. Lepper (1979). "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence". Journal of Personality and Social Psychology. 37 (11): 2098–2109. CiteSeerX   10.1.1.372.1743 . doi:10.1037/0022-3514.37.11.2098. ISSN   0022-3514.
  12. HOVLAND, CARL I.; WALTER WEISS (1951-12-21). "The Influence of Source Credibility on Communication Effectiveness". Public Opinion Quarterly. 15 (4): 635–650. doi:10.1086/266350.
  13. 1 2 Braman, Donald; James Grimmelmann; Dan M. Kahan. "Modeling Cultural Cognition". SSRN   1000449 .Cite journal requires |journal= (help)
  14. Fremling, G.M.; J.R. Lott Jr (2002). "Surprising Finding That Cultural Worldviews Don't Explain People's Views on Gun Control, The". U. Pa. L. Rev. 151 (4): 1341–1348. doi:10.2307/3312932. JSTOR   3312932.
  15. Ayres, I.; J.J. Donohue III (2002). Shooting down the more guns, less crime hypothesis. National Bureau of Economic Research.
  16. Lee, M.D.; M. Steyvers; M. de Young; B.J. Miller. "A Model-Based Approach to Measuring Expertise in Ranking Tasks".Cite journal requires |journal= (help)
  17. Ernst, Marc O.; Martin S. Banks (2002-01-24). "Humans integrate visual and haptic information in a statistically optimal fashion". Nature. 415 (6870): 429–433. Bibcode:2002Natur.415..429E. doi:10.1038/415429a. ISSN   0028-0836. PMID   11807554. S2CID   47459.
  18. Wozny, D.R.; U.R. Beierholm; L. Shams (2008). "Human trimodal perception follows optimal statistical inference". Journal of Vision. 8 (3): 24.1–11. doi: 10.1167/8.3.24 . PMID   18484830.
  19. Brocas, Isabelle; Juan D. Carrillo (2012). "From perception to action: An economic model of brain processes". Games and Economic Behavior. 75: 81–103. doi:10.1016/j.geb.2011.10.001. ISSN   0899-8256.