Controversy

A scene of rabbis engaging in debate in Carl Schleicher's painting A controversy from the Talmud, 19th century.

Controversy is a state of prolonged public dispute or debate, usually concerning a matter of conflicting opinion or point of view. The word derives from the Latin controversia, from controversus – "turned in an opposite direction".


In the theory of law, a controversy differs from a legal case; while legal cases include all suits, criminal as well as civil, a controversy is a purely civil proceeding.

For example, the Case or Controversy Clause of Article Three of the United States Constitution (Section 2, Clause 1) states that "the judicial Power shall extend ... to Controversies to which the United States shall be a Party". This clause has been deemed to impose a requirement that United States federal courts may not hear cases that do not pose an actual controversy – that is, an actual dispute between adverse parties which is capable of being resolved by the court. In addition to setting out the scope of the jurisdiction of the federal judiciary, it also prohibits courts from issuing advisory opinions, and from hearing cases that are either unripe, meaning that the controversy has not yet arisen, or moot, meaning that the controversy has already been resolved.

Benford's law

Benford's law of controversy, as expressed by the astrophysicist and science fiction author Gregory Benford in 1980, states: "Passion is inversely proportional to the amount of real information available." [1] [2] In other words, it claims that the less factual information is available on a topic, the more controversy can arise around that topic – and the more facts are available, the less controversy can arise. Thus, for example, controversies in physics would be limited to subject areas where experiments cannot yet be carried out, whereas controversy would be inherent to politics, where communities must frequently decide on courses of action based on insufficient information.
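Stated schematically (an informal rendering of the aphorism rather than a quantitative law), with P standing for the passion surrounding a topic and I for the amount of real information available about it:

$$P \propto \frac{1}{I}$$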

Psychological bases

Controversies are frequently thought to be a result of a lack of knowledge on the part of the disputants – as implied by Benford's law of controversy, which only talks about lack of information ("passion is inversely proportional to the amount of real information available"). For example, in analyses of the political controversy over anthropogenic climate change, which is exceptionally virulent in the United States, it has been proposed that those who are opposed to the scientific consensus are opposed because they do not have enough information about the topic. [3] [4] A study of 1,540 US adults [5] found instead that levels of scientific literacy correlated with the strength of opinion on climate change, but not with which side of the debate respondents stood on.

The puzzling phenomenon of two individuals being able to reach different conclusions after being exposed to the same facts has been frequently explained (particularly by Daniel Kahneman) by reference to 'bounded rationality' – in other words, that most judgments are made using fast-acting heuristics [6] [7] that work well in everyday situations, but are not amenable to decision-making about complex subjects such as climate change. Anchoring has been identified as particularly relevant in climate change controversies, [8] as individuals are found to be more positively inclined to believe in climate change if the outside temperature is higher, if they have been primed to think about heat, and if they are primed with higher temperatures when thinking about the future temperature increases from climate change.

In other controversies – such as that around the HPV vaccine – the same evidence seemed to license inference to radically different conclusions. [9] Kahan et al. [10] explained this by the cognitive biases of biased assimilation [11] and a credibility heuristic. [12]

Similar effects on reasoning are also seen in non-scientific controversies, for example in the gun control debate in the United States. [13] As with other controversies, it has been suggested that exposure to empirical facts would be sufficient to resolve the debate once and for all. [14] [15] In computer simulations of cultural communities, beliefs were found to polarize within isolated sub-groups, based on the mistaken belief that the community had unhindered access to the ground truth. [13] Such confidence in the group to find the ground truth is explicable through the success of wisdom-of-the-crowd-based inferences. [16] However, if there is no access to the ground truth, as there was not in this model, the method fails.
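A minimal agent-based sketch of this kind of simulation (illustrative only and written for this article – the group sizes, update rule, and parameters are assumptions, not the model of Braman, Grimmelmann and Kahan [13]) shows how two isolated sub-groups that only ever update toward their own members, with no access to any ground truth, each converge internally yet settle on different beliefs:

```python
import random

def simulate(n_per_group=50, steps=100, rate=0.1, seed=0):
    """Two isolated sub-groups of agents, each agent holding a belief in [0, 1].
    Agents only average toward members of their own group, so each group
    converges internally while the two groups can settle on different values:
    polarization without any access to a ground truth."""
    random.seed(seed)
    groups = [[random.random() for _ in range(n_per_group)] for _ in range(2)]
    for _ in range(steps):
        for group in groups:
            mean = sum(group) / len(group)
            for i in range(len(group)):
                # Each agent moves a little toward its own group's consensus.
                group[i] += rate * (mean - group[i])
    return [sum(group) / len(group) for group in groups]

if __name__ == "__main__":
    # Typically prints two different settled beliefs, one per sub-group.
    print("group means:", simulate())
```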

Bayesian decision theory allows these failures of rationality to be described as part of a statistically optimized system for decision-making. Experiments and computational models in multisensory integration have shown that sensory input from different senses is integrated in a statistically optimal way. [17] In addition, it appears that the kind of inference used to infer a single source for multiple sensory inputs uses a Bayesian inference about the causal origin of the sensory stimuli. [18] As such, it appears neurobiologically plausible that the brain implements decision-making procedures that are close to optimal for Bayesian inference.
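For two noisy estimates $s_1$ and $s_2$ of the same quantity, with variances $\sigma_1^2$ and $\sigma_2^2$, the statistically optimal (maximum-likelihood) combined estimate weights each cue by its reliability – the scheme tested in the visual–haptic experiments cited above [17]:

$$\hat{s} = w_1 s_1 + w_2 s_2, \qquad w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}$$

The variance of the combined estimate, $(1/\sigma_1^2 + 1/\sigma_2^2)^{-1}$, is lower than that of either cue alone.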

Brocas and Carrillo propose a model for making decisions based on noisy sensory inputs: [19] beliefs about the state of the world are modified by Bayesian updating, and decisions are then made once beliefs pass a threshold. They show that this model, when optimized for single-step decision-making, produces belief anchoring and polarization of opinions – exactly as described in the global warming controversy context – in that, despite identical evidence being presented, the pre-existing beliefs (or the evidence presented first) have an overwhelming effect on the beliefs formed. In addition, the preferences of the agent (the particular rewards that they value) also cause the beliefs formed to change – this explains the biased assimilation (also known as confirmation bias) described above. The model allows the production of controversy to be seen as a consequence of a decision-maker optimized for single-step decision-making, rather than a result of the limited reasoning in Daniel Kahneman's bounded rationality.
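A minimal sketch of this kind of mechanism (an illustration of Bayesian updating with a decision threshold written for this article, not the authors' published model; the prior, likelihoods, and threshold are assumed values) shows how committing to a conclusion once beliefs cross a threshold lets early evidence dominate, so that the same evidence presented in different orders can yield opposite conclusions:

```python
def bayes_update(prior, signal, p_signal_given_h=0.7):
    """One step of Bayesian updating for a binary hypothesis H, given a binary
    signal that is more likely (p = 0.7) when H is true than when it is false."""
    like_h = p_signal_given_h if signal else 1 - p_signal_given_h
    like_not_h = (1 - p_signal_given_h) if signal else p_signal_given_h
    numerator = like_h * prior
    return numerator / (numerator + like_not_h * (1 - prior))

def decide(signals, prior=0.5, threshold=0.9):
    """Update beliefs signal by signal and commit as soon as the posterior for
    H (or for not-H) crosses the threshold; later signals are then ignored,
    so the evidence presented first can dominate the belief that is formed."""
    belief = prior
    for s in signals:
        belief = bayes_update(belief, s)
        if belief >= threshold:
            return "accept H", round(belief, 3)
        if belief <= 1 - threshold:
            return "reject H", round(belief, 3)
    return "undecided", round(belief, 3)

# The same mixed evidence, ordered differently, produces opposite commitments.
print(decide([1, 1, 1, 1, 0, 0, 0, 0]))  # early positives -> "accept H"
print(decide([0, 0, 0, 0, 1, 1, 1, 1]))  # early negatives -> "reject H"
```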


Related Research Articles

Cognitive bias – Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to calculate a probability of a hypothesis, given prior evidence, and update it as more information becomes available. Fundamentally, Bayesian inference uses a prior distribution to estimate posterior probabilities. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
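The core update rule, for a hypothesis $H$ and evidence $E$:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

where $P(H)$ is the prior probability of the hypothesis, $P(E \mid H)$ the likelihood of the evidence under that hypothesis, and $P(H \mid E)$ the posterior probability once the evidence is taken into account.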

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.

A heuristic or heuristic technique is any approach to problem solving that employs a pragmatic method that is not fully optimized, perfected, or rationalized, but is nevertheless "good enough" as an approximation or attribute substitution. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Heuristic reasoning is often based on induction, or on analogy ... Induction is the process of discovering general laws  ... Induction tries to find regularity and coherence ... Its most conspicuous instruments are generalization, specialization, analogy. [...] Heuristic discusses human behavior in the face of problems [... that have been] preserved in the wisdom of proverbs.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.

Behavioral economics – Academic discipline

Behavioral economics is the study of the psychological factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by traditional economic theory.

Decision-making – Cognitive process to choose a course of action or belief

In psychology, decision-making is regarded as the cognitive process resulting in the selection of a belief or a course of action among several possible alternative options. It could be either rational or irrational. The decision-making process is a reasoning process based on assumptions of values, preferences and beliefs of the decision-maker. Every decision-making process produces a final choice, which may or may not prompt action.

Decision theory – Branch of applied probability theory

Decision theory or the theory of rational choice is a branch of probability, economics, and analytic philosophy that uses the tools of expected utility and probability to model how individuals would behave rationally under uncertainty. It differs from the cognitive and behavioral sciences in that it is mainly prescriptive and concerned with identifying optimal decisions for a rational agent, rather than describing how people actually make decisions. Despite this, the field is important to the study of real human behavior by social scientists, as it lays the foundations for the rational agent models used to mathematically model and analyze individuals in fields such as sociology, economics, criminology, cognitive science, and political science.
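In its standard form, a rational agent facing uncertainty over states of the world $s$ chooses the action $a^*$ that maximizes expected utility, where $P(s)$ is the probability of state $s$ and $U(a, s)$ the utility of taking action $a$ in that state:

$$a^* = \arg\max_a \sum_s P(s)\, U(a, s)$$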

A status quo bias or default bias is a cognitive bias which results from a preference for the maintenance of one's existing state of affairs. The current baseline is taken as a reference point, and any change from that baseline is perceived as a loss or gain. Relative to the available alternatives, this current baseline or default option is perceived and evaluated by individuals as a positive.

The cultural theory of risk, often referred to simply as Cultural Theory, consists of a conceptual framework and an associated body of empirical studies that seek to explain societal conflict over risk. Whereas other theories of risk perception stress economic and cognitive influences, Cultural Theory asserts that structures of social organization endow individuals with perceptions that reinforce those structures in competition against alternative ones. This theory was first elaborated in the book Natural Symbols, written by anthropologist Mary Douglas in 1970. Douglas later worked closely with the political scientist Aaron Wildavsky, to clarify the theory. Cultural Theory has given rise to a diverse set of research programs that span multiple social science disciplines and that have in recent years been used to analyze policymaking conflicts generally.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

The cultural cognition of risk, sometimes called simply cultural cognition, is the hypothesized tendency to perceive risks and related facts in relation to personal values. Research examining this phenomenon draws on a variety of social science disciplines including psychology, anthropology, political science, sociology, and communications. The stated objectives of this research are both to understand how values shape political conflict over facts and to promote effective deliberative strategies for resolving such conflicts consistent with sound empirical data.

In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.

Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

In cognitive psychology and decision science, conservatism or conservatism bias is a bias which refers to the tendency to revise one's belief insufficiently when presented with new evidence. This bias describes human belief revision in which people over-weigh the prior distribution and under-weigh new sample evidence when compared to Bayesian belief-revision.
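A worked example (the numbers are illustrative only): suppose the prior probability of a hypothesis is 0.5 and a new piece of evidence is four times as likely if the hypothesis is true as if it is false. Bayes' theorem gives a posterior of $(0.5 \times 4)/(0.5 \times 4 + 0.5 \times 1) = 0.8$, whereas a conservative reasoner reports a revised belief noticeably closer to the original 0.5.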

The free energy principle is a theoretical framework suggesting that the brain reduces surprise or uncertainty by making predictions based on internal models and updating them using sensory input. It highlights the brain's objective of aligning its internal model and the external world to enhance prediction accuracy. This principle integrates Bayesian inference with active inference, where actions are guided by predictions and sensory feedback refines them. It has wide-ranging implications for comprehending brain function, perception, and action.

Ecological rationality is a particular account of practical rationality, which in turn specifies the norms of rational action – what one ought to do in order to act rationally. The presently dominant account of practical rationality in the social and behavioral sciences such as economics and psychology, rational choice theory, maintains that practical rationality consists in making decisions in accordance with some fixed rules, irrespective of context. Ecological rationality, in contrast, claims that the rationality of a decision depends on the circumstances in which it takes place, so as to achieve one's goals in this particular context. What is considered rational under the rational choice account thus might not always be considered rational under the ecological rationality account. Overall, rational choice theory puts a premium on internal logical consistency whereas ecological rationality targets external performance in the world. The term ecologically rational is only etymologically similar to the biological science of ecology.

The gateway belief model (GBM) suggests that public perception of the degree of expert or scientific consensus on an issue functions as a so-called "gateway" cognition. Perception of scientific agreement is suggested to be a key step towards acceptance of related beliefs. Increasing the perception that there is normative agreement within the scientific community can increase individual support for an issue. A perception of disagreement may decrease support for an issue.

Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

References

  1. "EFF Quotes Collection 19.6". Electronic Frontier Foundation. 2001-04-09. Archived from the original on 2007-09-29. Retrieved 2016-12-04.
  2. "Quotations: Computer Laws". SysProg. Archived from the original on 2008-08-22. Retrieved 2007-03-10.
  3. Ungar, S. (2000). "Knowledge, ignorance and the popular culture: climate change versus the ozone hole". Public Understanding of Science. 9 (3): 297–312. doi:10.1088/0963-6625/9/3/306. S2CID   7089937.
  4. Pidgeon, N.; B. Fischhoff (2011). "The role of social and decision sciences in communicating uncertain climate risks". Nature Climate Change. 1 (1): 35–41. Bibcode:2011NatCC...1...35P. doi:10.1038/nclimate1080. S2CID   85362091.
  5. Kahan, Dan M.; Maggie Wittlin; Ellen Peters; Paul Slovic; Lisa Larrimore Ouellette; Donald Braman; Gregory N. Mandel (2011). "The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change". doi:10.2139/ssrn.1871503. hdl:1794/22097. S2CID 73649608. SSRN 1871503.
  6. Kahneman, Daniel (2003-12-01). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). The American Economic Review. 93 (5): 1449–1475. CiteSeerX   10.1.1.194.6554 . doi:10.1257/000282803322655392. ISSN   0002-8282. JSTOR   3132137. Archived from the original (PDF) on 2018-02-19. Retrieved 2017-10-24.
  7. Tversky, A.; D. Kahneman (1974). "Judgment under uncertainty: Heuristics and biases". Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID   17835457. S2CID   143452957. Archived from the original on 2018-06-01. Retrieved 2017-08-30.
  8. Joireman, Jeff; Heather Barnes Truelove; Blythe Duell (December 2010). "Effect of outdoor temperature, heat primes and anchoring on belief in global warming". Journal of Environmental Psychology. 30 (4): 358–367. doi:10.1016/j.jenvp.2010.03.004. ISSN   0272-4944.
  9. Saul, Stephanie; Andrew Pollack (2007-02-17). "Furor on Rush to Require Cervical Cancer Vaccine". The New York Times. ISSN   0362-4331 . Retrieved 2011-11-26.
  10. Kahan, Dan M.; Donald Braman; Geoffrey L. Cohen; Paul Slovic; John Gastil (2008-07-15). "Who Fears the HPV Vaccine, Who Doesn't, and Why? An Experimental Study of the Mechanisms of Cultural Cognition". Law and Human Behavior. SSRN   1160654.
  11. Lord, Charles G.; Lee Ross; Mark R. Lepper (1979). "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence". Journal of Personality and Social Psychology. 37 (11): 2098–2109. CiteSeerX   10.1.1.372.1743 . doi:10.1037/0022-3514.37.11.2098. ISSN   0022-3514.
  12. Hovland, Carl I.; Weiss, Walter (1951-12-21). "The Influence of Source Credibility on Communication Effectiveness". Public Opinion Quarterly. 15 (4): 635–650. doi:10.1086/266350.
  13. Braman, Donald; James Grimmelmann; Dan M. Kahan (20 July 2007). "Modeling Cultural Cognition". Social Justice Research. SSRN 1000449.
  14. Fremling, G.M.; J.R. Lott Jr (2002). "The Surprising Finding That Cultural Worldviews Don't Explain People's Views on Gun Control". U. Pa. L. Rev. 151 (4): 1341–1348. doi:10.2307/3312932. JSTOR 3312932.
  15. Ayres, I.; J.J. Donohue III (2002). Shooting down the more guns, less crime hypothesis. National Bureau of Economic Research.
  16. Lee, M.D.; M. Steyvers; M. de Young; B.J. Miller. "A Model-Based Approach to Measuring Expertise in Ranking Tasks".
  17. Ernst, Marc O.; Martin S. Banks (2002-01-24). "Humans integrate visual and haptic information in a statistically optimal fashion". Nature. 415 (6870): 429–433. Bibcode:2002Natur.415..429E. doi:10.1038/415429a. ISSN   0028-0836. PMID   11807554. S2CID   47459.
  18. Wozny, D.R.; U.R. Beierholm; L. Shams (2008). "Human trimodal perception follows optimal statistical inference". Journal of Vision. 8 (3): 24.1–11. doi:10.1167/8.3.24. PMID 18484830.
  19. Brocas, Isabelle; Juan D. Carrillo (2012). "From perception to action: An economic model of brain processes". Games and Economic Behavior. 75: 81–103. doi:10.1016/j.geb.2011.10.001. ISSN   0899-8256.