Adversarial collaboration

In science, adversarial collaboration is a form of collaboration in which researchers holding opposing views work together to jointly advance knowledge of the area under dispute. It can take the form of a scientific experiment conducted by two groups of experimenters with competing hypotheses, who aim to construct and implement an experimental design in a way that satisfies both groups that there are no obvious biases or weaknesses in it.[1] Adversarial collaboration can involve a neutral moderator[2] and lead to a co-designed experiment and joint publication of findings in order to resolve differences.[3] With its emphasis on transparency throughout the research process, adversarial collaboration has been described as sitting within the open science framework.[4]

History

One of the earliest modern examples of adversarial collaboration was a 1988 collaboration between Erez and Latham, with Edwin Locke acting as a neutral third party. The collaboration arose from a disagreement in goal-setting research between Erez and Latham over the effect of participation on goal commitment and performance. Latham and Erez designed four experiments that explained the differences between their individual findings, but they did not coin the term adversarial collaboration.[2] Independently, and unaware of the work of Erez, Locke and Latham,[5] Daniel Kahneman developed a similar protocol around ten years later and may have been the first to use the term adversarial collaboration.[6] More recently, Clark and Tetlock have proposed adversarial collaboration as a vehicle for improving scientific self-correction: exploring rival hypotheses together will ultimately expose false claims.[7] Their work led the University of Pennsylvania School of Arts & Sciences to create the Adversarial Collaboration Project,[8] which seeks to encourage the use of adversarial collaboration as a research approach to address a variety of research questions.[9]

Benefits

Adversarial collaboration has been recommended by Daniel Kahneman[10] and others as a way of reducing the distorting impact of cognitive-motivational biases on human reasoning[11] and of resolving contentious issues in fringe science.[12] It has also been recommended as a potential way to improve academic commentaries.[13]

Philip Tetlock and Gregory Mitchell have discussed it in various articles. They argue:

Adversarial collaboration is most feasible when least needed: when the clashing camps have advanced testable theories, subscribe to common canons for testing those theories, and disagreements are robust but respectful. And adversarial collaboration is least feasible when most needed: when the scientific community lacks clear criteria for falsifying points of view, disagrees on key methodological issues, relies on second- or third-best substitute methods for testing causality, and is fractured into opposing camps that engage in ad hominem posturing and that have intimate ties to political actors who see any concession as weakness.[14]

References

  1. Arts and Sciences, Penn. "Adversarial Collaboration Project". Adversarial Collaboration Project. Archived from the original on 2021-09-15. Retrieved 7 Jan 2022.
  2. Latham, Gary P.; Erez, Miriam; Locke, Edwin A. (1988). "Resolving scientific disputes by the joint design of crucial experiments by the antagonists: Application to the Erez–Latham dispute regarding participation in goal setting". Journal of Applied Psychology. 73 (4): 753–772. doi:10.1037/0021-9010.73.4.753. ISSN 1939-1854.
  3. Locke, Edwin A.; Latham, Gary P.; Erez, Miriam (1988). "The Determinants of Goal Commitment". The Academy of Management Review. 13 (1): 23. doi:10.2307/258352. JSTOR   258352.
  4. Rakow, Tim (2022), O'Donohue, William; Masuda, Akihiko; Lilienfeld, Scott (eds.), "Adversarial Collaboration", Avoiding Questionable Research Practices in Applied Psychology, Cham: Springer International Publishing, pp. 359–377, doi:10.1007/978-3-031-04968-2_16, ISBN   978-3-031-04968-2 , retrieved 2023-06-20
  5. "Adversarial Collaboration: An EDGE Lecture by Daniel Kahneman | Edge.org". www.edge.org. Retrieved 2023-06-20.
  6. Berger, Michele W.; University of Pennsylvania. "In the pursuit of scientific truth, working with adversaries can pay off". phys.org. Retrieved 2023-06-20.
  7. Clark, Cory J.; Costello, Thomas; Mitchell, Gregory; Tetlock, Philip E. (March 2022). "Keep your enemies close: Adversarial collaborations will improve behavioral science". Journal of Applied Research in Memory and Cognition. 11 (1): 1–18. doi:10.1037/mac0000004. ISSN   2211-369X. S2CID   248441364.
  8. "In the pursuit of scientific truth, working with adversaries can pay off". Penn Today. 2022-07-07. Retrieved 2023-06-20.
  9. "Research | Adversarial Collaboration Project". web.sas.upenn.edu. Retrieved 2023-06-20.
  10. Kahneman, Daniel; Klein, Gary (2009). "Conditions for intuitive expertise: A failure to disagree". American Psychologist. 64 (6): 515–526. doi:10.1037/a0016755.
  11. Clark, C. J.; Tetlock, P. E. (2022). "Adversarial collaboration: The next science reform". New York: Springer. pp. 2–3.
  12. Wagenmakers, E.-J., Wetzels, R., Borsboom, D., & van der Maas, H. L. J. (2010). Why psychologists must change the way they analyze their data: The case of psi. Archived 2011-01-20 at the Wayback Machine
  13. Heyman, Tom; Moors, Pieter; Rabagliati, Hugh (2020). "The benefits of adversarial collaboration for commentaries". Nature Human Behaviour. 4 (12): 1217. doi:10.1038/s41562-020-00978-6. hdl: 1887/3188822 . ISSN   2397-3374. PMID   33106628. S2CID   225083325.
  14. Tetlock, Philip; Mitchell, Gregory (2009). "Implicit Bias and Accountability Systems: What Must Organizations Do to Prevent Discrimination?". Research in Organizational Behavior. 29: 3–38.