Cherry picking

Cherry-picking is often used in science denial such as climate change denial. For example, by deliberately selecting a convenient time period, here 1998–2012, an artificial "pause" can be created even when there is an ongoing warming trend. The same problem could occur with the zoomed-out portion of the graph: if the data from before 1880 had moved in an unexpected direction, that would constitute another (unintentional) cherry-picking fallacy. Furthermore, the temperature baseline is an average over 1951–1980, a relatively short span of time, so the true long-term average could be far different.
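The mechanism behind such an artificial "pause" can be sketched with synthetic data (the numbers below are invented for illustration, not real climate measurements): fitting a trend line over a window that begins at an unusually warm year makes the slope look much flatter than the underlying trend.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )

# Invented series: a steady 0.02-per-year upward trend, 1980-2020,
# with one unusually warm year at 1998.
years = list(range(1980, 2021))
temps = [0.02 * (y - 1980) for y in years]
temps[years.index(1998)] += 0.5

full = slope(years, temps)  # close to the true trend of 0.02
window = [(y, t) for y, t in zip(years, temps) if 1998 <= y <= 2012]
win = slope([y for y, _ in window], [t for _, t in window])
# Starting the window at the warm outlier cuts the apparent slope by
# more than half, suggesting a "pause" that is not in the data.
```

The cherry-picked window is not fabricated data; every point in it is real (within the synthetic series). The distortion comes entirely from where the window starts and ends.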

Cherry picking, suppressing evidence, or the fallacy of incomplete evidence is the act of pointing to individual cases or data that seem to confirm a particular position while ignoring a significant portion of related and similar cases or data that may contradict that position. Cherry picking may be committed intentionally or unintentionally. [2]

The term is based on the perceived process of harvesting fruit, such as cherries. The picker would be expected to select only the ripest and healthiest fruits. An observer who sees only the selected fruit may thus wrongly conclude that most, or even all, of the tree's fruit is in similarly good condition. This can give a false impression of the fruit's overall quality, since the sample is not representative. A concept sometimes confused with cherry picking is the idea of gathering only the fruit that is easy to harvest, while ignoring other fruit higher up on the tree that is more difficult to obtain (see low-hanging fruit).

Cherry picking has a negative connotation as the practice neglects, overlooks or directly suppresses evidence that could lead to a complete picture.

Cherry picking can be found in many logical fallacies. For example, the "fallacy of anecdotal evidence" tends to overlook large amounts of data in favor of data known personally, "selective use of evidence" rejects material unfavorable to an argument, and a false dichotomy picks only two options when more are available. Some scholars classify cherry-picking as a fallacy of selective attention, the most common example of which is confirmation bias. [3] Cherry picking can also refer to the selection of data or data sets so that a study or survey will give desired, predictable results, which may be misleading or even completely contrary to reality. [4]

History

A story about the 5th-century BCE atheist philosopher Diagoras of Melos tells how, when shown the votive gifts of people who had supposedly escaped death by shipwreck by praying to gods, he pointed out that many people had died at sea in spite of their prayers, yet these cases were not likewise commemorated. [5] Michel de Montaigne (1533–1592), in his essay on prophecies, comments on people willing to believe in the validity of supposed seers:

I see some who are mightily given to study and comment upon their almanacs, and produce them to us as an authority when anything has fallen out pat; and, for that matter, it is hardly possible but that these alleged authorities sometimes stumble upon a truth amongst an infinite number of lies. ... I think never the better of them for some such accidental hit. ... [N]obody records their flimflams and false prognostics, forasmuch as they are infinite and common; but if they chop upon one truth, that carries a mighty report, as being rare, incredible, and prodigious. [6]

In science

Cherry picking is one of the epistemological characteristics of denialism and is widely used by science denialists to seemingly contradict scientific findings. For example, it is used in climate change denial, in evolution denial by creationists, and in denial of the negative health effects of consuming tobacco products and of passive smoking. [1]

Choosing to make selective choices among competing evidence, so as to emphasize those results that support a given position, while ignoring or dismissing any findings that do not support it, is a practice known as "cherry picking" and is a hallmark of poor science or pseudo-science. [7]

Richard Somerville, Testimony before the US House of Representatives Committee on Energy and Commerce Subcommittee on Energy and Power, March 8, 2011

Rigorous science looks at all the evidence (rather than cherry picking only favorable evidence), controls for variables so as to identify what is actually working, uses blinded observations so as to minimize the effects of bias, and uses internally consistent logic. [8]

Steven Novella, "A Skeptic In Oz", April 26, 2011

In medicine

In a 2002 study, a review of previous medical data found cherry picking in tests of anti-depression medication:

[researchers] reviewed 31 antidepressant efficacy trials to identify the primary exclusion criteria used in determining eligibility for participation. Their findings suggest that patients in current antidepressant trials represent only a minority of patients treated in routine clinical practice for depression. Excluding potential clinical trial subjects with certain profiles means that the ability to generalize the results of antidepressant efficacy trials lacks empirical support, according to the authors. [9]

In argumentation

In argumentation, the practice of "quote mining" is a form of cherry picking, [7] in which the debater selectively picks some quotes supporting a position (or exaggerating an opposing position) while ignoring those that moderate the original quote or put it into a different context. Cherry picking in debates is a significant problem because the individual facts cited may be true yet misleading when stripped of context. Because research cannot be done live and corrections often come late, cherry-picked facts or quotes tend to stick in public discourse and, even when corrected, lead to lasting misrepresentation of the groups targeted.

One-sided argument

A one-sided argument (also known as card stacking, stacking the deck, ignoring the counterevidence, slanting, and suppressed evidence) [10] is an informal fallacy that occurs when only the reasons supporting a proposition are supplied, while all reasons opposing it are omitted.

Philosophy professor Peter Suber has written:

The one-sidedness fallacy does not make an argument invalid. It may not even make the argument unsound. The fallacy consists in persuading readers, and perhaps ourselves, that we have said enough to tilt the scale of evidence and therefore enough to justify a judgment. If we have been one-sided, though, then we haven't yet said enough to justify a judgment. The arguments on the other side may be stronger than our own. We won't know until we examine them. So the one-sidedness fallacy doesn't mean that your premises are false or irrelevant, only that they are incomplete. [11]

With rational messages, you need to decide whether to use a one-sided argument or a two-sided argument. A one-sided argument presents only the pro side of the argument, while a two-sided argument presents both sides. Which one you use depends on your needs and the type of audience. Generally, one-sided arguments work better with audiences already favorable to your message. Two-sided arguments are best with audiences who oppose your argument, are better educated, or have already been exposed to counterarguments.[citation needed]

Card stacking is a propaganda technique that seeks to manipulate audience perception of an issue by emphasizing one side and repressing another. [12] Such emphasis may be achieved through media bias or the use of one-sided testimonials, or by simply censoring the voices of critics. The technique is commonly used in persuasive speeches by political candidates to discredit their opponents and to make themselves seem more worthy. [13]

The term originates from the magician's gimmick of "stacking the deck", which involves presenting a deck of cards that appears to have been randomly shuffled but which is, in fact, 'stacked' in a specific order. The magician knows the order and is able to control the outcome of the trick. In poker, cards can be stacked so that certain hands are dealt to certain players. [14]

The phenomenon can be applied to any subject and has wide applications. Whenever a broad spectrum of information exists, appearances can be rigged by highlighting some facts and ignoring others. Card stacking can be a tool of advocacy groups or of those groups with specific agendas. [15] For example, an enlistment poster might focus upon an impressive picture, with words such as "travel" and "adventure", while placing the words, "enlist for two to four years" at the bottom in a smaller and less noticeable point size. [16]


References

  1. Hansson, Sven Ove (2017). "Science denial as a form of pseudoscience". Studies in History and Philosophy of Science. 63: 39–47. doi:10.1016/j.shpsa.2017.05.002.
  2. Klass, Gary (c. 2008). "Just Plain Data Analysis: Common Statistical Fallacies in Analyses of Social Indicator Data" (PDF). Department of Politics and Government, Illinois State University. statlit.org. Archived from the original (PDF) on March 25, 2014. Retrieved March 25, 2014.
  3. "Fallacies | Internet Encyclopedia of Philosophy".
  4. Goldacre, Ben (2008). Bad Science. HarperCollins Publishers. pp. 97–99. ISBN 978-0-00-728319-4.
  5. Hecht, Jennifer Michael (2003). "Whatever Happened to Zeus and Hera?, 600 BCE–1 CE". Doubt: A History. Harper San Francisco. pp. 9–10. ISBN 0-06-009795-7.
  6. Michel de Montaigne (1877) [First French edition 1580]. "Chapter XI--Of Prognostications". Essays. Translated by Charles Cotton.
  7. 1 2 "Devious deception in displaying data: Cherry picking", Science or Not, April 3, 2012, retrieved 16 February 2015
  8. Novella, Steven (26 April 2011). "A Skeptic In Oz". Science-Based Medicine. Retrieved 16 February 2015.
  9. "Typical Depression Patients Excluded from Drug Trials; exclusion criteria: is it 'cherry pickin'?". The Brown University Psychopharmacology Update. Wiley Periodicals. 13 (5): 1–3. May 2002. ISSN   1068-5308. Based on the studies:
    • Posternak, MA; Zimmerman, M; Keitner, GI; Miller, IW (February 2002). "A reevaluation of the exclusion criteria used in antidepressant efficacy trials". The American Journal of Psychiatry. 159 (2): 191–200. doi:10.1176/appi.ajp.159.2.191. PMID   11823258.
    • Zimmerman, M; Mattia, JI; Posternak, MA (March 2002). "Are subjects in pharmacological treatment trials of depression representative of patients in routine clinical practice?". The American Journal of Psychiatry. 159 (3): 469–73. doi:10.1176/appi.ajp.159.3.469. PMID   11870014.
  10. "One-Sidedness - The Fallacy Files" . Retrieved 14 October 2014.
  11. Peter Suber. "The One-Sidedness Fallacy" . Retrieved 25 September 2012.
  12. Institute for Propaganda Analysis (1939). The Fine Art of Propaganda: A Study of Father Coughlin's Speeches. Harcourt Brace and Company. pp. 95–101. Retrieved November 24, 2010.
  13. C. S. Kim, John (1993). The art of creative critical thinking. University Press of America. pp. 317–318. ISBN 9780819188472. Retrieved November 24, 2010.
  14. Ruchlis, Hyman; Sandra Oddo (1990). Clear thinking: a practical introduction. Prometheus Books. pp. 195–196. ISBN 9780879755942. Retrieved November 24, 2010.
  15. James, Walene (1995). Immunization: the reality behind the myth, Volume 3. Greenwood Publishing Group. pp. 193–194. ISBN 9780897893596. Retrieved November 24, 2010.
  16. Shabo, Magedah (2008). Techniques of Propaganda and Persuasion. Prestwick House Inc. pp. 24–29. ISBN 9781580498746. Retrieved November 24, 2010.