Historian's fallacy


The historian's fallacy is an informal fallacy that occurs when one assumes that decision makers of the past viewed events from the same perspective and had the same information as those subsequently analyzing the decision. It is not to be confused with presentism, a similar but distinct mode of historical analysis in which present-day ideas (such as moral standards) are projected into the past. The idea was first articulated by the British literary critic Matthew Arnold in 1880 and was later named and defined by the American historian David Hackett Fischer in 1970.


Concept

The idea that a critic can make erroneous interpretations of past works because of knowledge of subsequent events was first articulated by Matthew Arnold.[1][2] In his 1880 essay "The Study of Poetry", he wrote:[3]

The course of development of a nation’s language, thought, and poetry, is profoundly interesting; and by regarding a poet’s work as a stage in this course of development we may easily bring ourselves to make it of more importance as poetry than in itself it really is, we may come to use a language of quite exaggerated praise in criticising it; in short, to overrate it. So arises in our poetic judgments the fallacy caused by the estimate which we may call historic.

The concept of the historian's fallacy was named and outlined in 1970 by David Hackett Fischer, who suggested it was analogous to William James's psychologist's fallacy. Fischer did not suggest that historians should refrain from retrospective analysis in their work, but he reminded historians that their subjects were not able to see into the future. As an example, he cited the well-known argument that Japan's surprise attack on Pearl Harbor should have been predictable in the United States because of the many indications that an attack was imminent. What this argument overlooks, says Fischer, citing the work of Roberta Wohlstetter, is that there were innumerable conflicting signs which suggested possibilities other than an attack on Pearl Harbor. Only in retrospect do the warning signs seem obvious; signs that pointed in other directions tend to be forgotten. (See also hindsight bias.)

In the field of military history, historians sometimes use what is known as the "fog of war technique" in hopes of avoiding the historian's fallacy. In this approach, the actions and decisions of the historical subject (such as a military commander) are evaluated primarily on the basis of what that person knew at the time, and not on future developments that the person could not have known. According to Fischer, this technique was pioneered by the American historian Douglas Southall Freeman in his influential biographies of Robert E. Lee and George Washington.

See also


The gambler's fallacy, also known as the Monte Carlo fallacy or the fallacy of the maturity of chances, is the mistaken belief that, if an event has occurred more frequently than expected, it is less likely to happen again in the future (or vice versa), even though the occurrences are independent of one another. The fallacy is commonly associated with gambling, where it may be believed, for example, that the next dice roll is more than usually likely to be six because there have recently been fewer than the expected number of sixes.
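
The independence that the fallacy ignores can be checked with a short simulation (a minimal sketch in Python with an invented setup, not part of the original text): even after ten rolls of a fair die produce no six, the next roll still comes up six about one time in six.

    import random

    random.seed(0)
    hits = trials = 0
    while trials < 20_000:
        recent = [random.randint(1, 6) for _ in range(10)]  # ten fair-die rolls
        if 6 not in recent:  # condition on a recent "drought" of sixes
            trials += 1
            hits += random.randint(1, 6) == 6  # is the next roll a six?
    print(hits / trials)  # about 0.167, i.e. 1/6; the drought changes nothing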


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Cultural bias is the interpretation and judgment of phenomena by the standards of one's own culture. It is sometimes considered a problem central to social and human sciences, such as economics, psychology, anthropology, and sociology. Some practitioners of these fields have attempted to develop methods and theories to compensate for or eliminate cultural bias.


A fallacy is the use of invalid or otherwise faulty reasoning in the construction of an argument that may appear well-reasoned if the flaw goes unnoticed. The term was introduced into the Western intellectual tradition by the Aristotelian De Sophisticis Elenchis.

Rationality is the quality of being guided by or based on reason. In this regard, a person acts rationally if they have a good reason for what they do or a belief is rational if it is based on strong evidence. This quality can apply to an ability, as in a rational animal, to a psychological process, like reasoning, to mental states, such as beliefs and intentions, or to persons who possess these other forms of rationality. A thing that lacks rationality is either arational, if it is outside the domain of rational evaluation, or irrational, if it belongs to this domain but does not fulfill its standards.

In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to future rational decision-making, people in everyday life often factor previous expenditures, such as money already spent repairing a car or house, into their future decisions about those properties.
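
A worked toy example (hypothetical figures, not from the text) shows why a sunk cost cannot change a rational ranking of options: it adds the same amount to every alternative, so it cancels out of the comparison.

    # Hypothetical figures: $4,000 already spent on a car (sunk), now choosing
    # between a further repair and a replacement, assuming both options leave
    # the owner equally well off otherwise.
    sunk = 4000          # already paid; identical under both options
    repair_cost = 2500   # prospective cost of repairing again
    replace_cost = 2200  # prospective cost of replacing the car

    # The sunk cost appears in both totals, so the ranking depends only on
    # the prospective costs.
    assert (sunk + repair_cost > sunk + replace_cost) == (repair_cost > replace_cost)
    print("repair" if repair_cost < replace_cost else "replace")  # "replace"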

Behavioral economics is the study of the psychological, cognitive, emotional, cultural and social factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by classical economic theory.


In psychology, decision-making is regarded as the cognitive process resulting in the selection of a belief or a course of action among several possible alternative options. It could be either rational or irrational. The decision-making process is a reasoning process based on assumptions of values, preferences and beliefs of the decision-maker. Every decision-making process produces a final choice, which may or may not prompt action.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, is the common tendency for people to perceive past events as having been more predictable than they were.


In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to "see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances". In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.

In literary and historical analysis, presentism is a term for the introduction of present-day ideas and perspectives into depictions or interpretations of the past. Some modern historians seek to avoid presentism in their work because they consider it a form of cultural bias, and believe it creates a distorted understanding of their subject matter. The practice of presentism is regarded by some as a common fallacy when writing about the past.

"Three men make a tiger" is a Chinese proverb or chengyu. "Three men make a tiger" refers to an individual's tendency to accept absurd information as long as it is repeated by enough people. It refers to the idea that if an unfounded premise or urban legend is mentioned and repeated by many individuals, the premise will be erroneously accepted as the truth. This concept is related to communal reinforcement or the fallacy of argumentum ad populum and argumentum ad nauseam.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
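
As a sketch of the first sense, overestimation, one can compare a person's average stated confidence with their actual accuracy on a set of quiz answers (the data below are invented for illustration):

    # Invented data: (stated confidence, whether the answer was correct)
    answers = [(0.90, True), (0.90, False), (0.80, False), (0.95, True),
               (0.85, False), (0.90, True), (0.80, True), (0.95, False)]

    mean_confidence = sum(c for c, _ in answers) / len(answers)
    accuracy = sum(ok for _, ok in answers) / len(answers)
    print(f"confidence {mean_confidence:.2f} vs accuracy {accuracy:.2f}")
    # confidence 0.88 vs accuracy 0.50 -> overestimation of one's performance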

The outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. Specifically, the outcome effect occurs when the same "behavior produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance."

The furtive fallacy is an informal fallacy of emphasis in which historical outcomes are asserted to be the result of hidden misconduct or wrongdoing by decision makers. Historian David Hackett Fischer identified it as the belief that significant facts of history are necessarily sinister, and that "history itself is a story of causes mostly insidious and results mostly invidious." Although it may lead to a conspiracy theory, the fallacy itself consists in the assumption that misdeeds lurk behind every page of history. In its extreme form, the fallacy represents general paranoia.

Heuristics are mental shortcuts that humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

References

  1. Arp, Robert, ed. (2013). 1001 Ideas That Changed the Way We Think. Atria Books. p. 555. ISBN 978-1476705729. Retrieved 15 February 2015.
  2. S. N. Radhika Lakshmi. "Matthew Arnold as a Literary Critic". Literature-Study-Online. Retrieved 26 December 2014.
  3. Matthew Arnold. "The Study of Poetry". Bartleby. Retrieved 26 December 2014.
