Anecdotal value

In communication studies, science communication, psycholinguistics and choice theory, anecdotal value refers to the primarily social and political value of an anecdote or anecdotal evidence in promoting understanding of a social, cultural, or economic phenomenon. While anecdotal evidence is typically unscientific, in the last several decades the evaluation of anecdotes has received sustained academic scrutiny from economists and scholars such as Felix Salmon,[1] S. G. Checkland (on David Ricardo), Steven Novella, R. Charleton, Hollis Robbins, Kwamena Kwansah-Aidoo, and others. These academics seek to quantify the value of using anecdotes, for example in promoting public awareness of a disease. More recently, economists studying choice models have begun assessing anecdotal value in the context of framing; Daniel Kahneman and Amos Tversky suggest that choice models may be contingent on stories or anecdotes that frame or influence choice.[2] As an example, consider the quote widely misattributed to Joseph Stalin: "The death of one man is a tragedy, the death of millions is a statistic."[3][4]

Notes

  1. Reuters Viral as Anecdotal
  2. Freymuth, Angela K.; Ronan, George F. (1 September 2004). "Modeling Patient Decision-Making: The Role of Base-Rate and Anecdotal Information". Journal of Clinical Psychology in Medical Settings. 11 (3): 211–216. doi:10.1023/B:JOCS.0000037615.23350.f3. hdl:2027.42/44856.
  3. Solovyova, Julia (October 28, 1997). "Mustering Most Memorable Quips", Archived 2008-05-04 at the Wayback Machine, The Moscow Times. States: "Russian historians have no record of the lines, 'Death of one man is a tragedy. Death of a million is a statistic,' commonly attributed by English-language dictionaries to Josef Stalin." Discussion of the book by Konstantin Dushenko, Dictionary of Modern Quotations (Словарь современных цитат: 4300 ходячих цитат и выражений ХХ века, их источники, авторы, датировка). See also Joseph Stalin in Wikiquote.
  4. Quote Investigator (May 10, 2010). "A Single Death is a Tragedy; a Million Deaths is a Statistic". Retrieved November 9, 2017.

Related Research Articles

The term Homo economicus, or economic man, is the portrayal of humans as agents who are consistently rational and narrowly self-interested, and who pursue their subjectively defined ends optimally. It is a wordplay on Homo sapiens, used in some economic theories and in pedagogy.

Daniel Kahneman (Israeli-American psychologist and economist, 1934–2024)

Daniel Kahneman was an Israeli-American author, psychologist and economist notable for his work on hedonic psychology and the psychology of judgment and decision-making. He is also known for his work in behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. His empirical findings challenge the assumption of human rationality prevailing in modern economic theory.

Amos Tversky (Israeli psychologist, 1937–1996)

Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.

Behavioral economics is the study of the psychological, cognitive, emotional, cultural and social factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by classical economic theory.

Prospect theory (theory of behavioral economics)

Prospect theory is a theory of behavioral economics, judgment and decision making that was developed by Daniel Kahneman and Amos Tversky in 1979. The theory was cited in the decision to award Kahneman the 2002 Nobel Memorial Prize in Economics.
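
The value function at the heart of prospect theory can be sketched in a few lines. The functional form and the parameter estimates below (an exponent of about 0.88 and a loss-aversion coefficient of about 2.25) are the median estimates Tversky and Kahneman reported in their 1992 cumulative prospect theory paper; they are illustrative, not definitive.

```python
# Sketch of the prospect-theory value function. Parameters are Tversky and
# Kahneman's 1992 median estimates; values are measured relative to a
# reference point (gains and losses), not as final wealth.
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha          # concave over gains
    return -lam * (-x) ** alpha    # convex and steeper over losses

# Losses loom larger than equivalent gains:
print(value(100))   # roughly 57.5
print(value(-100))  # roughly -129.4, more than twice as large in magnitude
```

The asymmetry between `value(100)` and `value(-100)` is the loss aversion that distinguishes prospect theory from expected utility over final wealth.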

The representativeness heuristic is used when judging the probability of an event by how closely it resembles, in character and essence, a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s, who defined representativeness as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because the person's appearance and behavior are more representative of the stereotype of a poet than an accountant.
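
The poet/accountant judgment can be checked against Bayes' rule, which the representativeness heuristic effectively ignores by neglecting base rates. All of the numbers below are hypothetical, chosen only to show the mechanics: poets are assumed far rarer than accountants, while the eccentric appearance is assumed far more typical of poets.

```python
# Hypothetical base rates and likelihoods for the poet/accountant example.
p_poet = 0.001        # prior: probability a random person is a poet
p_accountant = 0.05   # prior: probability a random person is an accountant
p_look_given_poet = 0.5         # P(eccentric look and poetry book | poet)
p_look_given_accountant = 0.01  # P(eccentric look and poetry book | accountant)

# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
prior_odds = p_poet / p_accountant                        # 1 : 50 against
likelihood_ratio = p_look_given_poet / p_look_given_accountant  # 50 : 1 for
posterior_odds = prior_odds * likelihood_ratio
print(posterior_odds)  # close to 1: a toss-up, despite the vivid stereotype
```

Even a likelihood ratio of 50-to-1 in favor of "poet" only cancels the 50-to-1 base-rate disadvantage; judging by resemblance alone overweights the likelihood and drops the prior.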

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.
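
The probability law being violated is easy to state in code. The sketch below uses hypothetical numbers loosely inspired by Tversky and Kahneman's well-known "Linda" problem: for any events A and B, the probability of the conjunction can never exceed the probability of either conjunct.

```python
# The conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A).
# The numbers are hypothetical, chosen only for illustration.
p_bank_teller = 0.05            # P(A): Linda is a bank teller
p_feminist_given_teller = 0.60  # P(B | A): feminist, given she is a teller

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

# The conjunction can never be more probable than either conjunct,
# no matter how representative the combined description feels.
assert p_teller_and_feminist <= p_bank_teller
print(p_bank_teller, p_teller_and_feminist)
```

Subjects who rank "bank teller and active feminist" as more likely than "bank teller" are asserting the opposite inequality, which no assignment of probabilities can satisfy.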

Simulation heuristic (mental strategy)

The simulation heuristic is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as "near misses". The simulation heuristic was first theorized by psychologists Daniel Kahneman and Amos Tversky as a specialized adaptation of the availability heuristic to explain counterfactual thinking and regret. However, it is not the same as the availability heuristic. Specifically the simulation heuristic is defined as "how perceivers tend to substitute normal antecedent events for exceptional ones in psychologically 'undoing' this specific outcome."

The Allais paradox is a choice problem designed by Maurice Allais to show an inconsistency between actual observed choices and the predictions of expected utility theory. The paradox demonstrates that individuals' choices systematically violate the independence axiom of expected utility theory, which requires that an individual's preferences should not change when two lotteries are altered by equal proportions.
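
The inconsistency can be made concrete with the classic Allais lotteries (payoffs in millions; the exact figures vary across presentations). Under expected utility theory, the A-versus-B comparison and the C-versus-D comparison differ only by a common 89% component, so anyone preferring A to B must also prefer C to D, whatever their utility function. The square-root utility below is an arbitrary stand-in to illustrate this.

```python
# Sketch of the classic Allais lotteries; payoffs in millions.
def expected_utility(lottery, u):
    """Expected utility of a lottery given as [(probability, payoff), ...]."""
    return sum(p * u(x) for p, x in lottery)

A = [(1.00, 1)]                          # 1 for sure
B = [(0.10, 5), (0.89, 1), (0.01, 0)]    # small chance of nothing, big upside
C = [(0.11, 1), (0.89, 0)]
D = [(0.10, 5), (0.90, 0)]

u = lambda x: x ** 0.5  # any increasing utility with u(0) = 0 works here

# Under expected utility, EU(A) - EU(B) equals EU(C) - EU(D), because the
# two pairs differ only by a common 0.89 chance of the same outcome.
lhs = expected_utility(A, u) - expected_utility(B, u)
rhs = expected_utility(C, u) - expected_utility(D, u)
print(lhs, rhs)  # equal up to rounding
```

Most people nonetheless choose A over B (valuing certainty) and D over C, a pattern no single utility function can produce, which is the violation of the independence axiom.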

The disposition effect is an anomaly discovered in behavioral finance. It relates to the tendency of investors to sell assets that have increased in value, while keeping assets that have dropped in value.

In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.

Reference class forecasting or comparison class forecasting is a method of predicting the future by looking at similar past situations and their outcomes. The theories behind reference class forecasting were developed by Daniel Kahneman and Amos Tversky. The theoretical work helped Kahneman win the Nobel Memorial Prize in Economic Sciences.

Barbara Tversky is a professor emerita of psychology at Stanford University and a professor of psychology and education at Teachers College, Columbia University. Tversky specializes in cognitive psychology.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Heuristics are the mental shortcuts humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

The certainty effect is the psychological effect resulting from the reduction of probability from certain to probable. It is an idea introduced in prospect theory.

Thinking, Fast and Slow (2011 book by Daniel Kahneman)

Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.

Insensitivity to sample size is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without respect to the sample size. For example, in one study, subjects assigned the same probability to the likelihood of obtaining a mean height of above six feet [183 cm] in samples of 10, 100, and 1,000 men. In other words, variation is more likely in smaller samples, but people may not expect this.
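
The study's finding contradicts a basic fact about sampling distributions, which a short calculation makes vivid. The population parameters below are hypothetical (mean height 178 cm, standard deviation 7 cm); the point is only that the standard error of the mean shrinks with the square root of the sample size, so a sample mean above 183 cm becomes dramatically less likely as n grows.

```python
# The sampling distribution of the mean of n heights drawn from
# Normal(mu, sigma) is Normal(mu, sigma / sqrt(n)). Hypothetical parameters.
from math import sqrt
from statistics import NormalDist

def prob_mean_above(n, mu=178.0, sigma=7.0, threshold=183.0):
    """Probability that the mean of a sample of size n exceeds the threshold."""
    sampling_dist = NormalDist(mu, sigma / sqrt(n))
    return 1.0 - sampling_dist.cdf(threshold)

for n in (10, 100, 1000):
    print(f"n={n}: P(sample mean > 183 cm) = {prob_mean_above(n):.2e}")
```

With these assumed parameters the probability is on the order of 1% for a sample of 10 and vanishingly small for a sample of 1,000, far from the equal probabilities the subjects assigned.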

Extension neglect is a type of cognitive bias which occurs when the sample size is ignored when its determination is relevant. For instance, when reading an article about a scientific study, extension neglect occurs when the reader ignores the number of people involved in the study but still makes inferences about a population based on the sample. In reality, if the sample size is too small, the results risk errors in statistical hypothesis testing. A study based on only a few people may draw invalid conclusions because a single person with exceptionally high or low scores (an outlier) can skew the result, and there are not enough other participants to average this out. But often the sample size is not prominently displayed in science articles, and a reader exhibiting extension neglect may still accept the article's conclusion.

Eldar Shafir

Eldar Shafir is an American behavioral scientist and the co-author of Scarcity: Why Having Too Little Means So Much. He is the Class of 1987 Professor in Behavioral Science and Public Policy and Professor of Psychology and Public Affairs at Princeton University's Department of Psychology and the Princeton School of Public and International Affairs, and the inaugural director of Princeton's Kahneman-Treisman Center for Behavioral Science and Public Policy.
