Less-is-more effect

The less-is-more effect refers to the finding that heuristic decision strategies can yield more accurate judgments than alternative strategies that use more pieces of information. Understanding these effects is part of the study of ecological rationality.

Examples

One well-known less-is-more effect was found in comparing the take-the-best heuristic with a linear decision strategy in judging which of two objects has the higher value on some criterion. Whereas the linear strategy weighs and adds all available cues, the take-the-best heuristic uses only the first cue on which the objects differ. Despite this frugality, the heuristic yielded more accurate judgments than the linear decision strategy. [1]
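The contrast between the two strategies can be sketched in a few lines of Python; the cue profiles and weights below are hypothetical illustrations, not data from the cited study.

```python
def take_the_best(cues_a, cues_b):
    """Decide using only the first cue (ordered by validity) that differs."""
    for a, b in zip(cues_a, cues_b):
        if a != b:
            return "A" if a > b else "B"
    return None  # no cue discriminates: guess

def linear_rule(cues_a, cues_b, weights):
    """Weigh and add all cues, then compare the two totals."""
    score_a = sum(w * c for w, c in zip(weights, cues_a))
    score_b = sum(w * c for w, c in zip(weights, cues_b))
    if score_a == score_b:
        return None
    return "A" if score_a > score_b else "B"

# Hypothetical binary cue profiles, ordered from most to least valid.
a, b = [0, 1, 1], [1, 0, 0]
print(take_the_best(a, b))            # decides on the first cue alone: B
print(linear_rule(a, b, [1, 1, 1]))   # the later cues outvote the first: A
```

With equal weights the two strategies can disagree, as here; the less-is-more finding is that, across many real prediction problems, the frugal strategy was nevertheless the more accurate one out of sample.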

Beyond this first finding, less-is-more effects were found for other heuristics, including the recognition heuristic [2] and the hiatus heuristic. [3]

Explanations

Some less-is-more effects can be explained within the framework of bias and variance. According to the bias-variance tradeoff, prediction error stems from two sources. Consider a decision strategy that uses a random sample of objects to make a judgment about an object outside of this sample. Because of sampling variance, there is a large number of hypothetical predictions, each based on a different random sample. Bias refers to the difference between the average of these hypothetical predictions and the true value of the object to be judged. In contrast, variance refers to the average variation of the hypothetical predictions around their average. [4]

Determinants of variance

The variance component of judgment error depends on the degree to which the decision strategy adapts to each possible sample. One major determinant of this degree is a strategy's number of free parameters. Therefore, (heuristic) strategies that use fewer pieces of information and have fewer parameters tend to have lower error from variance than strategies with more parameters. [5]
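The effect of free parameters on variance, and its interaction with bias, can be illustrated with a small simulation; the environment, sample size, and the two strategies below (a one-parameter mean rule versus a two-parameter least-squares line) are hypothetical choices for illustration only.

```python
import random
import statistics

random.seed(1)

TRUE_SLOPE, TRUE_ICPT, NOISE = 0.2, 10.0, 3.0

def sample(n=10):
    """Draw a small training sample from a noisy, weakly linear environment."""
    xs = [random.uniform(0, 10) for _ in range(n)]
    ys = [TRUE_ICPT + TRUE_SLOPE * x + random.gauss(0, NOISE) for x in xs]
    return xs, ys

def fit_mean(xs, ys):
    """One free parameter: always predict the sample mean of y."""
    m = statistics.mean(ys)
    return lambda x: m

def fit_line(xs, ys):
    """Two free parameters: ordinary least-squares line."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return lambda x: my + slope * (x - mx)

# Collect each strategy's prediction at a fixed test point across many
# hypothetical training samples, then decompose the error.
x_test = 9.0
truth = TRUE_ICPT + TRUE_SLOPE * x_test
preds = {"mean": [], "line": []}
for _ in range(5000):
    xs, ys = sample()
    preds["mean"].append(fit_mean(xs, ys)(x_test))
    preds["line"].append(fit_line(xs, ys)(x_test))

results = {}
for name, p in preds.items():
    bias2 = (statistics.mean(p) - truth) ** 2
    var = statistics.pvariance(p)
    results[name] = (bias2, var)
    print(f"{name}: bias^2={bias2:.2f} variance={var:.2f} error={bias2 + var:.2f}")
```

In this environment the simpler rule is biased but has markedly lower variance, and its total error is smaller: a less-is-more effect.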

Determinants of bias

At the same time, fewer parameters tend to increase the error from bias, implying that heuristic strategies are more likely to be biased than strategies that use more pieces of information. The exact amount of bias, however, depends on the specific problem to which a decision strategy is applied. If the decision problem has a statistical structure that matches the structure of the heuristic strategy, the bias can be surprisingly small. For example, analyses of the take-the-best heuristic and other lexicographic heuristics have shown that the bias of these strategies is equal to the bias of the linear strategy when the weights of the linear strategy show specific regularities [6] [7] that were found to be prevalent in many real-life situations. [8]
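The equivalence for one such regularity, noncompensatory weights, can be verified exhaustively for binary cues; the weights below (each larger than the sum of all later ones, as in 4 > 2 + 1) are a hypothetical example.

```python
from itertools import product

def take_the_best(a, b):
    """Lexicographic choice on the first cue that discriminates."""
    for ca, cb in zip(a, b):
        if ca != cb:
            return "A" if ca > cb else "B"
    return None

def linear(a, b, w):
    """Weighted-sum choice over all cues."""
    sa = sum(wi * ci for wi, ci in zip(w, a))
    sb = sum(wi * ci for wi, ci in zip(w, b))
    return None if sa == sb else ("A" if sa > sb else "B")

weights = [4, 2, 1]  # noncompensatory: each weight exceeds the sum of all later ones

# Check every pair of binary cue profiles: the linear rule with these
# weights always picks the same object as the lexicographic heuristic.
agree = all(
    take_the_best(a, b) == linear(a, b, weights)
    for a in product([0, 1], repeat=3)
    for b in product([0, 1], repeat=3)
)
print(agree)  # → True
```

Because no combination of later cues can outweigh an earlier one, the first discriminating cue settles both strategies' choices, so their biases coincide.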

Related Research Articles

Cognitive bias: Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

A heuristic, or heuristic technique, is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Overfitting: Flaw in mathematical modelling

In mathematical modeling, overfitting is "the production of an analysis that corresponds too closely or exactly to a particular set of data, and may therefore fail to fit to additional data or predict future observations reliably". An overfitted model is a mathematical model that contains more parameters than can be justified by the data. In polynomial regression, for example, the number of free parameters grows with the degree of the polynomial. The essence of overfitting is to have unknowingly extracted some of the residual variation as if that variation represented underlying model structure.

The representativeness heuristic is used when making judgments about the probability of an event based on how closely it resembles a prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s, defined as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because the person's appearance and behavior are more representative of the stereotype of a poet.

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.

The recognition heuristic, originally termed the recognition principle, has been used as a model in the psychology of judgment and decision making and as a heuristic in artificial intelligence. The goal is to make inferences about a criterion that is not directly accessible to the decision maker, based on recognition retrieved from memory. This is possible if recognition of alternatives has relevance to the criterion. For two alternatives, the heuristic is defined as:

If one of two objects is recognized and the other is not, then infer that the recognized object has the higher value with respect to the criterion.
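For two alternatives, the rule can be written as a short function; the city-size example in the comment is a hypothetical illustration.

```python
def recognition_heuristic(recognized_a, recognized_b):
    """Infer that the recognized object has the higher criterion value."""
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return None  # both or neither recognized: the heuristic does not apply

# Hypothetical city-size comparison: only city A is recognized,
# so the heuristic infers that A is the larger city.
print(recognition_heuristic(True, False))  # → A
```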

Gerd Gigerenzer: German cognitive psychologist

Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making. Gigerenzer is director emeritus of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development and director of the Harding Center for Risk Literacy, both in Berlin.

Daniel G. Goldstein is an American cognitive psychologist known for the specification and testing of heuristics and models of bounded rationality in the field of judgment and decision making. He is an honorary research fellow at London Business School and works with Microsoft Research as a principal researcher.

In psychology, the take-the-best heuristic is a heuristic that decides between two alternatives based on the first cue that discriminates between them, where cues are ordered by cue validity. In the original formulation, the cues were assumed to be binary or to have unknown values. The logic of the heuristic is that it bases its choice on the best cue (reason) only and ignores the rest.

The gaze heuristic is a heuristic for guiding movement toward a goal by tracking a single variable; a classic example is catching a ball. It is one example of psychologist Gerd Gigerenzer's one-good-reason heuristics, by which humans and non-human animals can process large amounts of information quickly and react, regardless of whether the information is consciously processed.

Heuristics are the mental shortcuts by which humans arrive at decisions: simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Scarcity, in social psychology, works much like scarcity in economics: people must satisfy unlimited wants and needs with limited resources. Humans place a higher value on an object that is scarce and a lower value on one that is abundant; for example, diamonds are valued more highly than ordinary rocks because they are rarer. These perceptions of scarcity can lead to irregular consumer behavior, such as systematic errors or cognitive biases.

Bias–variance tradeoff: Property of a model

In statistics and machine learning, the bias–variance tradeoff describes the relationship between a model's complexity, the accuracy of its predictions, and how well it can predict previously unseen data that were not used to train the model. In general, as the number of tunable parameters in a model increases, the model becomes more flexible and can fit a training data set better, so it has lower bias. However, a more flexible model's fit varies more from one training sample to the next, so there is greater variance in the model's estimated parameters and predictions.

Heuristics are simple decision making strategies used to achieve a specific goal quickly and efficiently, and are commonly implemented in sports. Many sports require the ability to make fast decisions under time pressure, and the proper use of heuristics is essential for many of these decisions.

Social heuristics are simple decision making strategies that guide people's behavior and decisions in the social environment when time, information, or cognitive resources are scarce. Social environments tend to be characterised by complexity and uncertainty, and in order to simplify the decision-making process, people may use heuristics, which are decision making strategies that involve ignoring some information or relying on simple rules of thumb.

Ecological rationality is a particular account of practical rationality, which in turn specifies the norms of rational action – what one ought to do in order to act rationally. The presently dominant account of practical rationality in the social and behavioral sciences such as economics and psychology, rational choice theory, maintains that practical rationality consists in making decisions in accordance with some fixed rules, irrespective of context. Ecological rationality, in contrast, claims that the rationality of a decision depends on the circumstances in which it takes place, so as to achieve one's goals in this particular context. What is considered rational under the rational choice account thus might not always be considered rational under the ecological rationality account. Overall, rational choice theory puts a premium on internal logical consistency whereas ecological rationality targets external performance in the world. The term ecologically rational is only etymologically similar to the biological science of ecology.

In behavioural sciences, social rationality is a type of decision strategy used in social contexts, in which a set of simple rules is applied in complex and uncertain situations.

In the study of decision-making, a fast-and-frugal tree is a simple graphical structure that categorizes objects by asking one question at a time. These decision trees are used in a range of fields: psychology, artificial intelligence, and management science. Unlike other decision or classification trees, such as Leo Breiman's CART, fast-and-frugal trees are intentionally simple, both in their construction as well as their execution, and operate speedily with little information. For this reason, fast-and-frugal-trees are potentially attractive when designing resource-constrained tasks.
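The defining property, an immediate exit at every question, can be sketched as an ordered list of question-exit pairs; the cues, exit labels, and default below are hypothetical.

```python
def fast_and_frugal_tree(obj, nodes, default):
    """Ask one question at a time; exit as soon as a question fires.

    Each node is a (question, exit_label) pair; the final question
    effectively has two exits, via the default.
    """
    for question, exit_label in nodes:
        if question(obj):
            return exit_label
    return default

# Hypothetical triage-style tree over two binary cues.
nodes = [
    (lambda o: o["cue1"], "high"),     # exit "high" at once if cue1 is present
    (lambda o: not o["cue2"], "low"),  # otherwise exit "low" if cue2 is absent
]
print(fast_and_frugal_tree({"cue1": 0, "cue2": 1}, nodes, default="high"))  # → high
```

Because every question offers an exit, classification often stops after the first cue, which is what makes these trees fast and frugal.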

Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

Ralph Hertwig: German psychologist

Ralph Hertwig is a German psychologist whose work focuses on the psychology of human judgment and decision making. Hertwig is Director of the Center for Adaptive Rationality at the Max Planck Institute for Human Development in Berlin, Germany. He grew up with his brothers Steffen Hertwig and Michael Hertwig in Talheim, Heilbronn.

References

  1. Czerlinski, Jean; Goldstein, Daniel G.; Gigerenzer, Gerd (1999). "How good are simple heuristics?". Simple Heuristics That Make Us Smart. New York: Oxford University Press. pp. 97–118.
  2. Goldstein, Daniel G.; Gigerenzer, Gerd (2002). "Models of ecological rationality: The recognition heuristic". Psychological Review. 109 (1): 75–90. doi:10.1037/0033-295x.109.1.75. hdl:11858/00-001M-0000-0025-9128-B. ISSN 1939-1471. PMID 11863042.
  3. Wübben, Markus; Wangenheim, Florian v. (2008). "Instant Customer Base Analysis: Managerial Heuristics Often "Get it Right"". Journal of Marketing. 72 (3): 82–93. doi:10.1509/jmkg.72.3.082. ISSN 0022-2429.
  4. Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd ed.). Springer Series in Statistics. New York: Springer-Verlag. ISBN 9780387848570.
  5. Gigerenzer, Gerd; Brighton, Henry (2009). "Homo Heuristicus: Why Biased Minds Make Better Inferences". Topics in Cognitive Science. 1 (1): 107–143. CiteSeerX 10.1.1.321.3027. doi:10.1111/j.1756-8765.2008.01006.x. ISSN 1756-8765. PMID 25164802.
  6. Martignon, Laura; Hoffrage, Ulrich (2002). "Fast, frugal, and fit: Simple heuristics for paired comparison". Theory and Decision. 52 (1): 29–71. doi:10.1023/A:1015516217425. ISSN 0040-5833.
  7. Hogarth, Robin M.; Karelaia, Natalia (2006). ""Take-the-Best" and Other Simple Strategies: Why and When they Work "Well" with Binary Cues". Theory and Decision. 61 (3): 205–249. doi:10.1007/s11238-006-9000-8. ISSN 1573-7187.
  8. Şimşek, Özgür (2013). "Linear decision rule as aspiration for simple decision heuristics". In Burges, C. J. C.; Bottou, L.; Welling, M.; Ghahramani, Z. (eds.). Advances in Neural Information Processing Systems 26. Curran Associates, Inc. pp. 2904–2912.