The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. This phenomenon sometimes occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. [1] [2] [3] The bias affects predictions only about one's own tasks. On the other hand, when outside observers predict task completion times, they tend to exhibit a pessimistic bias, overestimating the time needed. [4] [5] The planning fallacy involves estimates of task completion times more optimistic than those encountered in similar projects in the past.
The planning fallacy was first proposed by Daniel Kahneman and Amos Tversky in 1979. [6] [7] In 2003, Lovallo and Kahneman proposed an expanded definition as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also cost overruns and benefit shortfalls. [8]
In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, with about 30% of the students completing their thesis in the amount of time they predicted. [1]
Another study asked students to estimate when they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. [5]
A survey of Canadian taxpayers, published in 1997, found that they mailed in their tax forms about a week later than they predicted. They had no misconceptions about their past record of getting forms mailed in, but expected that they would get it done more quickly next time. [9] This illustrates a defining feature of the planning fallacy: people recognize that their past predictions have been over-optimistic, while insisting that their current predictions are realistic. [4]
Carter and colleagues conducted three studies in 2005 that provide empirical support for the claim that the planning fallacy also affects predictions about group tasks. This research emphasizes how temporal frames and thoughts of successful completion contribute to the planning fallacy. [10]
The segmentation effect occurs when the time allocated to a task as a whole is significantly smaller than the sum of the times allocated to its individual sub-tasks. In a 2008 study, Forsyth tested whether this effect could be used to reduce the planning fallacy. Across three experiments, the segmentation effect proved influential. However, segmentation demands a great deal of cognitive resources and is not very feasible to use in everyday situations. [17]
Implementation intentions are concrete plans that specify how, when, and where one will act. Various experiments have shown that implementation intentions help people become more aware of the overall task and see all possible outcomes. Initially, this actually makes predictions even more optimistic. However, forming implementation intentions is believed to "explicitly recruit willpower" by committing the person to completing the task. Participants who formed implementation intentions began work on the task sooner, experienced fewer interruptions, and their later predictions showed less optimistic bias than those of participants who had not. The reduction in optimistic bias was also found to be mediated by the reduction in interruptions. [3]
Reference class forecasting predicts the outcome of a planned action based on the actual outcomes of a reference class of similar past actions.
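A minimal sketch of the outside-view calculation this implies, with illustrative (not sourced) figures: gather the ratios of actual to estimated outcomes for a reference class of comparable past projects, then read the forecast off that empirical distribution rather than off the planner's single-project estimate.

```python
import numpy as np

# Hypothetical reference class: ratio of actual to estimated cost for
# comparable past projects. These numbers are illustrative only.
overrun_ratios = np.array([1.1, 1.4, 0.9, 2.0, 1.6, 1.3, 1.8, 1.2])

inside_view_estimate = 5.0  # planner's own cost estimate (e.g. in billions)

# Outside view: scale the inside estimate by percentiles of the
# empirical overrun distribution instead of trusting it directly.
p50 = np.percentile(overrun_ratios, 50)
p80 = np.percentile(overrun_ratios, 80)  # a more risk-averse budget level

print(f"Median (P50) forecast:      {inside_view_estimate * p50:.2f}")
print(f"Risk-averse (P80) forecast: {inside_view_estimate * p80:.2f}")
```

The percentile chosen reflects how much budget risk the planner is willing to accept; the key point is that the uplift comes from the reference class, not from the plan itself.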
The Sydney Opera House was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million. [10]
The Eurofighter Typhoon defense project took six years longer than expected, with an overrun cost of 8 billion euros. [10]
The Big Dig, which put the Boston Central Artery underground, was completed seven years later than planned, [18] at a cost of $8.08 billion against a budget of $2.8 billion (in 1988 dollars). [19]
The Denver International Airport opened sixteen months later than scheduled, with a total cost of $4.8 billion, over $2 billion more than expected. [20]
The Berlin Brandenburg Airport is another case. After 15 years of planning, construction began in 2006, with the opening planned for October 2011. There were numerous delays. It was finally opened on October 31, 2020. The original budget was €2.83 billion; current projections are close to €10.0 billion.
Olkiluoto Nuclear Power Plant Unit 3 faced severe delays and cost overruns. Construction started in 2005 and was expected to be completed by 2009, but the unit was not completed until 2023. [21] [22] The project's cost was initially estimated at around 3 billion euros but escalated to approximately 10 billion euros. [23]
California High-Speed Rail is still under construction, with tens of billions of dollars in overruns expected, and connections to major cities postponed until after completion of the rural segment.
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.
In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to rational decision-making about the future, people in everyday life often factor previous expenditures, such as money already spent repairing a car or house, into their future decisions regarding those properties.
Daniel Kahneman was an Israeli-American psychologist best-known for his work on the psychology of judgment and decision-making as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences together with Vernon L. Smith. Kahneman's published empirical findings challenge the assumption of human rationality prevailing in modern economic theory. Kahneman became known as the "grandfather of behavioral economics."
Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.
The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method, or decision. This heuristic, operating on the notion that, if something can be recalled, it must be important, or at least more important than alternative solutions not as readily recalled, is inherently biased toward recently acquired information.
The representativeness heuristic is used when making judgments about the probability of an event based on how representative it is, in character and essence, of a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant, because the person's appearance and behavior are more representative of the stereotype of a poet than of an accountant.
The clustering illusion is the tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random. The illusion is caused by a human tendency to underpredict the amount of variability likely to appear in a small sample of random or pseudorandom data.
The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.
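The probability rule that such judgments violate can be stated directly: a conjunction can never be more probable than either of its conjuncts.

```latex
% For any events A and B:
P(A \cap B) \le \min\bigl(P(A),\, P(B)\bigr)
```

Judging the joint event as more likely than one of its components therefore contradicts this inequality regardless of how the individual probabilities are assigned.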
In the psychology of affective forecasting, the impact bias, a form of which is the durability bias, is the tendency for people to overestimate the length or the intensity of future emotional states.
The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
A cost overrun, also known as a cost increase or budget overrun, is an unexpected cost incurred in excess of a budgeted amount, typically due to an underestimation of the actual cost during budgeting.
Optimism bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as delusional optimism, unrealistic optimism or comparative optimism.
Bent Flyvbjerg is a Danish economic geographer. He was the First BT Professor and Inaugural Chair of Major Programme Management at Oxford University's Saïd Business School and is the Villum Kann Rasmussen Professor and Chair of Major Program Management at the IT University of Copenhagen. He was previously Professor of Planning at Aalborg University, Denmark and Chair of Infrastructure Policy and Planning at Delft University of Technology, The Netherlands. He is a fellow of St Anne's College, Oxford.
Reference class forecasting or comparison class forecasting is a method of predicting the future by looking at similar past situations and their outcomes. The theories behind reference class forecasting were developed by Daniel Kahneman and Amos Tversky. This theoretical work helped Kahneman win the Nobel Memorial Prize in Economic Sciences.
Heuristics are mental shortcuts that humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.
Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.
Extension neglect is a type of cognitive bias which occurs when the sample size is ignored when its determination is relevant. For instance, when reading an article about a scientific study, extension neglect occurs when the reader ignores the number of people involved in the study but still makes inferences about a population based on the sample. In reality, if the sample size is too small, the results risk errors in statistical hypothesis testing. A study based on only a few people may draw invalid conclusions because a single participant with exceptionally high or low scores (an outlier) can skew the results, and there are not enough participants for this to average out. But the sample size is often not prominently displayed in science articles, and a reader affected by extension neglect may still accept the article's conclusion.
Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.
Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.
Kahneman, Daniel; Tversky, Amos (1982). "Intuitive prediction: Biases and corrective procedures". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. pp. 414–421. doi:10.1017/CBO9780511809477.031. ISBN 978-0511809477. Originally issued as Decision Research Technical Report PTR-1042-77-6.