Planning fallacy


The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. This phenomenon sometimes occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. [1] [2] [3] The bias affects predictions only about one's own tasks. On the other hand, when outside observers predict task completion times, they tend to exhibit a pessimistic bias, overestimating the time needed. [4] [5] The planning fallacy involves estimates of task completion times more optimistic than those encountered in similar projects in the past.


Daniel Kahneman, who, along with Amos Tversky, proposed the fallacy

The planning fallacy was first proposed by Daniel Kahneman and Amos Tversky in 1979. [6] [7] In 2003, Lovallo and Kahneman proposed an expanded definition as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also cost overruns and benefit shortfalls. [8]

Empirical evidence

For individual tasks

In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, with about 30% of the students completing their thesis in the amount of time they predicted. [1]

Another study asked students to estimate when they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. Even at the 99% probability level, only about 45% of the students had finished by the dates they gave. [5]

A survey of Canadian taxpayers, published in 1997, found that they mailed in their tax forms about a week later than they predicted. They had no misconceptions about their past record of getting forms mailed in, but expected that they would get it done more quickly next time. [9] This illustrates a defining feature of the planning fallacy: people recognize that their past predictions have been over-optimistic, while insisting that their current predictions are realistic. [4]

For group tasks

Sanna and colleagues conducted three studies in 2005 that provide empirical support that the planning fallacy also affects predictions concerning group tasks. This research emphasizes how temporal frames and thoughts of successful completion contribute to the planning fallacy. [10]

Additional studies

Bent Flyvbjerg and Cass Sunstein argue that Albert O. Hirschman's Hiding Hand principle is the planning fallacy writ large, and they tested the empirical validity of the principle. [11] See the Further reading section below for additional studies.

Proposed explanations

Methods for counteracting

Segmentation effect

The segmentation effect is the finding that the time allocated to a task as a whole is significantly smaller than the sum of the times allocated to its individual sub-tasks. In a 2008 study, Forsyth tested whether this effect could be used to reduce the planning fallacy. Across three experiments, breaking tasks into sub-tasks and estimating each part separately led to longer, less optimistic time allocations. However, segmentation demands a great deal of cognitive resources and is not very feasible to use in everyday situations. [18]
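A rough numerical sketch of the effect in Python (the hours are invented for illustration, not data from Forsyth's experiments): the estimate for the task judged as a whole is typically smaller than the sum of the estimates for its parts.

    # Segmentation effect, illustrated with invented numbers (not study data).
    whole_task_estimate_hours = 10.0   # time allocated when the task is judged as a whole

    subtask_estimates_hours = {        # the same task, unpacked into sub-tasks
        "gather sources": 3.0,
        "write first draft": 6.0,
        "revise": 3.0,
        "format and submit": 1.5,
    }

    segmented_total = sum(subtask_estimates_hours.values())
    print(f"Whole-task estimate:       {whole_task_estimate_hours:.1f} h")
    print(f"Sum of sub-task estimates: {segmented_total:.1f} h")  # typically larger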

Implementation intentions

Implementation intentions are concrete plans that specify how, when, and where one will act. Various experiments have shown that forming implementation intentions helps people become more aware of the overall task and consider all possible outcomes. Initially, this actually causes predictions to become even more optimistic. However, it is believed that forming implementation intentions "explicitly recruits willpower" by having the person commit to completing the task. Participants who had formed implementation intentions began work on the task sooner, experienced fewer interruptions, and their later predictions showed less optimistic bias than those of participants who had not. The reduction in optimistic bias was also found to be mediated by the reduction in interruptions. [3]

Reference class forecasting

Reference class forecasting predicts the outcome of a planned action based on actual outcomes in a reference class of similar actions to that being forecast.
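A minimal sketch of the idea in Python, assuming one already has actual-versus-estimated cost ratios for a reference class of comparable past projects (the figures are invented for illustration, not real project data):

    # Outside-view forecast built from a reference class of similar past projects.
    # Ratios are actual cost / originally estimated cost; values are invented.
    reference_class_ratios = sorted([0.9, 1.1, 1.2, 1.3, 1.4, 1.6, 1.8, 2.0])

    inside_view_estimate_m = 50.0  # our own bottom-up cost estimate, in millions

    # Use roughly the 80th percentile of past overrun ratios as a conservative uplift.
    index = int(0.8 * (len(reference_class_ratios) - 1))
    uplift = reference_class_ratios[index]

    outside_view_forecast_m = inside_view_estimate_m * uplift
    print(f"Inside view:  {inside_view_estimate_m:.1f} M")
    print(f"Outside view: {outside_view_forecast_m:.1f} M (uplift x{uplift:.2f})")

The choice of reference class and of the percentile used for the uplift is itself a judgment call rather than part of the method's definition.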

Real-world examples

Sydney Opera House, still under construction in 1966, three years after its expected completion date

The Sydney Opera House was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million. [10]

The Eurofighter Typhoon defense project took six years longer than expected, with an overrun cost of 8 billion euros. [10]

The Big Dig, which moved Boston's Central Artery underground, was completed seven years later than planned, [19] for $8.08 billion on a budget of $2.8 billion (in 1988 dollars). [20]

The Denver International Airport opened sixteen months later than scheduled, with a total cost of $4.8 billion, over $2 billion more than expected. [21]

The Berlin Brandenburg Airport is another egregious case. After 15 years of planning, construction began in 2006, with the opening planned for October 2011. Following numerous delays, it finally opened on October 31, 2020. The original budget was €2.83 billion; by the time it opened, cost estimates were close to €10.0 billion.

Olkiluoto Nuclear Power Plant Unit 3 faced severe delays and cost overruns. Construction started in 2005 and was expected to be completed by 2009, but the unit was completed only in 2023. [22] [23] The initial cost estimate was around 3 billion euros, but costs escalated to approximately 10 billion euros. [24]

California High-Speed Rail is still under construction, with tens of billions of dollars in overruns expected, and connections to major cities postponed until after completion of the rural segment.


Related Research Articles

Cognitive bias – Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to future rational decision-making, people in everyday life often take previous expenditures in situations, such as repairing a car or house, into their future decisions regarding those properties.

Daniel Kahneman – Israeli-American psychologist

Daniel Kahneman is an Israeli-American author, psychologist and economist notable for his work on hedonic psychology, psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. His empirical findings challenge the assumption of human rationality prevailing in modern economic theory.

Amos Tversky – Israeli psychologist (1937–1996)

Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.

The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method, or decision. This heuristic, operating on the notion that, if something can be recalled, it must be important, or at least more important than alternative solutions not as readily recalled, is inherently biased toward recently acquired information.

The representativeness heuristic is used when making judgments about the probability of an event based on how representative it is, in character and essence, of a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant. This is because the person's appearance and behavior are more representative of the stereotype of a poet than an accountant.

Clustering illusion – Erroneously seeing patterns in randomness

The clustering illusion is the tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random. The illusion is caused by a human tendency to underpredict the amount of variability likely to appear in a small sample of random or pseudorandom data.

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.

In the psychology of affective forecasting, the impact bias, a form of which is the durability bias, is the tendency for people to overestimate the length or the intensity of future emotional states.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

A cost overrun, also known as a cost increase or budget overrun, involves unexpected incurred costs. When these costs exceed budgeted amounts due to an underestimation of the actual cost during budgeting, they are known by these terms.

Optimism bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism.

Bent Flyvbjerg is a Danish economic geographer. He was the First BT Professor and Inaugural Chair of Major Programme Management at Oxford University's Saïd Business School and is the Villum Kann Rasmussen Professor and Chair of Major Program Management at the IT University of Copenhagen. He was previously Professor of Planning at Aalborg University, Denmark and Chair of Infrastructure Policy and Planning at Delft University of Technology, The Netherlands. He is a fellow of St Anne's College, Oxford.

Reference class forecasting or comparison class forecasting is a method of predicting the future by looking at similar past situations and their outcomes. The theories behind reference class forecasting were developed by Daniel Kahneman and Amos Tversky. The theoretical work helped Kahneman win the Nobel Prize in Economics.

Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Thinking, Fast and Slow – 2011 book by Daniel Kahneman

Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.

Extension neglect is a type of cognitive bias which occurs when the sample size is ignored when its determination is relevant. For instance, when reading an article about a scientific study, extension neglect occurs when the reader ignores the number of people involved in the study but still makes inferences about a population based on the sample. In reality, if the sample size is too small, the results might risk errors in statistical hypothesis testing. A study based on only a few people may draw invalid conclusions because only one person has exceptionally high or low scores (outlier), and there are not enough people there to correct this via averaging out. But often, the sample size is not prominently displayed in science articles, and the reader in this case might still believe the article's conclusion due to extension neglect.

Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.

The hiding hand principle is a theory that offers a framework to examine how ignorance intersects with rational choice to undertake a project; the intersection is seen to provoke creative success over the obstacles through the deduction that it is too late to abandon the project. The term was coined by economist Albert O. Hirschman.

References

  1. Buehler, Roger; Dale Griffin; Michael Ross (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology. 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366. S2CID 4222578.
  2. Kruger, Justin; Evans, Matt (15 October 2003). "If you don't want to be late, enumerate: Unpacking Reduces the Planning Fallacy". Journal of Experimental Social Psychology. 40 (5): 586–598. doi:10.1016/j.jesp.2003.11.001.
  3. Koole, Sander; Van't Spijker, Mascha (2000). "Overcoming the planning fallacy through willpower: Effects of implementation intentions on actual and predicted task-completion times" (PDF). European Journal of Social Psychology. 30 (6): 873–888. doi:10.1002/1099-0992(200011/12)30:6<873::AID-EJSP22>3.0.CO;2-U. hdl:1871/17588. Archived from the original (PDF) on 2019-11-29.
  4. Buehler, Roger; Griffin, Dale; Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions". In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment, pp. 250–270. Cambridge, UK: Cambridge University Press. doi:10.1017/CBO9780511808098.016
  5. Buehler, Roger; Dale Griffin; Michael Ross (1995). "It's about time: Optimistic predictions in work and love". European Review of Social Psychology. 6: 1–32. doi:10.1080/14792779343000112.
  6. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences. 41 (7): 1359–1371. doi:10.1016/j.paid.2006.03.029. ISSN 0191-8869.
  7. Kahneman, Daniel; Tversky, Amos (1977). "Intuitive prediction: Biases and corrective procedures" (PDF). Decision Research Technical Report PTR-1042-77-6. Archived (PDF) from the original on September 8, 2013. Reprinted in Kahneman, Daniel; Tversky, Amos (1982). "Intuitive prediction: Biases and corrective procedures". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. pp. 414–421. doi:10.1017/CBO9780511809477.031. ISBN 978-0511809477. PMID 17835457.
  8. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review. 81 (7): 56–63. PMID 12858711.
  9. Buehler, Roger; Dale Griffin; Johanna Peetz (2010). The Planning Fallacy: Cognitive, Motivational, and Social Origins (PDF). pp. 1–62. doi:10.1016/s0065-2601(10)43001-4. ISBN 9780123809469. Retrieved 2012-09-15.
  10. Sanna, Lawrence J.; Parks, Craig D.; Chang, Edward C.; Carter, Seth E. (2005). "The Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy". Group Dynamics: Theory, Research, and Practice. 9 (3): 173–188. doi:10.1037/1089-2699.9.3.173.
  11. Flyvbjerg, Bent; Sunstein, Cass R. (2015). "The Principle of the Malevolent Hiding Hand; or, the Planning Fallacy Writ Large". Rochester, NY. arXiv:1509.01526. Bibcode:2015arXiv150901526F. SSRN 2654423.
  12. Pezzo, Stephanie P.; Pezzo, Mark V.; Stone, Eric R. (2006). "The social implications of planning: How public predictions bias future plans". Journal of Experimental Social Psychology. 2006 (2): 221–227. doi:10.1016/j.jesp.2005.03.001.
  13. Roy, Michael M.; Christenfeld, Nicholas J. S.; McKenzie, Craig R. M. (2005). "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?". Psychological Bulletin. 131 (5): 738–756. CiteSeerX 10.1.1.525.3506. doi:10.1037/0033-2909.131.5.738. PMID 16187856.
  14. Wilson, Timothy D.; Wheatley, Thalia; Meyers, Jonathan M.; Gilbert, Daniel T.; Axsom, Danny (2000). "Focalism: A source of durability bias in affective forecasting". Journal of Personality and Social Psychology. 78 (5): 821–836. doi:10.1037/0022-3514.78.5.821. PMID   10821192.
  15. Jones, Larry R; Euske, Kenneth J (October 1991). "Strategic misrepresentation in budgeting". Journal of Public Administration Research and Theory. 1 (4): 437–460. Archived from the original on 2012-12-16. Retrieved 11 March 2013.
  16. Taleb, Nassim (2012). Antifragile: Things That Gain from Disorder. Random House Publishing. ISBN 978-1-4000-6782-4.
  17. For infrastructure projects to succeed, think slow and act fast
  18. Forsyth, D. K. (June 2008). "Allocating time to future tasks: The effect of task segmentation on planning fallacy bias". Memory & Cognition. 36 (4): 791–798. doi:10.3758/MC.36.4.791. PMID 18604961.
  19. "No Light at the End of his Tunnel: Boston's Central Artery/Third Harbor Tunnel Project". Project on Government Oversight. 1 February 1995. Archived from the original on 8 November 2014. Retrieved 7 November 2014.
  20. Johnson, Glen (July 13, 2006). "Governor seeks to take control of Big Dig inspections". Boston Globe. Archived from the original on March 11, 2007. Retrieved July 13, 2006.
  21. "Denver International Airport" (PDF). United States General Accounting Office. September 1995. Retrieved 7 November 2014.
  22. Lehto, Essi; Buli, Nora (2022-03-16). "Finland starts much-delayed nuclear plant, brings respite to power market". Reuters. Retrieved 2023-04-23.
  23. Lehto, Essi (2023-04-15). "After 18 years, Europe's largest nuclear reactor starts regular output". Reuters. Retrieved 2023-04-23.
  24. Buli, Nora (2021-08-23). "Finland's Olkiluoto 3 nuclear reactor faces another delay". Reuters. Retrieved 2023-04-23.

Further reading