Reference class forecasting, or comparison class forecasting, is a method of predicting the future by looking at similar past situations and their outcomes. The theories behind reference class forecasting were developed by Daniel Kahneman and Amos Tversky; this theoretical work contributed to Kahneman being awarded the 2002 Nobel Memorial Prize in Economic Sciences.
Reference class forecasting is so named because it predicts the outcome of a planned action based on the actual outcomes of a reference class of similar actions that have already been carried out.
Discussion of which reference class to use when forecasting a given situation is known as the reference class problem.
Kahneman and Tversky [1] [2] found that human judgment is generally optimistic due to overconfidence and insufficient consideration of distributional information about outcomes.
People tend to underestimate the costs, completion times, and risks of planned actions, whereas they tend to overestimate the benefits of those same actions. Such error is caused by actors taking an "inside view", where focus is on the constituents of the specific planned action instead of on the actual outcomes of similar ventures that have already been completed.
Kahneman and Tversky concluded that disregard of distributional information, i.e. risk, is perhaps the major source of error in forecasting. On that basis they recommended that forecasters "should therefore make every effort to frame the forecasting problem so as to facilitate utilizing all the distributional information that is available". [2] : 416 Using distributional information from previous ventures similar to the one being forecast is called taking an "outside view". Reference class forecasting is a method for taking an outside view on planned actions.
Reference class forecasting for a specific project involves the following three steps:

1. Identify a reference class of past, similar projects.
2. Establish a probability distribution for the selected reference class for the parameter being forecast (e.g., cost overrun).
3. Compare the specific project with the reference class distribution to establish the most likely outcome for the project.
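If the reference class data are percentage cost overruns from comparable completed projects, the core calculation reduces to a percentile lookup on the empirical distribution; the overrun figures in this sketch are illustrative, not real data:

```python
# Sketch of a reference class forecast for cost overrun.
# The overrun figures below are illustrative, not real project data.

def percentile(data, q):
    """Return the q-th percentile (0-100) by linear interpolation."""
    s = sorted(data)
    rank = (q / 100) * (len(s) - 1)
    lo, hi = int(rank), min(int(rank) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (rank - lo)

# Step 1: reference class of observed cost overruns (fractions of budget).
overruns = [0.05, 0.10, 0.12, 0.20, 0.25, 0.30, 0.44, 0.60, 0.75, 0.90]

# Step 2: the empirical distribution is the sorted sample.
# Step 3: read off the uplift needed at a chosen certainty level.
base_estimate = 100.0  # in, say, £ million
uplift_p80 = percentile(overruns, 80)  # 80% of past projects overran by less
forecast_p80 = base_estimate * (1 + uplift_p80)
print(f"P80 uplift: {uplift_p80:.0%}, P80 forecast: £{forecast_p80:.0f}m")
```

This is the outside view in miniature: the forecast comes entirely from the outcomes of completed projects, not from the internal details of the planned one.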
The reference class problem, also known as reference class tennis, is the discussion of which reference class to use when forecasting a given situation.
Suppose someone were trying to predict how long it would take to write a psychology textbook. Reference class tennis would involve debating whether we should take the average of all books (closest to an outside view), just all textbooks, or just all psychology textbooks (closest to an inside view). [3] [4]
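The stakes of the choice become concrete with numbers. Using hypothetical completion times (invented purely for illustration), each candidate reference class yields a different forecast:

```python
# Hypothetical writing times in months -- invented purely to illustrate
# how the choice of reference class moves the forecast.
classes = {
    "all books":            [6, 9, 12, 18, 24, 36, 48],
    "all textbooks":        [18, 24, 30, 36, 48],
    "psychology textbooks": [30, 36, 42, 60],
}

for name, months in classes.items():
    mean = sum(months) / len(months)
    print(f"{name}: mean {mean:.1f} months")
```

The broader the class, the more data it offers but the less similar its members are to the case at hand; reference class tennis is the argument over where to strike that balance.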
Whereas Kahneman and Tversky developed the theories of reference class forecasting, Flyvbjerg and COWI (2004) developed the method for its practical use in policy and planning, which was published as an official Guidance Document in June 2004 by the UK Department for Transport. [5]
The first instance of reference class forecasting in practice is described in Flyvbjerg (2006). [6] This forecast was part of a review of the Edinburgh Tram Line 2 business case, carried out in October 2004 by Ove Arup and Partners Scotland. At the time, the project was forecast to cost a total of £320 million, of which £64 million – or 25% – was allocated for contingency. Using the newly implemented reference class forecasting guidelines, Ove Arup and Partners Scotland calculated the 80th percentile value (i.e., 80% likelihood of staying within budget) for total capital costs to be £400 million, which equaled 57% contingency. Similarly, they calculated the 50th percentile value (i.e., 50% likelihood of staying within budget) to be £357 million, which equaled 40% contingency.

The review further acknowledged that the reference class forecasts were likely to be too low, because the guidelines recommended that the uplifts be applied at the time of the decision to build, which the project had not yet reached, and the risks would therefore be substantially higher at this early business case stage. On this basis, the review concluded that the forecast costs could have been underestimated. The Edinburgh Tram Line 2 opened three years late in May 2014 with a final outturn cost of £776 million, equal to £628 million in 2004 prices. [7]
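Assuming the quoted contingency percentages are uplifts on the base estimate, i.e., the £320 million total minus the original £64 million contingency (an assumption about how the review did its arithmetic), the figures can be reproduced approximately:

```python
# Reproducing the Edinburgh Tram Line 2 contingency arithmetic.
# Assumption: the quoted percentages are uplifts on the base estimate
# (total forecast minus original contingency). Figures in £ million.
total_forecast = 320.0
original_contingency = 64.0
base = total_forecast - original_contingency  # 256.0

p80_forecast = 400.0  # 80% likelihood of staying within budget
p50_forecast = 357.0  # 50% likelihood

uplift_p80 = (p80_forecast - base) / base  # ~56%, quoted as 57%
uplift_p50 = (p50_forecast - base) / base  # ~39%, quoted as 40%
print(f"P80 uplift ~{uplift_p80:.0%}, P50 uplift ~{uplift_p50:.0%}")
```

The small discrepancies against the quoted 57% and 40% are consistent with rounding in the published figures.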
Since the Edinburgh forecast, reference class forecasting has been applied to numerous other projects in the UK, including the £15 billion (US$29 billion) Crossrail project in London. Since 2004, the Netherlands, Denmark, and Switzerland have also implemented various types of reference class forecasting.
Before this, in 2001 (updated in 2011), AACE International (the Association for the Advancement of Cost Engineering) included Estimate Validation as a distinct step in the recommended practice of Cost Estimating (Estimate Validation is equivalent to Reference class forecasting in that it calls for separate empirical-based evaluations to benchmark the base estimate):
The estimate should be benchmarked or validated against or compared to historical experience and/or past estimates of the enterprise and of competitive enterprises to check its appropriateness, competitiveness, and to identify improvement opportunities...Validation examines the estimate from a different perspective and using different metrics than are used in estimate preparation. [8]
In the process industries (e.g., oil and gas, chemicals, mining, energy, etc. which tend to dominate AACE's membership), benchmarking (i.e., "outside view") of project cost estimates against the historical costs of completed projects of similar types, including probabilistic information, has a long history. [9] A method combining reference class forecasting and competitive crowdsourcing, Human Forest, has also been used in the life sciences, to estimate the likelihood that vaccines and treatments will successfully progress through clinical trial phases. [10] [11]
Daniel Kahneman was an Israeli-American psychologist best known for his work on the psychology of judgment and decision-making as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences together with Vernon L. Smith. Kahneman's published empirical findings challenge the assumption of human rationality prevailing in modern economic theory. Kahneman became known as the "grandfather of behavioral economics."
Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.
Prospect theory is a theory of behavioral economics, judgment and decision making that was developed by Daniel Kahneman and Amos Tversky in 1979. The theory was cited in the decision to award Kahneman the 2002 Nobel Memorial Prize in Economics.
The availability heuristic, also known as availability bias, is a mental shortcut that relies on the immediate examples that come to mind when evaluating a specific topic, concept, method, or decision. It operates on the notion that if something can be recalled, it must be important, or at least more important than alternatives that are not as readily recalled; as a result, it is inherently biased toward recently acquired information.
The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. This phenomenon sometimes occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. The bias affects predictions only about one's own tasks. On the other hand, when outside observers predict task completion times, they tend to exhibit a pessimistic bias, overestimating the time needed. The planning fallacy involves estimates of task completion times more optimistic than those encountered in similar projects in the past.
The Allais paradox is a choice problem designed by Maurice Allais to show an inconsistency between actual observed choices and the predictions of expected utility theory. The paradox demonstrates that commonly observed choices violate the independence axiom of expected utility theory, which requires that an individual's preference between two lotteries should not change when both are altered in equal proportions.
The disposition effect is an anomaly discovered in behavioral finance. It relates to the tendency of investors to sell assets that have increased in value, while keeping assets that have dropped in value.
In behavioral economics, cumulative prospect theory (CPT) is a model of descriptive decision making under risk and uncertainty, introduced by Amos Tversky and Daniel Kahneman in 1992. It is a further development and variant of prospect theory. The difference from the original version of prospect theory is that weighting is applied to the cumulative probability distribution function, as in rank-dependent expected utility theory, rather than to the probabilities of individual outcomes. In 2002, Daniel Kahneman received the Nobel Memorial Prize in Economic Sciences for his contributions to behavioral economics, in particular the development of CPT.
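The distinction between weighting individual probabilities and weighting the cumulative distribution can be sketched as follows, using the probability-weighting function and the γ = 0.61 gains parameter estimated by Tversky and Kahneman (1992); the outcomes and probabilities are illustrative:

```python
# Decision weights for gains under cumulative prospect theory.
# w is the Tversky-Kahneman (1992) weighting function; gamma = 0.61
# is their estimated parameter for gains. Outcomes are illustrative.

def w(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Gains ranked from smallest to largest, with their probabilities.
outcomes = [10.0, 50.0, 100.0]
probs = [0.5, 0.3, 0.2]

# CPT: weight the *decumulative* probabilities (chance of getting
# at least this outcome), then take differences.
weights = []
for i in range(len(outcomes)):
    at_least_i = sum(probs[i:])
    above_i = sum(probs[i + 1:])
    weights.append(w(at_least_i) - w(above_i))

# Original prospect theory would instead weight each probability
# directly: [w(p) for p in probs]. The two disagree in general.
print([round(x, 3) for x in weights])
```

Because the weights are differences of a telescoping sequence, they sum to exactly w(1) − w(0) = 1, whereas weighting individual probabilities generally does not preserve this property.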
A cost estimate is the approximation of the cost of a program, project, or operation. The cost estimate is the product of the cost estimating process. The cost estimate has a single total value and may have identifiable component values.
A cost overrun, also known as a cost increase or budget overrun, involves unexpected incurred costs. When these costs exceed budgeted amounts due to an underestimation of the actual cost during budgeting, they are known by these terms.
Optimism bias or optimistic bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism. It is common and transcends gender, ethnicity, nationality, and age. Autistic people are less susceptible to this kind of bias. It has also been reported in other animals, such as rats and birds.
Cost engineering is "the engineering practice devoted to the management of project cost, involving such activities as estimating, cost control, cost forecasting, investment appraisal and risk analysis". "Cost Engineers budget, plan and monitor investment projects. They seek the optimum balance between cost, quality and time requirements."
When estimating the cost for a project, product, or other item or investment, there is always uncertainty as to the precise content of all items in the estimate, how work will be performed, what work conditions will be like when the project is executed, and so on. These uncertainties are risks to the project. Some refer to these risks as "known unknowns", because the estimator is aware of them and, based on past experience, can even estimate their probable costs. The estimated costs of the known unknowns are referred to by cost estimators as cost contingency.
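One simple and common (though not the only) way to turn known unknowns into a contingency figure is an expected-value sum over identified risks; the risk register below is illustrative, not real data:

```python
# Expected-value contingency from a risk register.
# Each entry: (description, probability of occurring, cost impact in £m).
# The register below is illustrative, not real project data.
risks = [
    ("ground conditions worse than surveyed", 0.30, 2.0),
    ("key supplier delay",                    0.20, 1.5),
    ("design rework after review",            0.50, 0.8),
]

contingency = sum(p * impact for _, p, impact in risks)
print(f"Expected-value contingency: £{contingency:.2f}m")
```

Note that this is an inside-view calculation built from the project's own risk register; reference class forecasting instead sets the contingency from the outcome distribution of similar completed projects.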
When the actual benefits of a venture are less than the projected or estimated benefits, the result is known as a benefit shortfall.
A heuristic is a mental shortcut that humans use to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.
Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.
Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.
Estimation is the process of finding an estimate or approximation, which is a value that is usable for some purpose even if input data may be incomplete, uncertain, or unstable. The value is nonetheless usable because it is derived from the best information available. Typically, estimation involves "using the value of a statistic derived from a sample to estimate the value of a corresponding population parameter". The sample provides information that can be projected, through various formal or informal processes, to determine a range most likely to describe the missing information. An estimate that turns out to be incorrect will be an overestimate if the estimate exceeds the actual result and an underestimate if the estimate falls short of the actual result.
Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Biased judgment and decision making systematically deviates from the prescriptions of objective standards such as facts, logic, and rational behavior or prescriptive norms. It exists in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices, and some parents continue to believe that vaccinations cause autism, even though this claimed link rests on falsified evidence. At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower childhood delinquency rates, and superior planning and problem solving abilities.
Lavagnon Ika is a Benin-born, Canadian project management scientist, academic, thought leader, and author. He is professor of Project Management, the founding director of the Major Projects Observatory, as well as the program director of the MSc in Management at the Telfer School of Management at the University of Ottawa, and an Extraordinary Professor at the University of Pretoria.
Kahneman, Daniel; Tversky, Amos (1982). "Intuitive prediction: Biases and corrective procedures". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. pp. 414–421. doi:10.1017/CBO9780511809477.031. ISBN 9780511809477. Originally issued as Decision Research Technical Report PTR-1042-77-6.