Cone of uncertainty

In project management, the cone of uncertainty describes how the uncertainty of estimates evolves during a project. [1] At the beginning of a project, comparatively little is known about the product or work results, and so estimates are subject to large uncertainty. As more research and development is done, more is learned about the project, and the uncertainty tends to decrease, reaching 0% when all residual risk has been terminated or transferred. This usually happens by the end of the project, e.g. by transferring the responsibilities to a separate maintenance group.

The term cone of uncertainty is used in software development, where the technical and business environments change very rapidly. However, the concept, under different names, is a well-established basic principle of cost engineering. Most[citation needed] environments change so slowly that they can be considered static for the duration of a typical project, and traditional project management methods therefore focus on achieving a full understanding of the environment through careful analysis and planning. Well before any significant investments are made, the uncertainty is reduced to a level at which the risk can be carried comfortably. In this kind of environment the uncertainty level decreases rapidly in the beginning, so the cone shape is less obvious. The software business, however, is very volatile, and there is external pressure to decrease the uncertainty level over time: the project must actively and continuously work to reduce it.

The cone of uncertainty is narrowed both by research and by decisions that remove the sources of variability from the project. These decisions are about scope, what is included and not included in the project. If these decisions change later in the project then the cone will widen.

Original research for engineering and construction in the chemical industry demonstrated that actual final costs often exceeded the earliest "base" estimate by as much as 100% (or underran it by as much as 50% [2] ). Research in the software industry on the cone of uncertainty found that at the beginning of the project life cycle (i.e. before requirements are gathered) estimates generally carry an uncertainty of a factor of 4 on both the high and the low side. [3] This means that the actual effort or scope can be 4 times or 1/4 of the first estimate. This uncertainty tends to decrease over the course of a project, although the decrease is not guaranteed. [4]

Applications

One way to account for the cone of uncertainty in a project estimate is to first determine a 'most likely' single-point estimate and then calculate the high-low range by applying predefined multipliers that depend on the level of uncertainty at that time. This can be done with spreadsheet formulas, or with a project management tool that lets the task owner enter a low/high ranged estimate and then creates a schedule incorporating that level of uncertainty.
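The multiplier approach described above can be sketched in a few lines. The phase names and multiplier values below are illustrative assumptions, loosely following the factor-of-4 initial range cited earlier and narrowing toward 1 as the project progresses; they are not a fixed standard.

```python
# Sketch of range estimation using cone-of-uncertainty multipliers.
# Phase names and multiplier values are illustrative assumptions,
# not an established classification.
CONE_MULTIPLIERS = {
    "initial concept":       (0.25, 4.0),
    "approved definition":   (0.50, 2.0),
    "requirements complete": (0.67, 1.5),
    "design complete":       (0.80, 1.25),
    "detailed design done":  (0.90, 1.10),
}

def estimate_range(most_likely, phase):
    """Return (low, high) bounds for a 'most likely' single-point estimate."""
    low_mult, high_mult = CONE_MULTIPLIERS[phase]
    return most_likely * low_mult, most_likely * high_mult

# A 100 person-day estimate made at the earliest phase spans 25-400 person-days.
low, high = estimate_range(100, "initial concept")
print(low, high)  # 25.0 400.0
```

The same table could be re-applied at each milestone so that the schedule's range narrows as the project moves through its phases.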

A projected three- and five-day path of Hurricane Irene, here downgraded to a tropical storm

The cone of uncertainty is also used extensively as a graphic in hurricane forecasting, where its most iconic usage is more formally known as the NHC Track Forecast Cone, [5] and more colloquially known as the Error Cone, Cone of Probability, or the Cone of Death. (Note that the usage in hurricane forecasting is essentially the opposite of the usage in software development: in software development, the uncertainty surrounds the current state of the project and decreases over time, whereas in hurricane forecasting the current location of the storm is certain and the future path of the storm becomes increasingly uncertain.) [6] Over the past decade, storms have traveled within their projected areas two-thirds of the time, [7] and the cones themselves have shrunk due to improvements in methodology. The NHC first began in-house five-day projections in 2001, and began issuing them to the public in 2003. It is currently working in-house on seven-day forecasts, but the resultant cone of uncertainty is so large that the possible benefits for disaster management are problematic. [8]

History

The original conceptual basis of the cone of uncertainty was developed for engineering and construction in the chemical industry by the founders of the American Association of Cost Engineers (now AACE International). They published a proposed standard estimate type classification system with uncertainty ranges in 1958 [9] and presented "cone" illustrations in the industry literature at that time. [2] In the software field, the concept was picked up by Barry Boehm, [10] who referred to it as the "Funnel Curve". [11] Boehm's initial quantification of the effects of the Funnel Curve was subjective. [10] Later work by Boehm and his colleagues at USC applied data from a set of software projects from the U.S. Air Force and other sources to validate the model. The basic model was further validated based on work at NASA's Software Engineering Lab. [12] [13]

The first time the name "cone of uncertainty" was used to describe this concept was in Software Project Survival Guide. [14]


References

Footnotes

  1. "The cone of uncertainty". Construx.
  2. Bauman, H. Carl (April 1958). "Accuracy Considerations for Capital Cost Estimation". Ind. Eng. Chem. 50 (4): 55A–58A. doi:10.1021/i650580a748.
  3. Boehm 1981.
  4. McConnell, S. (2006). Software Estimation: Demystifying the Black Art. Microsoft Press. p. 38.
  5. "Definition of the NHC Track Forecast Cone". National Hurricane Center, National Oceanic and Atmospheric Administration.
  6. Hennen, Dave (24 August 2011). "How forecasters develop hurricanes' 'cone of uncertainty'". CNN. Retrieved 8 March 2020.
  7. "The 'Cone of Uncertainty' and Hurricane Forecasting: CRED researchers analyze an iconic climate forecasting visual aid" (PDF). Center for Research on Environmental Decisions (CRED). 1 June 2007.
  8. Kleinberg, Eliot (22 April 2011). "Smaller 'cone of probability' cuts down on hurricane fear". The Palm Beach Post.
  9. Gorey, J.M. (1958). "Estimate Types". AACE Bulletin, November 1958.
  10. Boehm 1981, p. 311.
  11. Stutzke, D. (2005). Estimating Software-Intensive Systems. Pearson. p. 10.
  12. NASA (1990). Manager's Handbook for Software Development, Revision 1. Document number SEL-84-101. Greenbelt, Maryland: Goddard Space Flight Center. p. 3-2.
  13. Boehm, Barry W.; et al. (2000). Software Cost Estimation with COCOMO II. Englewood Cliffs, NJ: Prentice Hall. ISBN 9780130266927.
  14. McConnell, S. (1997). Software Project Survival Guide. Microsoft Press.