Decision analysis (DA) is the discipline comprising the philosophy, methodology, and professional practice necessary to address important decisions in a formal manner. Decision analysis includes many procedures, methods, and tools for identifying, clearly representing, and formally assessing important aspects of a decision; for prescribing a recommended course of action by applying the maximum expected-utility axiom to a well-formed representation of the decision; and for translating the formal representation of a decision and its corresponding recommendation into insight for the decision maker, and other corporate and non-corporate stakeholders.
In 1931, mathematical philosopher Frank Ramsey pioneered the idea of subjective probability as a representation of an individual’s beliefs or uncertainties. Then, in the 1940s, mathematician John von Neumann and economist Oskar Morgenstern developed an axiomatic basis for utility theory as a way of expressing an individual’s preferences over uncertain outcomes. (This is in contrast to social-choice theory, which addresses the problem of deriving group preferences from individual preferences.) Statistician Leonard Jimmie Savage then developed an alternative axiomatic framework for decision analysis in the early 1950s. The resulting expected-utility theory provides a complete axiomatic basis for decision making under uncertainty.
Once these basic theoretical developments had been established, the methods of decision analysis were then further codified and popularized, becoming widely taught (e.g., in business schools and departments of industrial engineering). A brief and highly accessible introductory text was published in 1968 by decision theorist Howard Raiffa of the Harvard Business School. [1] Subsequently, in 1976, Ralph Keeney and Howard Raiffa extended the basics of utility theory to provide a comprehensive methodology for handling decisions involving trade-offs between multiple objectives. [2] Engineering professor Ron Howard of Stanford University and decision analyst Jim Matheson then published, in 1977, a set of readings on decision analysis; [3] this was expanded into a two-volume set in 1984. [4] Subsequent textbooks and additional developments are documented below under Further reading.
Although decision analysis is inherently interdisciplinary (involving contributions from mathematicians, philosophers, economists, statisticians, and cognitive psychologists), it has historically been considered a branch of operations research. In 1980, the Decision Analysis Society was formed as a special interest group within the Operations Research Society of America (ORSA), which later merged with The Institute of Management Sciences (TIMS) to become the Institute for Operations Research and the Management Sciences (INFORMS). Since 2004, INFORMS has published a dedicated journal for these topics, Decision Analysis.
Following along with these academic developments, decision analysis has also evolved into a mature professional discipline. [5] The method has been used to support business and public-policy decision-making since the late 1950s; applications from 1990-2001 were reviewed in the inaugural issue of Decision Analysis. [6] Decision analysis has been especially widely adopted in the pharmaceutical industry and the oil and gas industry, since both industries regularly need to make large high-risk decisions (e.g., about investing in development of a new drug or making a major acquisition). [7]
Framing is the front end of decision analysis; it focuses on developing an opportunity statement (what and why), boundary conditions, success measures, a decision hierarchy, a strategy table, and action items. It is sometimes believed that the application of decision analysis always requires the use of quantitative methods. In reality, however, many decisions can be made using qualitative tools that are part of the decision-analysis toolbox, such as value-focused thinking, [8] without the need for quantitative methods.
The framing process may lead to the development of an influence diagram or decision tree. These are commonly used graphical representations of decision-analysis problems. These graphical tools are used to represent the alternatives available to the decision maker, the uncertainties they involve, and how well the decision maker's objectives would be achieved by various final outcomes. They can also form the basis of a quantitative model when needed. For example, quantitative methods of conducting Bayesian inference and identifying optimal decisions using influence diagrams were developed in the 1980s, [9] [10] and are now incorporated in software.
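A decision tree is evaluated by "rolling back" from the leaves: each chance node is replaced by its probability-weighted average, and each decision node by its best branch. The following is a minimal illustrative sketch; the alternatives, probabilities, and payoffs are invented for the example.

```python
# Hypothetical two-alternative decision: a risky venture versus a certain
# payoff. Rolling back the tree by expected value identifies the best branch.

def expected_value(outcomes):
    """Expected value of a chance node given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

risky = expected_value([(0.3, 500_000), (0.7, -100_000)])  # chance node
safe = 50_000                                              # certain payoff

# The decision node keeps the branch with the higher rolled-back value.
best = max(("risky", risky), ("safe", safe), key=lambda t: t[1])
print(best)  # ('risky', 80000.0)
```

An influence diagram compactly encodes the same structure as nodes and arcs; software evaluates it by an equivalent expectation-and-maximization computation.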
In a quantitative decision-analysis model, uncertainties are represented through probabilities (specifically, subjective probabilities). The decision maker's attitude to risk is represented by utility functions, and attitudes to trade-offs between conflicting objectives can be expressed using multi-attribute value functions or, when risk is involved, multi-attribute utility functions. (In some cases, utility functions can be replaced by the probability of achieving an uncertain aspiration level or "target".) [11] [12] Based on the axioms of decision analysis, the best decision is the one whose consequences have the maximum expected utility (or that maximizes the probability of achieving the uncertain aspiration level).
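To show how a utility function changes the recommendation, the sketch below applies an exponential (constant-risk-aversion) utility to the same kind of lottery; all payoffs, probabilities, and the risk tolerance are illustrative assumptions. A sufficiently risk-averse decision maker prefers a sure payoff even when a risky alternative has the higher expected value.

```python
import math

def utility(x, risk_tolerance=100_000.0):
    """Exponential utility: concave, so large losses weigh heavily."""
    return 1.0 - math.exp(-x / risk_tolerance)

def expected_utility(lottery, risk_tolerance=100_000.0):
    return sum(p * utility(x, risk_tolerance) for p, x in lottery)

risky = [(0.3, 500_000), (0.7, -100_000)]  # higher expected value (80,000)
safe = [(1.0, 50_000)]                     # certain payoff

eu_risky = expected_utility(risky)
eu_safe = expected_utility(safe)

# With this degree of risk aversion, the sure 50,000 is preferred even
# though the risky lottery has the higher expected monetary value.
print(eu_safe > eu_risky)  # True
```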
It is sometimes assumed that quantitative decision analysis can be applied only to factors that lend themselves easily to measurement (e.g., in natural units such as dollars). However, quantitative decision analysis and related methods, such as applied information economics, can also be applied even to seemingly intangible factors.
Prescriptive decision-making research focuses on how to make "optimal" decisions (based on the axioms of rationality), while descriptive decision-making research aims to explain how people actually make decisions (regardless of whether their decisions are "good" or optimal). Unsurprisingly, therefore, there are numerous situations in which decisions made by individuals depart markedly from the decisions that would be recommended by decision analysis.
Some have criticized formal methods of decision analysis for allowing decision makers to avoid taking responsibility for their own decisions, and instead recommend reliance on intuition or "gut feelings". [13] Moreover, for decisions that must be made under significant time pressure, it is not surprising that formal methods of decision analysis are of little use, with intuition and expertise becoming more important. [14] However, when time permits, studies have demonstrated that quantitative algorithms for decision making can yield results that are superior to "unaided intuition". [15] In addition, despite the known biases in the types of human judgments required for decision analysis, research has shown at least a modest benefit of training and feedback in reducing bias. [16]
Critics cite the phenomenon of paralysis by analysis as one possible consequence of over-reliance on decision analysis in organizations (the expense of decision analysis is in itself a factor in the analysis). However, strategies are available to reduce such risk. [17]
There is currently a great deal of interest in quantitative methods for decision making. However, many such methods depart from the axioms of decision analysis, and can therefore generate misleading recommendations under some circumstances, so are not truly prescriptive methods. Some of the most popular of such non-decision-analytic methods include fuzzy-set theory for the representation of uncertainties, and the analytic-hierarchy process for the representation of preferences or value judgments. While there may occasionally be justification for such methods in applications (e.g., based on ease of use), decision analysts would argue for multi-attribute utility theory as the gold standard to which other methods should be compared, based on its rigorous axiomatic basis.
Although decision analysis has been frequently used in support of government decision making, it is important to note that the basic theory applies only to individual decision makers. There is unfortunately no axiomatic prescriptive theory comparable to decision analysis that is specifically designed for group or public-policy decisions. For more on this topic, see group decision-making for discussions of the behavioral issues involved in group decisions, and social choice theory for theoretical considerations that can affect group decisions.
Decision-analytic methods have been used in a wide variety of fields, including business (planning, marketing, negotiation), management, environmental remediation, health care, research, energy, exploration, litigation and dispute resolution, etc. An important early application was a study of the pros and cons of hurricane seeding, undertaken by the Stanford Research Institute in the early 1970s for the Environmental Science Services Administration (a predecessor of the National Oceanic and Atmospheric Administration). [18]
Decision analysis is today used by major corporations to make multibillion-dollar capital investments. For example, in 2010, Chevron won the Decision Analysis Society Practice Award for its use of decision analysis in all major decisions. [19] In a video detailing Chevron's use of decision analysis, Chevron Vice Chairman George Kirkland notes that "decision analysis is a part of how Chevron does business for a simple, but powerful, reason: it works." [20] It can also be used to make complex personal decisions, such as planning for retirement, deciding when to have a child, [21] planning a major vacation, or choosing among several possible medical treatments.
Decision-making software packages are available for implementing decision analysis. Some particularly notable packages include Analytica for influence diagrams, and DecideIT and Logical Decisions for multi-attribute decision making.
Cost–benefit analysis (CBA), sometimes also called benefit–cost analysis, is a systematic approach to estimating the strengths and weaknesses of alternatives. It is used to determine which options provide the best approach to achieving benefits while minimizing costs in, for example, transactions, activities, and functional business requirements. A CBA may be used to compare completed or potential courses of action, and to estimate or evaluate the value against the cost of a decision, project, or policy. It is commonly used to evaluate business or policy decisions, commercial transactions, and project investments. For example, the U.S. Securities and Exchange Commission must conduct cost-benefit analyses before instituting regulations or deregulations.
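The core arithmetic of a CBA is discounting each year's benefits and costs to present value and comparing the totals. The following toy comparison uses an assumed 5% discount rate and invented cash flows.

```python
# Toy cost-benefit comparison: an upfront cost of 100 with small running
# costs, against yearly benefits of 45, discounted at an assumed 5% rate.

rate = 0.05
benefits = [0, 45, 45, 45]   # cash flow per year, year 0 first
costs = [100, 5, 5, 5]

def present_value(flows, rate):
    """Sum of flows discounted back to year 0."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

net = present_value(benefits, rate) - present_value(costs, rate)
print(net > 0)  # discounted benefits exceed discounted costs
```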
In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite, in which case it is to be maximized. In hierarchical models, the loss function may include terms from several levels of the hierarchy.
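As a small worked example, consider squared-error loss for choosing a single point estimate of a data set; it is a standard fact that the mean minimizes this loss. The data values here are illustrative.

```python
# Squared-error loss for a point estimate c of data ys; the mean of the
# data minimizes this loss among all candidate estimates.

def squared_loss(c, ys):
    return sum((y - c) ** 2 for y in ys)

ys = [1.0, 2.0, 4.0, 9.0]
mean = sum(ys) / len(ys)  # 4.0

candidates = [mean - 1, mean, mean + 1]
best = min(candidates, key=lambda c: squared_loss(c, ys))
print(best)  # 4.0 -- the mean wins
```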
Decision theory is a branch of applied probability theory and analytic philosophy concerned with the theory of making decisions based on assigning probabilities to various factors and assigning numerical consequences to the outcomes.
The expected utility hypothesis is a foundational assumption in mathematical economics concerning decision making under uncertainty. It postulates that rational agents maximize utility, meaning the subjective desirability of their actions. Rational choice theory, a cornerstone of microeconomics, builds on this postulate to model aggregate social behaviour.
Multiple-criteria decision-making (MCDM) or multiple-criteria decision analysis (MCDA) is a sub-discipline of operations research that explicitly evaluates multiple conflicting criteria in decision making. It is also known as multiple attribute utility theory, multiple attribute value theory, multiple attribute preference theory, and multi-objective decision analysis.
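A common MCDA formulation is the additive multi-attribute value model, in which each alternative's score is a weighted sum of normalized attribute values. The vendors, attributes, scores, and weights below are invented for the sketch.

```python
# Minimal additive multi-attribute value model: attribute scores are
# normalized to [0, 1]; weights reflect the decision maker's trade-offs.

weights = {"cost": 0.5, "quality": 0.3, "delivery": 0.2}

alternatives = {
    "vendor_a": {"cost": 0.9, "quality": 0.4, "delivery": 0.7},
    "vendor_b": {"cost": 0.5, "quality": 0.9, "delivery": 0.6},
}

def value(scores):
    """Weighted-sum value of one alternative's attribute scores."""
    return sum(weights[k] * scores[k] for k in weights)

best = max(alternatives, key=lambda a: value(alternatives[a]))
print(best, round(value(alternatives[best]), 2))  # vendor_a 0.71
```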
An influence diagram (ID) is a compact graphical and mathematical representation of a decision situation. It is a generalization of a Bayesian network, in which not only probabilistic inference problems but also decision making problems can be modeled and solved.
Ronald Arthur Howard is an emeritus professor in the Department of Engineering-Economic Systems in the School of Engineering at Stanford University.
Howard Raiffa was an American academic who was the Frank P. Ramsey Professor (Emeritus) of Managerial Economics, a joint chair held by the Business School and Harvard Kennedy School at Harvard University. He was an influential Bayesian decision theorist and pioneer in the field of decision analysis, with works in statistical decision theory, game theory, behavioral decision theory, risk analysis, and negotiation analysis. He helped found and was the first director of the International Institute for Applied Systems Analysis.
Event chain methodology is a network analysis technique focused on identifying and managing events, and the relationships between them, that affect project schedules. It is an uncertainty-modeling scheduling technique and an extension of quantitative project risk analysis with Monte Carlo simulations, building on the critical path method and critical chain project management. Event chain methodology tries to mitigate the effect of motivational and cognitive biases in estimating and scheduling. It aims to improve the accuracy of risk assessment and to generate more realistic risk-adjusted project schedules.
Problematic Integration Theory (PI) is a theory of communication that addresses the processes and dynamics of how people receive, evaluate, and respond to information and experiences. The premises of PI are based on the view that message processing, specifically the development of probabilistic and evaluative orientations, is a social and cultural construction. In situations where there is agreement between probabilistic orientation and evaluative orientation, integration is in harmony, i.e., not problematic. However, when there is disagreement between these orientations about an object, integration becomes problematic. This disharmony leads to conflict and discomfort, which can manifest itself in cognitive, communicative, affective, and/or motivational forms.
In decision theory, the evidential reasoning approach (ER) is a generic evidence-based multi-criteria decision analysis (MCDA) approach for dealing with problems having both quantitative and qualitative criteria under various uncertainties including ignorance and randomness. It has been used to support various decision analysis, assessment and evaluation activities such as environmental impact assessment and organizational self-assessment based on a range of quality models.
Value-driven design (VDD) is a systems engineering strategy based on microeconomics which enables multidisciplinary design optimization. Value-driven design is being developed by the American Institute of Aeronautics and Astronautics, through a program committee of government, industry and academic representatives. In parallel, the U.S. Defense Advanced Research Projects Agency has promulgated an identical strategy, calling it value-centric design, on the F6 Program. At this point, the terms value-driven design and value-centric design are interchangeable. The essence of these strategies is that design choices are made to maximize system value rather than to meet performance requirements.
Decision-making software is software for computer applications that help individuals and organisations make choices and take decisions, typically by ranking, prioritizing or choosing from a number of options.
In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. The international standard definition of risk for common understanding in different applications is "effect of uncertainty on objectives".
Portfolio optimization is the process of selecting an optimal portfolio, out of a set of considered portfolios, according to some objective. The objective typically maximizes factors such as expected return, and minimizes costs like financial risk, resulting in a multi-objective optimization problem. Factors being considered may range from tangible to intangible.
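A textbook instance of this multi-objective trade-off is the mean-variance objective, here reduced to two uncorrelated assets and solved by a coarse grid search over the allocation weight. The expected returns, variances, and risk-aversion coefficient are illustrative assumptions.

```python
# Mean-variance sketch for two uncorrelated assets: maximize expected
# return minus a risk (variance) penalty over the weight w in asset 0.

mu = (0.08, 0.03)    # expected returns of the two assets
var = (0.04, 0.01)   # return variances (zero correlation assumed)
risk_aversion = 3.0  # trade-off coefficient between return and risk

def objective(w):
    exp_ret = w * mu[0] + (1 - w) * mu[1]
    variance = w**2 * var[0] + (1 - w)**2 * var[1]
    return exp_ret - 0.5 * risk_aversion * variance

# Grid search over allocations from 0% to 100% in 1% steps.
weights = [i / 100 for i in range(101)]
best_w = max(weights, key=objective)
print(best_w)
```

The grid answer (0.53) matches the analytic optimum of this quadratic objective, w = 0.08/0.15 ≈ 0.533, to the grid's resolution.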
In decision theory and quantitative policy analysis, the expected value of including uncertainty (EVIU) is the expected difference in the value of a decision based on a probabilistic analysis versus a decision based on an analysis that ignores uncertainty.
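A small stocking example makes EVIU concrete: a decision based on the mean demand is compared with the decision from the full probabilistic analysis. The prices, costs, and demand distribution below are invented for the sketch.

```python
# EVIU sketch: demand is 50 units with probability 0.8, or 150 units with
# probability 0.2. We choose a stocking quantity q and earn
# min(q, demand) * price - q * cost.

price, cost = 2.0, 1.0
demand = [(0.8, 50), (0.2, 150)]

def payoff(q, d):
    return min(q, d) * price - q * cost

def expected_payoff(q):
    return sum(p * payoff(q, d) for p, d in demand)

# Ignoring uncertainty: stock the mean demand, 0.8*50 + 0.2*150 = 70.
q_ignore = 70

# Probabilistic analysis: search candidate quantities for the best one.
q_prob = max(range(0, 201), key=expected_payoff)

eviu = expected_payoff(q_prob) - expected_payoff(q_ignore)
print(q_prob, round(eviu, 6))  # stocking 50 beats stocking the mean
```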
Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
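In the degenerate case where each input is known only to lie in an interval (a p-box with no distributional information), the sure bounds PBA computes reduce to plain interval arithmetic, as in this much-simplified sketch with invented inputs.

```python
# Interval propagation: given only sure bounds on x and y, compute sure
# bounds on their sum and product (a degenerate case of PBA's p-boxes).

def add(a, b):
    """Bounds on a + b: endpoints add."""
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    """Bounds on a * b: extremes occur at endpoint combinations."""
    products = [a[i] * b[j] for i in (0, 1) for j in (0, 1)]
    return (min(products), max(products))

x = (2.0, 3.0)   # x is known only to lie in [2, 3]
y = (-1.0, 4.0)  # y is known only to lie in [-1, 4]

print(add(x, y))  # (1.0, 7.0)
print(mul(x, y))  # (-3.0, 12.0)
```

Full PBA generalizes this by propagating bounding cumulative distribution functions rather than bare intervals.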
Decision quality (DQ) is the quality of a decision at the moment the decision is made, regardless of its outcome. Decision quality concepts permit the assurance of both effectiveness and efficiency in analyzing decision problems. In that sense, decision quality can be seen as an extension to decision analysis. Decision quality also describes the process that leads to a high-quality decision. Properly implemented, the DQ process enables capturing maximum value in uncertain and complex scenarios.
The value of structural health information is the expected gain in utility for a built environment system from information provided by structural health monitoring (SHM). The quantification of the value of structural health information is based on decision analysis adapted to built environment engineering. The value of structural health information can be significant for the risk and integrity management of built environment systems.