Info-gap decision theory seeks to optimize robustness to failure under severe uncertainty, [1] [2] in particular applying sensitivity analysis of the stability radius type [3] to perturbations in the value of a given estimate of the parameter of interest. It has some connections with Wald's maximin model; some authors distinguish them, others consider them instances of the same principle.
It was developed by Yakov Ben-Haim, [4] has found many applications, and has been described as a theory for decision-making under "severe uncertainty". It has been criticized as unsuited for this purpose, and alternatives have been proposed, including such classical approaches as robust optimization.
Info-gap theory has generated a substantial literature, and has been studied or applied in a range of fields including engineering, [5] [6] [7] [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] biological conservation, [19] [20] [21] [22] [23] [24] [25] [26] [27] [28] [29] [30] theoretical biology, [31] homeland security, [32] economics, [33] [34] [35] project management [36] [37] [38] and statistics. [39] Foundational issues related to info-gap theory have also been studied. [40] [41] [42] [43] [44] [45]
A typical engineering application is the vibration analysis of a cracked beam, where the location, size, shape and orientation of the crack are unknown and greatly influence the vibration dynamics. [9] Very little is usually known about these spatial and geometrical uncertainties. An info-gap analysis allows one to model these uncertainties and to determine the degree of robustness, to these uncertainties, of properties such as vibration amplitude, natural frequencies, and natural modes of vibration. Another example is the structural design of a building subject to uncertain loads such as from wind or earthquakes. [8] [10] The response of the structure depends strongly on the spatial and temporal distribution of the loads. However, storms and earthquakes are highly idiosyncratic events, and the interaction between the event and the structure involves very site-specific mechanical properties which are rarely known. An info-gap analysis enables the design of the structure to enhance structural immunity against uncertain deviations from design-basis or estimated worst-case loads. Another engineering application involves the design of a neural net for detecting faults in a mechanical system, based on real-time measurements. A major difficulty is that faults are highly idiosyncratic, so that training data for the neural net will tend to differ substantially from data obtained from real-time faults after the net has been trained. The info-gap robustness strategy enables one to design the neural net to be robust to the disparity between training data and future real events. [11] [13]
Conservation biologists face info-gaps in using biological models; info-gap robustness curves have been used to select among management options for spruce-budworm populations in eastern Canada. Burgman [46] uses the fact that the robustness curves of different alternatives can intersect, so that the preferred alternative can change as the required level of performance changes.
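The following is a minimal Python sketch of the kind of preference reversal that intersecting robustness curves produce. The "option A"/"option B" names, the linear worst-case degradation model, and all numbers are hypothetical illustrations, not taken from the studies cited above.

```python
# Toy illustration of intersecting info-gap robustness curves for two alternatives.
# Assumed model: the worst-case reward degrades linearly with the uncertainty horizon
# alpha, so robustness(q, r_c) = (nominal(q) - r_c) / sensitivity(q) when r_c <= nominal(q).

import numpy as np

options = {
    "option A": {"nominal": 100.0, "sensitivity": 4.0},  # high promise, fragile
    "option B": {"nominal": 80.0, "sensitivity": 1.5},   # modest promise, robust
}

def robustness(nominal, sensitivity, r_c):
    """Largest uncertainty horizon alpha at which the worst-case reward still meets r_c."""
    return np.maximum(nominal - r_c, 0.0) / sensitivity

# Sweep the critical requirement r_c: the preferred (more robust) option switches
# from B to A as the requirement is tightened, because the two curves intersect.
for r_c in np.linspace(40.0, 100.0, 13):
    curves = {name: robustness(p["nominal"], p["sensitivity"], r_c) for name, p in options.items()}
    best = max(curves, key=curves.get)
    print(f"r_c = {r_c:6.1f}  " + "  ".join(f"{n}: {v:5.1f}" for n, v in curves.items()) + f"  -> prefer {best}")
```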
Project management is another area where info-gap uncertainty is common. The project manager often has very limited information about the duration and cost of some of the tasks in the project, and info-gap robustness can assist in project planning and integration. [37] Financial economics is another area where the future is unpredictable, which may be either pernicious or propitious. Info-gap robustness and opportuneness analyses can assist in portfolio design, credit rationing, and other applications. [33]
A general criticism of non-probabilistic decision rules, discussed in detail at decision theory: alternatives to probability theory, is that optimal decision rules (formally, admissible decision rules) can always be derived by probabilistic methods, with a suitable utility function and prior distribution (this is the statement of the complete class theorems), and thus that non-probabilistic methods such as info-gap are unnecessary and do not yield new or better decision rules.
A more general criticism of decision making under uncertainty is the impact of outsized, unexpected events, ones that are not captured by the model. This is discussed particularly in black swan theory, and info-gap, used in isolation, is vulnerable to this, as are a fortiori all decision theories that use a fixed universe of possibilities, notably probabilistic ones.
Sniedovich [47] raises two objections to info-gap decision theory, one substantive, one scholarly:
Sniedovich has challenged the validity of info-gap theory for making decisions under severe uncertainty. Sniedovich notes that the info-gap robustness function is "local" to the region around the estimate $\tilde{u}$, where $\tilde{u}$ is likely to be substantially in error.
Symbolically, the robust-satisficing decision maximizes (max) the horizon of uncertainty under which the critical requirement is satisfied even in the worst case (min), that is, a maximin.
In other words, while it is not a maximin analysis of outcome over the universe of uncertainty, it is a maximin analysis over a properly construed decision space.
Ben-Haim argues that info-gap's robustness model is not min-max/maximin analysis because it is not worst-case analysis of outcomes; it is a satisficing model, not an optimization model – a (straightforward) maximin analysis would consider worst-case outcomes over the entire space which, since uncertainty is often potentially unbounded, would yield an unbounded bad worst case.
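In a common formulation (notation varies across the literature: $q$ is the decision, $\tilde{u}$ the estimate, $U(\alpha,\tilde{u})$ the set of values of the uncertain quantity $u$ admitted at horizon of uncertainty $\alpha$, $R(q,u)$ the outcome, and $r_c$ the critical requirement), the robustness function can be written as

$$\hat{\alpha}(q, r_c) = \max\left\{ \alpha \ge 0 \;:\; \min_{u \in U(\alpha, \tilde{u})} R(q,u) \ge r_c \right\},$$

an outer maximization of the horizon of uncertainty over an inner worst-case (minimum) outcome. This is the sense in which it is a maximin over horizons of uncertainty rather than a maximin over outcomes across the entire uncertainty space.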
Sniedovich [3] has shown that info-gap's robustness model is a simple stability radius model, namely a local stability model of the generic form

$$\hat{\rho}(\tilde{p}) := \max\left\{ \rho \ge 0 \;:\; p \in P(s) \ \text{ for all } \ p \in B(\rho, \tilde{p}) \right\},$$

where $B(\rho,\tilde{p})$ denotes a ball of radius $\rho$ centered at the nominal point $\tilde{p}$, and $P(s)$ denotes the set of values of the parameter $p$ that satisfy pre-determined stability conditions.
In other words, info-gap's robustness model is a stability radius model characterized by a stability requirement of the form $r_c \le R(q,u)$. Since stability radius models are designed for the analysis of small perturbations in a given nominal value of a parameter, Sniedovich [3] argues that info-gap's robustness model is unsuitable for the treatment of severe uncertainty characterized by a poor estimate and a vast uncertainty space.
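The Python sketch below shows one way to compute this kind of robustness/stability radius numerically for a one-dimensional uncertain parameter. The performance function R, the estimate, the critical requirement, and the bisection-plus-grid search are hypothetical illustrations of the formulation above, not an implementation taken from the cited references.

```python
# Compute the largest horizon alpha such that R(q, u) >= r_c holds for every u
# within distance alpha of the estimate u_tilde (a stability radius / robustness).
# Here u is one-dimensional and the worst case on each interval is found by a dense
# grid search; real models would need a proper inner optimizer.

import numpy as np

def R(q, u):
    # Hypothetical reward model: performance decays as u drifts away from 1.0.
    return q * (2.0 - (u - 1.0) ** 2)

def robustness(q, u_tilde, r_c, alpha_max=5.0, tol=1e-4):
    """Bisect for the largest alpha with min_{|u - u_tilde| <= alpha} R(q, u) >= r_c."""
    def worst_case(alpha):
        u_grid = np.linspace(u_tilde - alpha, u_tilde + alpha, 2001)
        return R(q, u_grid).min()

    if worst_case(0.0) < r_c:          # even the estimate itself fails the requirement
        return 0.0
    lo, hi = 0.0, alpha_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if worst_case(mid) >= r_c:
            lo = mid                   # requirement still met everywhere: grow the horizon
        else:
            hi = mid                   # requirement violated somewhere: shrink the horizon
    return lo

print(robustness(q=1.0, u_tilde=1.0, r_c=1.0))  # radius of the "ball" of acceptable u values (~1.0)
```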
It is correct that the info-gap robustness function is local, and has restricted quantitative value in some cases. However, a major purpose of decision analysis is to provide focus for subjective judgments; that is, regardless of the formal analysis, it provides a framework for discussion. Without committing to any particular framework, or to characteristics of frameworks in general, what follows is a discussion of proposals for such frameworks.
Simon [48] introduced the idea of bounded rationality. Limitations on knowledge, understanding, and computational capability constrain the ability of decision makers to identify optimal choices. Simon advocated satisficing rather than optimizing: seeking adequate (rather than optimal) outcomes given available resources. Schwartz, [49] Conlisk [50] and others discuss extensive evidence for the phenomenon of bounded rationality among human decision makers, as well as for the advantages of satisficing when knowledge and understanding are deficient. The info-gap robustness function provides a means of implementing a satisficing strategy under bounded rationality. For instance, in discussing bounded rationality and satisficing in conservation and environmental management, Burgman notes that "Info-gap theory ... can function sensibly when there are 'severe' knowledge gaps." The info-gap robustness and opportuneness functions provide "a formal framework to explore the kinds of speculations that occur intuitively when examining decision options." [51] Burgman then proceeds to develop an info-gap robust-satisficing strategy for protecting the endangered orange-bellied parrot. Similarly, Vinot, Cogan and Cipolla [52] discuss engineering design and note that "the downside of a model-based analysis lies in the knowledge that the model behavior is only an approximation to the real system behavior. Hence the question of the honest designer: how sensitive is my measure of design success to uncertainties in my system representation? ... It is evident that if model-based analysis is to be used with any level of confidence then ... [one must] attempt to satisfy an acceptable sub-optimal level of performance while remaining maximally robust to the system uncertainties." [52] They proceed to develop an info-gap robust-satisficing design procedure for an aerospace application.
Of course, decision-making in the face of uncertainty is nothing new, and attempts to deal with it have a long history. A number of authors have noted and discussed similarities and differences between info-gap robustness and minimax or worst-case methods. [7] [16] [35] [37] [53] [54] Sniedovich [47] has demonstrated formally that the info-gap robustness function can be represented as a maximin optimization, and is thus related to Wald's minimax theory. Sniedovich [47] has claimed that info-gap's robustness analysis is conducted in the neighborhood of an estimate that is likely to be substantially wrong, concluding that the resulting robustness function is equally likely to be substantially wrong.
On the other hand, the estimate is the best one has, so it is useful to know if it can err greatly and still yield an acceptable outcome. This critical question clearly raises the issue of whether robustness (as defined by info-gap theory) is qualified to judge whether confidence is warranted, [5] [55] [56] and how it compares to methods used to inform decisions under uncertainty using considerations not limited to the neighborhood of a bad initial guess. Answers to these questions vary with the particular problem at hand. Some general comments follow.
Sensitivity analysis – how sensitive conclusions are to input assumptions – can be performed independently of a model of uncertainty: most simply, one may take two different assumed values for an input and compare the conclusions. From this perspective, info-gap can be seen as a technique of sensitivity analysis, though by no means the only one.
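As a concrete illustration of this simplest form of sensitivity analysis, the Python sketch below reruns one calculation under two assumed input values and compares the resulting conclusions. The net-benefit model and all numbers are hypothetical.

```python
# Simplest sensitivity analysis: rerun the same calculation under two different
# assumed values of one input and compare the conclusions.

def net_benefit(demand, price=10.0, unit_cost=6.0, fixed_cost=200.0):
    return demand * (price - unit_cost) - fixed_cost

for assumed_demand in (60.0, 40.0):          # optimistic vs. pessimistic assumption
    value = net_benefit(assumed_demand)
    decision = "proceed" if value > 0 else "do not proceed"
    print(f"assumed demand {assumed_demand:5.1f} -> net benefit {value:7.1f} -> {decision}")
# The recommended decision flips between the two assumptions, so the conclusion is
# sensitive to the assumed demand.
```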
The robust optimization literature [57] [58] [59] [60] [61] [62] provides methods and techniques that take a global approach to robustness analysis. These methods directly address decision-making under severe uncertainty, and have been used for this purpose for more than thirty years. Wald's maximin model is the main instrument used by these methods.
The principal difference between the Maximin model employed by info-gap and the various Maximin models employed by robust optimization methods is in the manner in which the total region of uncertainty is incorporated in the robustness model. Info-gap takes a local approach that concentrates on the immediate neighborhood of the estimate. In sharp contrast, robust optimization methods set out to incorporate in the analysis the entire region of uncertainty, or at least an adequate representation thereof. In fact, some of these methods do not even use an estimate.
Classical decision theory [63] [64] offers two approaches to decision-making under severe uncertainty, namely maximin and Laplace's principle of insufficient reason (assume all outcomes are equally likely); these may be considered alternative solutions to the problem that info-gap addresses.
Further, as discussed at decision theory: alternatives to probability theory, probabilists, particularly Bayesians, argue that optimal decision rules (formally, admissible decision rules) can always be derived by probabilistic methods (this is the statement of the complete class theorems), and thus that non-probabilistic methods such as info-gap are unnecessary and do not yield new or better decision rules.
As attested by the rich literature on robust optimization, maximin provides a wide range of methods for decision making in the face of severe uncertainty.
Indeed, as discussed in criticism of info-gap decision theory, info-gap's robustness model can be interpreted as an instance of the general maximin model.
As for Laplace's principle of insufficient reason, in this context it is convenient to view it as an instance of Bayesian analysis.
The essence of Bayesian analysis is to assign probabilities to the different possible realizations of the uncertain parameters. In the case of Knightian (non-probabilistic) uncertainty, these probabilities represent the decision maker's "degree of belief" in a specific realization.
In our example, suppose there are only five possible realizations of the uncertain revenue-to-allocation function. The decision maker believes that the estimated function is the most likely, and that the likelihood decreases as the difference from the estimate increases, so the probability distribution over the five realizations peaks at the estimate and tapers off away from it.
Now, for any allocation, one can construct a probability distribution of the revenue based on these prior beliefs. The decision maker can then choose the allocation with the highest expected revenue, or with the lowest probability of an unacceptable revenue, and so on, as sketched below.
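The following Python sketch illustrates this Bayesian treatment. The five revenue functions, the candidate allocations, the prior weights, and the acceptability threshold are all hypothetical numbers chosen only to make the ranking concrete.

```python
# Place a "degree of belief" probability on each of five possible revenue functions,
# then rank allocations by expected revenue and by the probability of an unacceptable revenue.

import numpy as np

# Five possible revenue-vs-allocation functions; the middle one is the estimate.
realizations = [
    lambda x: 8 * x - 0.10 * x**2,
    lambda x: 9 * x - 0.11 * x**2,
    lambda x: 10 * x - 0.12 * x**2,   # estimated (most likely) function
    lambda x: 11 * x - 0.14 * x**2,
    lambda x: 12 * x - 0.17 * x**2,
]
prior = np.array([0.10, 0.20, 0.40, 0.20, 0.10])   # belief decreases away from the estimate

allocations = [20.0, 40.0, 60.0]
r_critical = 150.0                                  # revenue below this is unacceptable

for x in allocations:
    revenues = np.array([f(x) for f in realizations])
    expected = float(prior @ revenues)              # expected revenue under the prior
    p_bad = float(prior[revenues < r_critical].sum())  # probability of an unacceptable revenue
    print(f"allocation {x:4.0f}: expected revenue {expected:7.1f}, "
          f"P(revenue < {r_critical:.0f}) = {p_bad:.2f}")
```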
The most problematic step of this analysis is the choice of the realization probabilities. When there is extensive and relevant past experience, an expert may use this experience to construct a probability distribution. But even with extensive past experience, when some parameters change, the expert may only be able to estimate that one realization is more likely than another, without being able to reliably quantify this difference. Furthermore, when conditions change drastically, or when there is no past experience at all, it may prove difficult even to estimate whether one realization is more likely than another.
Nevertheless, methodologically speaking, this difficulty is not as problematic as basing the analysis of a problem subject to severe uncertainty on a single point estimate and its immediate neighborhood, as done by info-gap. What is more, in contrast to info-gap, this approach is global rather than local.
Still, it must be stressed that Bayesian analysis does not expressly concern itself with the question of robustness.
Bayesian analysis raises the issue of learning from experience and adjusting probabilities accordingly. In other words, decision-making is not a one-shot process, but profits from a sequence of decisions and observations.
Satisficing is a decision-making strategy or cognitive heuristic that entails searching through the available alternatives until an acceptability threshold is met. The term satisficing, a portmanteau of satisfy and suffice, was introduced by Herbert A. Simon in 1956, although the concept was first posited in his 1947 book Administrative Behavior. Simon used satisficing to explain the behavior of decision makers under circumstances in which an optimal solution cannot be determined. He maintained that many natural problems are characterized by computational intractability or a lack of information, both of which preclude the use of mathematical optimization procedures. He observed in his Nobel Prize in Economics speech that "decision makers can satisfice either by finding optimum solutions for a simplified world, or by finding satisfactory solutions for a more realistic world. Neither approach, in general, dominates the other, and both have continued to co-exist in the world of management science".
Global optimization is a branch of applied mathematics and numerical analysis that attempts to find the global minima or maxima of a function or a set of functions on a given set. It is usually described as a minimization problem because the maximization of the real-valued function $g(x)$ is equivalent to the minimization of the function $f(x) := (-1)\cdot g(x)$.
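A minimal Python sketch of this equivalence, assuming SciPy is available: a multimodal function is maximized by minimizing its negative with a global optimizer. The objective function and bounds are arbitrary illustrations.

```python
# Maximizing g(x) is equivalent to minimizing f(x) = -g(x).

import numpy as np
from scipy.optimize import differential_evolution

def g(x):
    # Multimodal objective with several local maxima on the chosen interval.
    return np.sin(3 * x[0]) + 0.5 * np.cos(5 * x[0])

# Global minimization of -g over the box [-3, 3].
result = differential_evolution(lambda x: -g(x), bounds=[(-3.0, 3.0)])
print("argmax ~", result.x, " max value ~", -result.fun)
```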
Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. This involves estimating sensitivity indices that quantify the influence of an input or group of inputs on the output. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
Model predictive control (MPC) is an advanced method of process control that is used to control a process while satisfying a set of constraints. It has been in use in the process industries in chemical plants and oil refineries since the 1980s. In recent years it has also been used in power system balancing models and in power electronics. Model predictive controllers rely on dynamic models of the process, most often linear empirical models obtained by system identification. The main advantage of MPC is that it allows the current timeslot to be optimized while taking future timeslots into account. This is achieved by optimizing over a finite time horizon, but implementing only the current timeslot and then optimizing again, repeatedly, thus differing from a linear–quadratic regulator (LQR). MPC also has the ability to anticipate future events and can take control actions accordingly; PID controllers do not have this predictive ability. MPC is nearly universally implemented as a digital control, although there is research into achieving faster response times with specially designed analog circuitry.
Bayesian experimental design provides a general probability-theoretical framework from which other theories on experimental design can be derived. It is based on Bayesian inference to interpret the observations/data acquired during the experiment. This allows accounting both for any prior knowledge of the parameters to be determined and for uncertainties in observations.
In decision theory, the Ellsberg paradox is a paradox in which people's decisions are inconsistent with subjective expected utility theory. John Maynard Keynes published a version of the paradox in 1921. Daniel Ellsberg popularized the paradox in his 1961 paper, "Risk, Ambiguity, and the Savage Axioms". It is generally taken to be evidence of ambiguity aversion, in which a person tends to prefer choices with quantifiable risks over those with unknown, incalculable risks.
In mathematics, the stability radius of an object at a given nominal point is the radius of the largest ball, centered at the nominal point, all of whose elements satisfy pre-determined stability conditions.
Imprecise probability generalizes probability theory to allow for partial probability specifications, and is applicable when information is scarce, vague, or conflicting, in which case a unique probability distribution may be hard to identify. Thereby, the theory aims to represent the available knowledge more accurately. Imprecision is also useful for dealing with expert elicitation.
Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., will lead to different results that can only be predicted in a statistical sense.
Robust optimization is a field of mathematical optimization theory that deals with optimization problems in which a certain measure of robustness is sought against uncertainty that can be represented as deterministic variability in the value of the parameters of the problem itself and/or its solution. It is related to, but often distinguished from, probabilistic optimization methods such as chance-constrained optimization.
Probabilistic design is a discipline within engineering design. It deals primarily with the consideration and minimization of the effects of random variability upon the performance of an engineering system during the design phase. Typically, these effects studied and optimized are related to quality and reliability. It differs from the classical approach to design by assuming a small probability of failure instead of using the safety factor. Probabilistic design is used in a variety of different applications to assess the likelihood of failure. Disciplines which extensively use probabilistic design principles include product design, quality control, systems engineering, machine design, civil engineering and manufacturing.
In decision theory, regret is the emotional response often experienced when, after a fixed decision has been taken, information arrives about what the best course of action would have been. It can be measured as the difference in value between the decision made and the optimal decision.
The scenario approach or scenario optimization approach is a technique for obtaining solutions to robust optimization and chance-constrained optimization problems based on a sample of the constraints. It also relates to inductive reasoning in modeling and decision-making. The technique has existed for decades as a heuristic approach and has more recently been given a systematic theoretical foundation.
Robust decision-making (RDM) is an iterative decision analytics framework that aims to help identify potential robust strategies, characterize the vulnerabilities of such strategies, and evaluate the tradeoffs among them. RDM focuses on informing decisions under conditions of what is called "deep uncertainty", that is, conditions where the parties to a decision do not know or do not agree on the system models relating actions to consequences or the prior probability distributions for the key input parameters to those models.
In decision theory and game theory, Wald's maximin model is a non-probabilistic decision-making model according to which decisions are ranked on the basis of their worst-case outcomes – the optimal decision is one with the least bad worst outcome. It is one of the most important models in robust decision making in general and robust optimization in particular.
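As a minimal Python sketch of this ranking rule, the hypothetical payoff table below scores each decision by its worst outcome across the states considered and selects the decision whose worst case is least bad.

```python
# Wald's maximin rule: rank decisions by their worst-case outcomes and pick the
# decision with the least bad worst outcome.  The payoff table is hypothetical.

payoffs = {                      # decision -> outcome in each possible state s1..s3
    "d1": [10.0, -5.0, 30.0],
    "d2": [6.0, 4.0, 8.0],
    "d3": [20.0, 0.0, -10.0],
}

worst = {d: min(outcomes) for d, outcomes in payoffs.items()}
best_decision = max(worst, key=worst.get)
print("worst-case outcomes:", worst)          # d1: -5, d2: 4, d3: -10
print("maximin choice:", best_decision)       # d2 has the least bad worst case
```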
A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.
Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
P-boxes and probability bounds analysis have been used in many applications spanning many disciplines in engineering and environmental science.
In regression analysis, an interval predictor model (IPM) is an approach to regression where bounds on the function to be approximated are obtained. This differs from other techniques in machine learning, where usually one wishes to estimate point values or an entire probability distribution. Interval predictor models are sometimes referred to as a nonparametric regression technique, because a potentially infinite set of functions is contained by the IPM, and no specific distribution is implied for the regressed variables.
Probabilistic numerics is an active field of study at the intersection of applied mathematics, statistics, and machine learning centering on the concept of uncertainty in computation. In probabilistic numerics, tasks in numerical analysis such as finding numerical solutions for integration, linear algebra, optimization and simulation and differential equations are seen as problems of statistical, probabilistic, or Bayesian inference.