| Developer(s) | Lumina Decision Systems |
| --- | --- |
| Initial release | January 16, 1992 |
| Written in | C++ |
| Operating system | Windows |
| Platform | IA-32, x64 |
| Available in | English |
| Type | Decision-making software |
| License | Commercial proprietary software |
| Website | analytica |
Analytica is a visual software application developed by Lumina Decision Systems for creating, analyzing, and communicating quantitative decision models. [1] It combines hierarchical influence diagrams for visual creation and viewing of models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency, array abstraction, and automatic dependency maintenance for efficient sequencing of computation.
Analytica models are organized as influence diagrams. Variables (and other objects) appear as nodes of various shapes on a diagram, connected by arrows that provide a visual representation of dependencies. Analytica influence diagrams may be hierarchical: a single module node on a diagram can represent an entire sub-model.
Hierarchical influence diagrams in Analytica serve as an organizational tool. Because the visual layout of an influence diagram plays to people's natural spatial and abstraction abilities, viewers can take in more information about a model's structure and organization at a glance than is possible with less visual paradigms, such as spreadsheets and mathematical expressions. Managing the structure and organization of a large model can be a significant part of the modeling process, but it is substantially aided by the visualization that influence diagrams provide.
Influence diagrams also serve as a tool for communication. Once a quantitative model has been created and its final results computed, an understanding of how the results are obtained, and of how various assumptions affect them, is often far more important than the specific numbers computed. Analytica helps users convey these aspects of their models to target audiences. The visual representation of an influence diagram quickly communicates an understanding at a level of abstraction that is usually more appropriate than detailed representations such as mathematical expressions or cell formulas. When more detail is desired, users can drill down to increasing levels of detail, guided by the visual depiction of the model's structure.
The existence of an easily understandable and transparent model supports communication and debate within an organization, and this effect is one of the primary benefits of quantitative model building. When all interested parties are able to understand a common model structure, debates and discussions will often focus more directly on specific assumptions, can cut down on "cross-talk", and therefore lead to more productive interactions within the organization. The influence diagram serves as a graphical representation that can help to make models accessible to people at different levels.
Analytica uses index objects to track the dimensions of multidimensional arrays. An index object has a name and a list of elements. When two multidimensional values are combined, for example in an expression such as
Profit = Revenue − Expenses
where Revenue and Expenses are each multidimensional, Analytica repeats the profit calculation over each dimension, but recognizes when the same dimension occurs in both values and treats it as a single dimension during the calculation, a process called intelligent array abstraction. Unlike most programming languages, Analytica imposes no inherent ordering on the dimensions of a multidimensional array. This avoids duplicated formulas and explicit FOR loops, both common sources of modeling errors. The simplified expressions made possible by intelligent array abstraction make models more accessible, interpretable, and transparent.
Another consequence of intelligent array abstraction is that new dimensions can be introduced into, or removed from, an existing model without requiring changes to the model structure or to variable definitions. For example, while creating a model, the model builder might assume that a particular variable, say Discount rate, contains a single number. Later, after constructing the model, a user might replace the single number with a table of numbers, perhaps Discount rate broken down by Country and by Economic scenario. These new divisions may reflect the fact that the effective discount rate is not the same for international divisions of a company, and that different rates apply under different hypothetical scenarios. Analytica automatically propagates these new dimensions to any results that depend upon Discount rate, so, for example, the result for Net present value becomes multidimensional and contains these new dimensions. In essence, Analytica repeats the same calculation using the discount rate for each possible combination of Country and Economic scenario.
This flexibility is important when exploring tradeoffs between level of detail, computation time, available data, and the overall size or dimensionality of parametric spaces. Such adjustments are common after a model has been fully constructed, as a way of exploring what-if scenarios and overall relationships between variables.
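Analytica's own syntax is not shown here, but the behavior of intelligent arrays can be sketched with Python's xarray library, which likewise matches array dimensions by name rather than by position. The variable names follow the article's examples; the dimensions and numbers are invented for illustration.

```python
# A minimal sketch of dimension-aware arithmetic, using xarray as a
# stand-in for Analytica's intelligent arrays. Names and numbers are
# invented for illustration.
import xarray as xr

country = ["US", "DE", "JP"]
year = [2024, 2025]

revenue = xr.DataArray(
    [[10.0, 12.0], [8.0, 9.0], [6.0, 7.0]],
    coords={"Country": country, "Year": year},
    dims=["Country", "Year"],
)
expenses = xr.DataArray(
    [7.0, 6.0, 5.0],
    coords={"Country": country},
    dims=["Country"],
)

# Dimensions are matched by name, with no explicit loops: Expenses
# (indexed only by Country) is automatically broadcast across Year.
profit = revenue - expenses
print(profit)

# Introducing a new dimension on an input propagates to the result.
scenario = xr.DataArray(
    [1.0, 0.9],
    coords={"Scenario": ["base", "downturn"]},
    dims=["Scenario"],
)
profit_by_scenario = profit * scenario  # now Country x Year x Scenario
print(profit_by_scenario.dims)
```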
Incorporating uncertainty into model outputs helps to provide more realistic and informative projections. Uncertain quantities in Analytica can be specified using a distribution function. When evaluated, distributions are sampled using Latin hypercube, Monte Carlo, or Sobol sampling, and the samples are propagated through the computations to the results. The resulting sample distribution and its summary statistics can then be viewed directly: mean, fractile bands, probability density function (PDF), and cumulative distribution function (CDF). Analytica supports collaborative decision analysis and probability management through the use of the SIPmath(tm) standard. [2] [3]
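The general sample-and-propagate technique can be sketched in Python with NumPy and SciPy; this illustrates the method rather than Analytica's implementation, and the distributions and parameters are invented.

```python
# A minimal sketch of propagating uncertainty by sampling: draw
# samples from input distributions, push them through ordinary
# arithmetic, and summarize the resulting output distribution.
import numpy as np
from scipy import stats
from scipy.stats import qmc

n = 10_000
rng = np.random.default_rng(42)

# Plain Monte Carlo sample of an uncertain revenue (lognormal).
revenue = rng.lognormal(mean=2.3, sigma=0.2, size=n)

# Latin hypercube sample of uncertain expenses (normal), obtained by
# pushing stratified uniforms through the inverse CDF.
u = qmc.LatinHypercube(d=1, seed=42).random(n).ravel()
expenses = stats.norm.ppf(u, loc=7.0, scale=1.0)

# Samples propagate through the computation to the result.
profit = revenue - expenses

# Summary statistics of the resulting distribution.
print("mean:", profit.mean())
print("5th/95th percentiles:", np.percentile(profit, [5, 95]))
print("P(loss):", (profit < 0).mean())
```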
System dynamics is an approach to simulating the behavior of complex systems over time. It deals with the effects of feedback loops and time delays on the behavior of the entire system. The Dynamic() function in Analytica allows the definition of variables with cyclic dependencies, such as feedback loops. It extends the influence diagram notation, which does not normally allow cycles. At least one link in each cycle includes a time lag, depicted as a gray influence arrow to distinguish it from standard black arrows, which carry no time lag.
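Analytica's Dynamic() is declarative; a rough procedural analogue of a time-lagged feedback loop, with an invented stock name, growth rate, and horizon, looks like this in Python:

```python
# A minimal sketch of a time-lagged feedback loop: the inflow at each
# step depends on the stock one step earlier, so the apparent cycle
# (stock -> inflow -> stock) is broken by the time lag.
import numpy as np

T = 20                # number of time steps (invented horizon)
stock = np.empty(T)
stock[0] = 100.0      # initial condition

for t in range(1, T):
    inflow = 0.05 * stock[t - 1]      # rate depends on the lagged stock
    stock[t] = stock[t - 1] + inflow  # the lag makes the recursion well-defined

print(stock[-1])      # compound growth after T - 1 steps
```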
Analytica includes a general language of operators and functions for expressing mathematical relationships among variables. Users can define functions and libraries to extend the language.
As a programming language, Analytica has several features designed to make it easy to use for quantitative modeling.
Analytica has been used for policy analysis, business modeling, and risk analysis. [4] Areas in which Analytica has been applied include energy, [5] [6] [7] [8] [9] [10] health and pharmaceuticals, [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24] [25] [26] environmental risk and emissions policy analysis, [27] [28] [29] [30] [31] [32] [33] [34] [35] wildlife management, [36] [37] [38] [39] ecology, [40] [41] [42] [43] [44] [45] [46] climate change, [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] technology and defense, [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] [70] [71] [72] [73] [74] strategic financial planning, [75] [76] R&D planning and portfolio management, [77] [78] [79] financial services, aerospace, [80] manufacturing, [81] and environmental health impact assessment. [82] [83]
The Analytica software runs on Microsoft Windows operating systems. Analytica Free Edition is available for an unlimited time and supports models of up to 101 user objects. Analytica Professional, Enterprise, and Optimizer are desktop editions with increasing levels of functionality. The Analytica Cloud Platform lets users share models via a server and run them in a web browser. Analytica 6.1 was released in 2021.
Analytica's predecessor, called Demos, [84] grew out of research on tools for policy analysis by Max Henrion, first as a PhD student and later as a professor at Carnegie Mellon University, between 1979 and 1990. Henrion founded Lumina Decision Systems in 1991 with Brian Arnold. Lumina continued to develop the software and apply it to environmental and public policy analysis, and first released Analytica as a product in 1996.
Real options valuation, also often termed real options analysis, applies option valuation techniques to capital budgeting decisions. A real option itself is the right—but not the obligation—to undertake certain business initiatives, such as deferring, abandoning, expanding, staging, or contracting a capital investment project. For example, real options valuation could examine the opportunity to invest in the expansion of a firm's factory and the alternative option to sell the factory.
Risk assessment determines possible mishaps, their likelihood and consequences, and the tolerances for such events. The results of this process may be expressed in a quantitative or qualitative fashion. Risk assessment is an inherent part of a broader risk management strategy to help reduce any potential risk-related consequences.
Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
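As a minimal sketch of sampling-based sensitivity analysis, the rank (Spearman) correlation between each uncertain input and the output gives a simple indication of which inputs drive the output's variation; the model and distributions below are invented for illustration.

```python
# A minimal sketch of sampling-based sensitivity analysis using rank
# (Spearman) correlations between inputs and output. The model and
# distributions are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5_000

price = rng.normal(10.0, 1.0, n)       # uncertain inputs
volume = rng.normal(1_000.0, 300.0, n)
unit_cost = rng.normal(6.0, 0.5, n)

profit = (price - unit_cost) * volume  # the model output

for name, x in [("price", price), ("volume", volume), ("unit_cost", unit_cost)]:
    rho, _ = stats.spearmanr(x, profit)
    print(f"{name}: rank correlation with profit = {rho:+.2f}")
```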
Decision analysis (DA) is the discipline comprising the philosophy, methodology, and professional practice necessary to address important decisions in a formal manner. Decision analysis includes many procedures, methods, and tools for identifying, clearly representing, and formally assessing important aspects of a decision; for prescribing a recommended course of action by applying the maximum expected-utility axiom to a well-formed representation of the decision; and for translating the formal representation of a decision and its corresponding recommendation into insight for the decision maker, and other corporate and non-corporate stakeholders.
In mathematical finance, a Monte Carlo option model uses Monte Carlo methods to calculate the value of an option with multiple sources of uncertainty or with complicated features. The first application to option pricing was by Phelim Boyle in 1977. In 1996, M. Broadie and P. Glasserman showed how to price Asian options by Monte Carlo. An important development was the introduction in 1996 by Carriere of Monte Carlo methods for options with early exercise features.
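A minimal sketch of the technique, pricing a European call under geometric Brownian motion (the textbook setting that traces to Boyle's 1977 work), with invented parameter values:

```python
# Monte Carlo pricing of a European call option under geometric
# Brownian motion. Parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0  # spot, strike, rate, vol, maturity
n = 200_000

# Simulate terminal prices under the risk-neutral measure.
Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# The price is the discounted expected payoff; report a standard error.
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)
print("price estimate:", payoff.mean())
print("standard error:", payoff.std(ddof=1) / np.sqrt(n))
```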
Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., will lead to different results that can only be predicted in a statistical sense.
An ecosystem model is an abstract, usually mathematical, representation of an ecological system, which is studied to better understand the real system.
A fuzzy cognitive map (FCM) is a cognitive map within which the relations between the elements of a "mental landscape" can be used to compute the "strength of impact" of these elements. Robert Axelrod introduced cognitive maps as a formal way of representing social scientific knowledge and modeling decision making in social and political systems; Bart Kosko then introduced fuzzy cognitive maps by bringing in the computation.
Quantification of Margins and Uncertainty (QMU) is a decision support methodology for complex technical decisions. QMU focuses on the identification, characterization, and analysis of performance thresholds and their associated margins for engineering systems that are evaluated under conditions of uncertainty, particularly when portions of those results are generated using computational modeling and simulation. QMU has traditionally been applied to complex systems where comprehensive experimental test data is not readily available and cannot be easily generated for either end-to-end system execution or for specific subsystems of interest. Examples of systems where QMU has been applied include nuclear weapons performance, qualification, and stockpile assessment. QMU focuses on characterizing in detail the various sources of uncertainty that exist in a model, thus allowing the uncertainty in the system response output variables to be well quantified. These sources are frequently described in terms of probability distributions to account for the stochastic nature of complex engineering systems. The characterization of uncertainty supports comparisons of design margins for key system performance metrics to the uncertainty associated with their calculation by the model. QMU supports risk-informed decision-making processes where computational simulation results provide one of several inputs to the decision-making authority. There is currently no standardized methodology across the simulation community for conducting QMU; the term is applied to a variety of different modeling and simulation techniques that focus on rigorously quantifying model uncertainty in order to support comparison to design margins.
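A minimal sketch of the margin-to-uncertainty comparison at the heart of QMU, using an invented threshold, an invented response distribution, and a 2-sigma convention for the uncertainty band (conventions vary):

```python
# A minimal sketch of a QMU-style comparison: simulate a performance
# metric, measure its margin M against a required threshold, and
# compare M to an uncertainty measure U. All values are invented.
import numpy as np

rng = np.random.default_rng(1)
response = rng.normal(120.0, 8.0, 100_000)  # simulated performance metric
threshold = 100.0                           # required minimum performance

M = response.mean() - threshold  # margin
U = 2.0 * response.std(ddof=1)   # uncertainty band (2-sigma, one convention)

print(f"M = {M:.1f}, U = {U:.1f}, M/U = {M / U:.2f}")
# An M/U ratio comfortably above 1 suggests the margin exceeds the
# uncertainty in its calculation.
```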
A hydrologic model is a simplification of a real-world system that aids in understanding, predicting, and managing water resources. Both the flow and quality of water are commonly studied using hydrologic models.
In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. The international standard definition of risk for common understanding in different applications is "effect of uncertainty on objectives".
Robust decision-making (RDM) is an iterative decision analytics framework that aims to help identify potential robust strategies, characterize the vulnerabilities of such strategies, and evaluate the tradeoffs among them. RDM focuses on informing decisions under conditions of what is called "deep uncertainty", that is, conditions where the parties to a decision do not know or do not agree on the system models relating actions to consequences or the prior probability distributions for the key input parameters to those models.
A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.
In decision theory and quantitative policy analysis, the expected value of including uncertainty (EVIU) is the expected difference in the value of a decision based on a probabilistic analysis versus a decision based on an analysis that ignores uncertainty.
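A minimal sketch of EVIU using a newsvendor-style stocking decision, in which asymmetric costs make the uncertainty-ignoring choice suboptimal; all distributions and prices are invented for illustration:

```python
# Expected value of including uncertainty (EVIU) for a newsvendor-style
# decision: compare the decision optimized against the full demand
# distribution with the decision that treats demand as its mean.
import numpy as np

rng = np.random.default_rng(3)
demand = rng.lognormal(mean=4.0, sigma=1.0, size=20_000)  # uncertain demand
price, cost = 10.0, 4.0

def expected_profit(q):
    """Expected profit of stocking q units, estimated from the sample."""
    return (price * np.minimum(q, demand) - cost * q).mean()

# Decision ignoring uncertainty: treat demand as if it were its mean.
q_ignore = demand.mean()

# Decision from the probabilistic analysis: search stocking levels.
grid = np.linspace(0.0, np.percentile(demand, 99), 400)
q_best = grid[np.argmax([expected_profit(q) for q in grid])]

# EVIU: the expected value added by the probabilistic analysis.
eviu = expected_profit(q_best) - expected_profit(q_ignore)
print(f"q_ignore = {q_ignore:.0f}, q_best = {q_best:.0f}, EVIU = {eviu:.2f}")
```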
Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
P-boxes and probability bounds analysis have been used in applications spanning many disciplines in engineering and environmental science.
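One common way a p-box arises is as the pointwise envelope of candidate CDFs when a distribution's parameters are only partially known; the sketch below illustrates that construction (full probability bounds arithmetic is not shown), with invented families and parameters:

```python
# A minimal sketch of forming a probability box as the pointwise
# envelope of candidate CDFs. Families and parameters are invented.
import numpy as np
from scipy import stats

x = np.linspace(-5, 15, 400)

# Epistemic uncertainty: the mean is only known to lie in [3, 6].
candidates = [stats.norm(loc=m, scale=1.5).cdf(x) for m in np.linspace(3, 6, 7)]

cdf_upper = np.max(candidates, axis=0)  # upper (left) bounding CDF
cdf_lower = np.min(candidates, axis=0)  # lower (right) bounding CDF

# Any distribution consistent with the assumptions has a CDF lying
# between cdf_lower and cdf_upper at every x.
print(cdf_lower[200], cdf_upper[200])
```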
Alternatives assessment or alternatives analysis is a problem-solving approach used in environmental design, technology, and policy. It aims to minimize environmental harm by comparing multiple potential solutions in the context of a specific problem, design goal, or policy objective. It is intended to inform decision-making in situations with many possible courses of action, a wide range of variables to consider, and significant degrees of uncertainty. Alternatives assessment was originally developed as a robust way to guide precautionary action and avoid paralysis by analysis; authors such as O'Brien have presented alternatives assessment as an approach that is complementary to risk assessment, the dominant decision-making approach in environmental policy. Likewise, Ashford has described the similar concept of technology options analysis as a way to generate innovative solutions to the problems of industrial pollution more effectively than through risk-based regulation.
The discipline of probability management communicates and calculates uncertainties as data structures that obey both the laws of arithmetic and probability, while preserving statistical coherence. The simplest approach is to use vector arrays of simulated or historical realizations and metadata called Stochastic Information Packets (SIPs). A set of SIPs that preserves statistical relationships between variables is said to be coherent and is referred to as a Stochastic Library Unit with Relationships Preserved (SLURP). SIPs and SLURPs allow stochastic simulations to communicate with one another. For example, see Analytica, Oracle Crystal Ball, Frontline Solvers, and Autobox.
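A minimal sketch of the SIP idea: two arrays of realizations generated from the same underlying trials share a common index, so statistical relationships between them are preserved when they are combined downstream (the coherence that a SLURP packages). Names and numbers are invented:

```python
# SIPs as arrays of simulated realizations sharing a common trial
# index. Because the index is shared, correlations between variables
# survive ordinary arithmetic on the arrays.
import numpy as np

rng = np.random.default_rng(11)
n = 10_000

# One coherent set of trials: demand and price are negatively related.
demand_sip = rng.lognormal(3.0, 0.4, n)
price_sip = 20.0 - 0.05 * demand_sip + rng.normal(0.0, 0.5, n)

# Downstream models combine SIPs element-by-element over the shared
# trial index, keeping the statistical relationship intact.
revenue_sip = demand_sip * price_sip
print("corr(demand, price):", np.corrcoef(demand_sip, price_sip)[0, 1])
print("mean revenue:", revenue_sip.mean())
```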
Techno-economic assessment or techno-economic analysis is a method of analyzing the economic performance of an industrial process, product, or service. It typically uses software modeling to estimate capital cost, operating cost, and revenue based on technical and financial input parameters. One desired outcome is to summarize results in a concise and visually coherent form, using visualization tools such as tornado diagrams and sensitivity analysis graphs.
M. Granger Morgan is an American scientist, academic, and engineer who is the Hamerschlag University Professor of Engineering at Carnegie Mellon University. Over his career, Morgan has led the development of the area of engineering and public policy.