| Developer(s) | Lumina Decision Systems |
| --- | --- |
| Initial release | January 16, 1992 |
| Written in | C++ |
| Operating system | Windows |
| Platform | IA-32, x64 |
| Available in | English |
| Type | Decision-making software |
| License | Commercial proprietary software |
| Website | analytica |
Analytica is a visual software package developed by Lumina Decision Systems for creating, analyzing, and communicating quantitative decision models. [1] It combines hierarchical influence diagrams for visual creation and viewing of models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency, array abstraction, and automatic dependency maintenance for efficient sequencing of computation.
Analytica models are organized as influence diagrams. Variables (and other objects) appear as nodes of various shapes on a diagram, connected by arrows that provide a visual representation of dependencies. Analytica influence diagrams may be hierarchical, in which a single module node on a diagram represents an entire sub-model.
Hierarchical influence diagrams in Analytica serve as an organizational tool. Because the visual layout of an influence diagram conveys a model's structure both spatially and at different levels of abstraction, people are able to take in more information about a model's structure and organization at a glance than is possible with less visual paradigms, such as spreadsheets and mathematical expressions. Managing the structure and organization of a large model can be a significant part of the modeling process, and it is substantially aided by the visualization of influence diagrams.
Influence diagrams also serve as a tool for communication. Once a quantitative model has been created and its final results computed, understanding how the results are obtained, and how various assumptions affect them, is often far more important than the specific numbers computed. Analytica helps target audiences understand these aspects of a model. The visual representation of an influence diagram quickly communicates an understanding at a level of abstraction that is usually more appropriate than detailed representations such as mathematical expressions or cell formulas. When more detail is desired, users can drill down to increasing levels of detail, guided by the visual depiction of the model's structure.
The existence of an easily understandable and transparent model supports communication and debate within an organization, and this effect is one of the primary benefits of quantitative model building. When all interested parties understand a common model structure, discussions and debates tend to focus more directly on specific assumptions, involve less "cross-talk", and therefore lead to more productive interactions within the organization. The influence diagram serves as a graphical representation that can help make models accessible to people at different levels.
Analytica uses index objects to track the dimensions of multidimensional arrays. An index object has a name and a list of elements. When two multidimensional values are combined, for example in an expression such as
Profit = Revenue − Expenses
where Revenue and Expenses are each multidimensional, Analytica repeats the profit calculation over each dimension, but recognizes when the same dimension occurs in both values and treats it as a single dimension during the calculation, in a process called intelligent array abstraction. Unlike most programming languages, there is no inherent ordering to the dimensions in a multidimensional array. This avoids duplicated formulas and explicit FOR loops, both common sources of modeling errors. The simplified expressions made possible by intelligent array abstraction allow the model to be more accessible, interpretable, and transparent.
Another consequence of intelligent array abstraction is that new dimensions can be introduced into or removed from an existing model without requiring changes to the model structure or to variable definitions. For example, while creating a model, the model builder might assume that a particular variable, say Discount rate, contains a single number. Later, after constructing the model, a user might replace the single number with a table of numbers, perhaps Discount rate broken down by Country and by Economic scenario. These new divisions may reflect the fact that the effective discount rate is not the same for international divisions of a company, and that different rates are applicable to different hypothetical scenarios. Analytica automatically propagates these new dimensions to any results that depend upon Discount rate, so, for example, the result for Net present value will become multidimensional and contain these new dimensions. In essence, Analytica repeats the same calculation using the discount rate for each possible combination of Country and Economic scenario.
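This alignment and propagation behavior can be illustrated outside Analytica with a dimension-labeled array library. The following Python sketch uses the third-party xarray package to mimic the idea; the variable names (revenue, expenses, country) and numbers are hypothetical, and the code is an analogy rather than Analytica syntax:

```python
import xarray as xr

# Revenue and expenses share the "year" dimension.
years = [2024, 2025, 2026]
revenue = xr.DataArray([100.0, 120.0, 140.0], dims="year", coords={"year": years})
expenses = xr.DataArray([80.0, 90.0, 95.0], dims="year", coords={"year": years})

# Arrays are combined by dimension name, so no explicit loop is needed.
profit = revenue - expenses                        # dims: (year,)

# Later, expenses are broken down by country; the formula is unchanged,
# and the new dimension propagates automatically to the result.
countries = ["US", "DE"]
expenses_by_country = xr.DataArray(
    [[80.0, 70.0], [90.0, 82.0], [95.0, 88.0]],
    dims=("year", "country"),
    coords={"year": years, "country": countries},
)
profit_by_country = revenue - expenses_by_country  # dims: (year, country)

print(profit.values)
print(profit_by_country.values)
```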
This flexibility is important when exploring computation tradeoffs between the level of detail, computation time, available data, and overall size or dimensionality of parametric spaces. Such adjustments are common after models have been fully constructed as a way of exploring what-if scenarios and overall relationships between variables.
Incorporating uncertainty into model outputs helps to provide more realistic and informative projections. Uncertain quantities in Analytica can be specified using a distribution function. When evaluated, distributions are sampled using either Latin hypercube, Monte Carlo, or Sobol sampling, and the samples are propagated through the computations to the results. The sampled result distribution and its summary statistics (mean, fractile bands, probability density function (PDF), cumulative distribution function (CDF)) can then be viewed directly. Analytica supports collaborative decision analysis and probability management through the use of the SIP Math(tm) standard. [2] [3]
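The propagation of samples through a calculation can be sketched generically in a few lines of Python. This is plain Monte Carlo with hypothetical quantities (unit_price, unit_cost, units_sold) and is meant only to illustrate the mechanism, not Analytica's own sampling implementation:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 10_000                                    # number of samples

# Uncertain inputs, each represented by a vector of random samples.
unit_price = rng.normal(10.0, 1.5, n)         # normal(mean, standard deviation)
unit_cost = rng.triangular(5.0, 6.0, 8.0, n)  # triangular(min, mode, max)
units_sold = rng.lognormal(np.log(1000), 0.3, n)

# The model formula is applied sample by sample, so the input
# uncertainty propagates automatically to the result.
profit = (unit_price - unit_cost) * units_sold

# Summary statistics of the sampled result distribution.
print("mean:", profit.mean())
print("5th / 50th / 95th percentiles:", np.percentile(profit, [5, 50, 95]))
```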
System dynamics is an approach to simulating the behavior of complex systems over time. It deals with the effects of feedback loops and time delays on the behavior of the entire system. The Dynamic() function in Analytica allows definition of variables with cyclic dependencies, such as feedback loops. It expands the influence diagram notation, which does not normally allow cycles. At least one link in each cycle includes a time lag, depicted as a gray influence arrow to distinguish it from standard black arrows without time lags.
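The kind of time-lagged feedback that such a cyclic definition expresses can be written as a discrete-time recurrence. The following Python sketch of a hypothetical growth model (not Analytica code) shows a variable whose value at each time step depends on its own value one step earlier, which is what breaks the cycle:

```python
# A minimal feedback loop: each period's population depends on the
# previous period's value through a constant growth rate.
time_steps = 20
growth_rate = 0.05
population = [1000.0]                 # value at t = 0

for t in range(1, time_steps + 1):
    # population[t] depends on population[t - 1]: a one-step time lag
    population.append(population[t - 1] * (1 + growth_rate))

print(population[-1])                 # value after 20 periods
```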
Analytica includes a general language of operators and functions for expressing mathematical relationships among variables. Users can define functions and libraries to extend the language.
Several features of Analytica as a programming language are designed to make it easy to use for quantitative modeling, notably its declarative structure, intelligent array abstraction, and automatic dependency maintenance.
Analytica has been used for policy analysis, business modeling, and risk analysis. [4] Areas in which Analytica has been applied include energy, [5] [6] [7] health and pharmaceuticals, [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] environmental risk and emissions policy analysis, [22] [23] [24] [25] [26] [27] [28] [29] [30] wildlife management, [31] [32] [33] [34] ecology, [35] [36] [37] [38] [39] [40] [41] climate change, [42] [43] [44] [45] [46] [47] [48] [49] [50] [51] technology and defense, [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] strategic financial planning, [70] [71] R&D planning and portfolio management, [72] [73] [74] financial services, aerospace, [75] manufacturing, [76] and environmental health impact assessment. [77] [78]
The Analytica software runs on Microsoft Windows operating systems. Analytica Free Edition is available without time limit and allows users to build models of up to 101 user objects. Analytica Professional, Enterprise, and Optimizer are desktop editions with increasing levels of functionality. The Analytica Cloud Platform lets users share models via a server and run them in a web browser. Analytica 6.4 was released in 2023.
Analytica's predecessor, called Demos, [79] grew out of research on tools for policy analysis by Max Henrion, first as a PhD student and later as a professor at Carnegie Mellon University, between 1979 and 1990. Henrion founded Lumina Decision Systems in 1991 with Brian Arnold. Lumina continued to develop the software and apply it to environmental and public policy analysis. Lumina first released Analytica as a product in 1996.
Risk assessment determines possible mishaps, their likelihood and consequences, and the tolerances for such events. The results of this process may be expressed in a quantitative or qualitative fashion. Risk assessment is an inherent part of a broader risk management strategy to help reduce any potential risk-related consequences.
A vulnerability assessment is the process of identifying, quantifying, and prioritizing the vulnerabilities in a system. Examples of systems for which vulnerability assessments are performed include, but are not limited to, information technology systems, energy supply systems, water supply systems, transportation systems, and communication systems. Such assessments may be conducted on behalf of a range of different organizations, from small businesses up to large regional infrastructures. Vulnerability from the perspective of disaster management means assessing the threats from potential hazards to the population and to infrastructure. It may be conducted in the political, social, economic or environmental fields.
Cost–benefit analysis (CBA), sometimes also called benefit–cost analysis, is a systematic approach to estimating the strengths and weaknesses of alternatives. It is used to determine options which provide the best approach to achieving benefits while preserving savings in, for example, transactions, activities, and functional business requirements. A CBA may be used to compare completed or potential courses of action, and to estimate or evaluate the value against the cost of a decision, project, or policy. It is commonly used to evaluate business or policy decisions, commercial transactions, and project investments. For example, the U.S. Securities and Exchange Commission must conduct cost-benefit analyses before instituting regulations or deregulations.
Life cycle assessment (LCA), also known as life cycle analysis, is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product's manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave).
Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. This involves estimating sensitivity indices that quantify the influence of an input or group of inputs on the output. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
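One widely used family of sensitivity indices is variance-based. For example, the first-order index of an input $X_i$ on an output $Y$ (given here as an illustrative formula, not tied to a specific tool) is

$$
S_i = \frac{\operatorname{Var}_{X_i}\!\bigl(\mathbb{E}[\,Y \mid X_i\,]\bigr)}{\operatorname{Var}(Y)},
$$

the fraction of the output variance attributable to $X_i$ on its own.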
Decision analysis (DA) is the discipline comprising the philosophy, methodology, and professional practice necessary to address important decisions in a formal manner. Decision analysis includes many procedures, methods, and tools for identifying, clearly representing, and formally assessing important aspects of a decision; for prescribing a recommended course of action by applying the maximum expected-utility axiom to a well-formed representation of the decision; and for translating the formal representation of a decision and its corresponding recommendation into insight for the decision maker, and other corporate and non-corporate stakeholders.
Prognostics is an engineering discipline focused on predicting the time at which a system or a component will no longer perform its intended function. This lack of performance is most often a failure beyond which the system can no longer be used to meet desired performance. The predicted time then becomes the remaining useful life (RUL), which is an important concept in decision making for contingency mitigation. Prognostics predicts the future performance of a component by assessing the extent of deviation or degradation of a system from its expected normal operating conditions. The science of prognostics is based on the analysis of failure modes, detection of early signs of wear and aging, and fault conditions. An effective prognostics solution is implemented when there is sound knowledge of the failure mechanisms that are likely to cause the degradations leading to eventual failures in the system. It is therefore necessary to have initial information on the possible failures in a product. Such knowledge is important to identify the system parameters that are to be monitored. A potential use of prognostics is in condition-based maintenance. The discipline that links studies of failure mechanisms to system lifecycle management is often referred to as prognostics and health management (PHM), sometimes also system health management (SHM) or—in transportation applications—vehicle health management (VHM) or engine health management (EHM). Technical approaches to building models in prognostics can be categorized broadly into data-driven approaches, model-based approaches, and hybrid approaches.
Post-normal science (PNS) was developed in the 1990s by Silvio Funtowicz and Jerome R. Ravetz. It is a problem-solving strategy appropriate when "facts [are] uncertain, values in dispute, stakes high and decisions urgent", conditions often present in policy-relevant research. In those situations, PNS recommends suspending temporarily the traditional scientific ideal of truth, concentrating on quality as assessed by internal and extended peer communities.
An economic analysis of climate change uses economic tools and models to calculate the magnitude and distribution of damages caused by climate change. It can also give guidance on the best policies for mitigation of and adaptation to climate change from an economic perspective. There are many economic models and frameworks. For example, in a cost–benefit analysis, the trade-offs between climate change impacts, adaptation, and mitigation are made explicit. For this kind of analysis, integrated assessment models (IAMs) are useful. Those models link main features of society and economy with the biosphere and atmosphere into one modelling framework. The total economic impacts from climate change are difficult to estimate. In general, they increase the more the global surface temperature increases.
Info-gap decision theory seeks to optimize robustness to failure under severe uncertainty, in particular applying sensitivity analysis of the stability radius type to perturbations in the value of a given estimate of the parameter of interest. It has some connections with Wald's maximin model; some authors distinguish them, others consider them instances of the same principle.
Exposure assessment is a branch of environmental science and occupational hygiene that focuses on the processes that take place at the interface between the environment containing the contaminant of interest and the organism being considered. These are the final steps in the path of an environmental contaminant from its release, through transport, to its effect in a biological system. It tries to measure how much of a contaminant can be absorbed by an exposed target organism, in what form, at what rate, and how much of the absorbed amount is actually available to produce a biological effect. Although the same general concepts apply to other organisms, the overwhelming majority of applications of exposure assessment are concerned with human health, making it an important tool in public health.
A hydrologic model is a simplification of a real-world system that aids in understanding, predicting, and managing water resources. Both the flow and quality of water are commonly studied using hydrologic models.
In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. One international standard definition of risk is the "effect of uncertainty on objectives".
A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.
In decision theory and quantitative policy analysis, the expected value of including uncertainty (EVIU) is the expected difference in the value of a decision based on a probabilistic analysis versus a decision based on an analysis that ignores uncertainty.
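Stated symbolically (a common formulation, shown here for illustration), let $x$ be the uncertain state of the world, $d$ a decision, and $U(d, x)$ the utility of taking decision $d$ when $x$ obtains. The decision that ignores uncertainty replaces $x$ with its expected value, while the probabilistic decision maximizes expected utility:

$$
d_{\mathrm{iu}} = \arg\max_{d}\, U\!\bigl(d, \mathbb{E}[x]\bigr), \qquad
d^{*} = \arg\max_{d}\, \mathbb{E}_{x}\bigl[U(d, x)\bigr],
$$

$$
\mathrm{EVIU} = \mathbb{E}_{x}\bigl[U(d^{*}, x)\bigr] - \mathbb{E}_{x}\bigl[U(d_{\mathrm{iu}}, x)\bigr].
$$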
Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.
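As an illustration of the form such bounds take (the classical Fréchet-type bounds for a sum under unknown dependence, given here for orientation rather than as a statement about any particular software), if only bounds $\underline{F}_X \le F_X \le \overline{F}_X$ and $\underline{F}_Y \le F_Y \le \overline{F}_Y$ on the input distributions are known, then the distribution of $Z = X + Y$ is constrained by

$$
\sup_{x+y=z} \max\bigl(\underline{F}_X(x) + \underline{F}_Y(y) - 1,\; 0\bigr)
\;\le\; F_Z(z) \;\le\;
\inf_{x+y=z} \min\bigl(\overline{F}_X(x) + \overline{F}_Y(y),\; 1\bigr).
$$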
P-boxes and probability bounds analysis have been used in many applications spanning numerous disciplines in engineering and environmental science.
The predicted no-effect concentration (PNEC) is the concentration of a chemical that marks the limit below which no adverse effects of exposure in an ecosystem are measured. PNEC values are intended to be conservative and to predict the concentration at which a chemical will likely have no toxic effect. They are not intended to predict the upper limit of concentration of a chemical that has a toxic effect. PNEC values are often used in environmental risk assessment as a tool in ecotoxicology. A PNEC for a chemical can be calculated from acute toxicity or chronic toxicity single-species data, Species Sensitivity Distribution (SSD) multi-species data, field data, or model-ecosystem data. Depending on the type of data used, an assessment factor is applied to account for the confidence with which the toxicity data can be extrapolated to an entire ecosystem.
Alternatives assessment or alternatives analysis is a problem-solving approach used in environmental design, technology, and policy. It aims to minimize environmental harm by comparing multiple potential solutions in the context of a specific problem, design goal, or policy objective. It is intended to inform decision-making in situations with many possible courses of action, a wide range of variables to consider, and significant degrees of uncertainty. Alternatives assessment was originally developed as a robust way to guide precautionary action and avoid paralysis by analysis; authors such as O'Brien have presented alternatives assessment as an approach that is complementary to risk assessment, the dominant decision-making approach in environmental policy. Likewise, Ashford has described the similar concept of technology options analysis as a way to generate innovative solutions to the problems of industrial pollution more effectively than through risk-based regulation.
M. Granger Morgan is an American scientist, academic, and engineer who is the Hamerschlag University Professor of Engineering at Carnegie Mellon University. Over his career, Morgan has led the development of the area of engineering and public policy.