Analytica (software)

Analytica
Developer(s): Lumina Decision Systems
Initial release: January 16, 1992
Written in: C++
Operating system: Windows
Platform: IA-32, x64
Available in: English
Type: Decision-making software
License: Commercial proprietary software
Website: analytica.com

Analytica is visual software, developed by Lumina Decision Systems, for creating, analyzing, and communicating quantitative decision models. [1] It combines hierarchical influence diagrams for visual creation and viewing of models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency with array abstraction and automatic dependency maintenance for efficient sequencing of computation.
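The declarative, dependency-maintained evaluation style described above can be sketched in Python. The toy `Model` class below is illustrative only (Analytica's engine is proprietary and written in C++): each variable is defined exactly once, and redefining an input automatically invalidates every cached result that depends on it, so dependents are recomputed lazily in the right order.

```python
# Toy sketch of declarative definitions with automatic dependency
# maintenance; not Lumina's implementation.

class Model:
    def __init__(self):
        self.defs = {}    # name -> function computing the value from its inputs
        self.deps = {}    # name -> names of the variables it depends on
        self.cache = {}   # name -> previously computed value

    def define(self, name, deps, fn):
        self.defs[name] = fn
        self.deps[name] = deps
        self._invalidate(name)   # redefining an input invalidates dependents

    def _invalidate(self, name):
        self.cache.pop(name, None)
        for other, ds in self.deps.items():
            if name in ds and other in self.cache:
                self._invalidate(other)

    def value(self, name):
        # Lazy evaluation: compute only when needed, reusing cached results.
        if name not in self.cache:
            args = [self.value(d) for d in self.deps[name]]
            self.cache[name] = self.defs[name](*args)
        return self.cache[name]

m = Model()
m.define("Revenue", [], lambda: 120.0)
m.define("Expenses", [], lambda: 80.0)
m.define("Profit", ["Revenue", "Expenses"], lambda r, e: r - e)
print(m.value("Profit"))   # 40.0

m.define("Expenses", [], lambda: 95.0)   # change an input...
print(m.value("Profit"))   # ...dependents recompute automatically: 25.0
```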


Hierarchical influence diagrams

Analytica models are organized as influence diagrams. Variables (and other objects) appear as nodes of various shapes on a diagram, connected by arrows that provide a visual representation of dependencies. Analytica influence diagrams may be hierarchical, in which a single module node on a diagram represents an entire sub-model.

Hierarchical influence diagrams in Analytica serve as an organizational tool. Because the visual layout of an influence diagram exploits natural human abilities to perceive spatial arrangement and levels of abstraction, people can take in more information about a model's structure and organization at a glance than is possible with less visual paradigms, such as spreadsheets and mathematical expressions. Managing the structure and organization of a large model can be a significant part of the modeling process, and it is substantially aided by the visualization of influence diagrams.

Influence diagrams also serve as a tool for communication. Once a quantitative model has been created and its final results computed, understanding how the results are obtained, and how various assumptions affect them, is often far more important than the specific numbers computed. Analytica helps users convey these aspects of their models to target audiences. The visual representation of an influence diagram quickly communicates an understanding at a level of abstraction that is usually more appropriate than detailed representations such as mathematical expressions or cell formulas. When more detail is desired, users can drill down to successive levels of detail, guided by the visual depiction of the model's structure.

The existence of an easily understandable and transparent model supports communication and debate within an organization, and this effect is one of the primary benefits of quantitative model building. When all interested parties understand a common model structure, debates and discussions often focus more directly on specific assumptions and involve less "cross-talk", leading to more productive interactions within the organization. The influence diagram serves as a graphical representation that can help make models accessible to people at different levels.

Intelligent multidimensional arrays

Analytica uses index objects to track the dimensions of multidimensional arrays. An index object has a name and a list of elements. When two multidimensional values are combined, for example in an expression such as

Profit = Revenue − Expenses

where Revenue and Expenses are each multidimensional, Analytica repeats the profit calculation over each dimension, but recognizes when the same dimension occurs in both values and treats it as a single dimension during the calculation, in a process called intelligent array abstraction. Unlike most programming languages, Analytica imposes no inherent ordering on the dimensions of a multidimensional array. This avoids duplicated formulas and explicit FOR loops, both common sources of modeling errors. The simplified expressions made possible by intelligent array abstraction make models more accessible, interpretable, and transparent.
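This behavior can be sketched in Python with NumPy. The `NamedArray` class below is hypothetical, not Analytica's API, but it shows the essential idea: dimensions are matched by index name rather than by axis position, so one formula automatically repeats over shared and distinct dimensions.

```python
import numpy as np

class NamedArray:
    """Toy array whose axes are identified by dimension (index) names."""
    def __init__(self, data, dims):
        self.data = np.asarray(data, dtype=float)
        self.dims = tuple(dims)   # one name per axis, e.g. ("Division", "Year")

    def _aligned(self, all_dims):
        # Reorder this array's axes to match all_dims, inserting length-1
        # axes for dimensions it lacks; NumPy broadcasting then repeats
        # the calculation over those missing dimensions.
        present = [d for d in all_dims if d in self.dims]
        src = self.data.transpose([self.dims.index(d) for d in present])
        shape = [self.data.shape[self.dims.index(d)] if d in self.dims else 1
                 for d in all_dims]
        return src.reshape(shape)

    def __sub__(self, other):
        all_dims = self.dims + tuple(d for d in other.dims if d not in self.dims)
        return NamedArray(self._aligned(all_dims) - other._aligned(all_dims),
                          all_dims)

# Index objects: a name plus a list of elements, as described above.
Division = ["North", "South"]
Year = [2024, 2025]

revenue  = NamedArray([[10, 12], [8, 9]], dims=("Division", "Year"))
expenses = NamedArray([7, 8], dims=("Year",))   # shares the Year dimension

profit = revenue - expenses   # Year matched by name; repeated over Division
print(profit.dims)            # ('Division', 'Year')
print(profit.data)            # [[3. 4.]
                              #  [1. 1.]]
```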

Another consequence of intelligent array abstraction is that new dimensions can be introduced into, or removed from, an existing model without changes to the model structure or to variable definitions. For example, while creating a model, the model builder might assume that a particular variable, say Discount rate, contains a single number. Later, a user might replace the single number with a table of numbers, perhaps Discount rate broken down by Country and by Economic scenario. These new divisions may reflect the fact that the effective discount rate differs across a company's international divisions, and that different rates apply under different hypothetical scenarios. Analytica automatically propagates these new dimensions to any results that depend on Discount rate, so, for example, the result for Net present value becomes multidimensional and contains these new dimensions. In essence, Analytica repeats the same calculation using the discount rate for each possible combination of Country and Economic scenario.
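The discount-rate example can be illustrated with NumPy broadcasting (the numbers and the `npv` function are illustrative, not Analytica's built-ins): the same formula, written once, yields a scalar when the rate is a single number and a Country × Scenario array when the rate becomes a table.

```python
import numpy as np

cash_flows = np.array([-100.0, 60.0, 60.0])    # indexed by Year = 0, 1, 2

def npv(rate, flows=cash_flows):
    # A trailing Year axis is added to the rate, and broadcasting repeats
    # the formula over any extra dimensions the rate carries
    # (none for a scalar; Country x Scenario once it becomes a table).
    years = np.arange(len(flows))
    r = np.asarray(rate)[..., np.newaxis]
    return (flows / (1.0 + r) ** years).sum(axis=-1)

# Initially, Discount rate is a single number...
print(round(float(npv(0.05)), 2))              # 11.56

# ...later it is broken out by Country and Economic scenario:
discount_rate = np.array([[0.04, 0.06],        # Country A: base, pessimistic
                          [0.05, 0.08]])       # Country B: base, pessimistic
print(npv(discount_rate).shape)                # (2, 2): Country x Scenario
```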

This flexibility is important when exploring computation tradeoffs between the level of detail, computation time, available data, and overall size or dimensionality of parametric spaces. Such adjustments are common after models have been fully constructed as a way of exploring what-if scenarios and overall relationships between variables.

Uncertainty analysis

Incorporating uncertainty into model outputs helps provide more realistic and informative projections. Uncertain quantities in Analytica can be specified using distribution functions. When evaluated, distributions are sampled using Latin hypercube, Monte Carlo, or Sobol sampling, and the samples are propagated through the computations to the results. The resulting sample distribution and its summary statistics can then be viewed directly: the mean, fractile bands, probability density function (PDF), and cumulative distribution function (CDF). Analytica supports collaborative decision analysis and probability management through the use of the SIPmath™ standard. [2] [3]
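A minimal Monte Carlo sketch of this sampling-and-propagation process, using only Python's standard library (the model and its distributions are illustrative, not taken from Analytica):

```python
import random
import statistics

random.seed(42)
N = 10_000   # sample size

# Uncertain inputs: Revenue ~ Normal(120, 15), Expenses ~ Uniform(70, 100)
revenue  = [random.gauss(120, 15) for _ in range(N)]
expenses = [random.uniform(70, 100) for _ in range(N)]

# Propagate the samples through the model: Profit = Revenue - Expenses
profit = [r - e for r, e in zip(revenue, expenses)]

# Summary statistics of the resulting output distribution
mean = statistics.fmean(profit)
cuts = statistics.quantiles(profit, n=20)      # 5% steps
print(f"mean = {mean:.1f}, 5th-95th fractile band = [{cuts[0]:.1f}, {cuts[-1]:.1f}]")

# A point on the empirical CDF: probability that profit is negative
print(f"P(Profit < 0) = {sum(p < 0 for p in profit) / N:.3f}")
```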

System dynamics modeling

System dynamics is an approach to simulating the behavior of complex systems over time. It deals with the effects of feedback loops and time delays on the behavior of the entire system. The Dynamic() function in Analytica allows definition of variables with cyclic dependencies, such as feedback loops. It extends the influence diagram notation, which does not otherwise allow cycles. At least one link in each cycle includes a time lag, depicted as a gray influence arrow to distinguish it from standard black arrows, which carry no time lag.
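The behavior of a Dynamic()-style definition can be sketched as an ordinary time-lagged recurrence (illustrative Python, not Analytica syntax): the one-step lag is what breaks the cycle in the feedback loop.

```python
# A variable defined in terms of its own value at an earlier time step.

def dynamic(initial, step, n_steps):
    """Return the trajectory of x[t] = step(x[t-1], t) starting from initial."""
    traj = [initial]
    for t in range(1, n_steps + 1):
        traj.append(step(traj[t - 1], t))
    return traj

# Feedback loop: inventory responds to the gap between a target level and
# the previous period's inventory (the time lag in the cycle).
target = 100.0
adjust = 0.5   # fraction of the gap closed each period

inventory = dynamic(initial=20.0,
                    step=lambda prev, t: prev + adjust * (target - prev),
                    n_steps=5)
print([round(x, 1) for x in inventory])   # [20.0, 60.0, 80.0, 90.0, 95.0, 97.5]
```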

As a programming language

Analytica includes a general language of operators and functions for expressing mathematical relationships among variables. Users can define functions and libraries to extend the language.

Several features of Analytica as a programming language are designed to make it easy to use for quantitative modeling.

Applications of Analytica

Analytica has been used for policy analysis, business modeling, and risk analysis. [4] Areas in which Analytica has been applied include energy, [5] [6] [7] health and pharmaceuticals, [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] environmental risk and emissions policy analysis, [22] [23] [24] [25] [26] [27] [28] [29] [30] wildlife management, [31] [32] [33] [34] ecology, [35] [36] [37] [38] [39] [40] [41] climate change, [42] [43] [44] [45] [46] [47] [48] [49] [50] [51] technology and defense, [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] strategic financial planning, [70] [71] R&D planning and portfolio management, [72] [73] [74] financial services, aerospace, [75] manufacturing, [76] and environmental health impact assessment. [77] [78]

Editions

The Analytica software runs on Microsoft Windows operating systems. Analytica Free Edition is available for an unlimited time and allows building models of up to 101 user objects. Analytica Professional, Enterprise, and Optimizer are desktop editions with increasing levels of functionality. The Analytica Cloud Platform lets users share models via a server and run them in a web browser. Analytica 6.4 was released in 2023.

History

Analytica's predecessor, called Demos, [79] grew from the research on tools for policy analysis by Max Henrion as a PhD student and later professor at Carnegie Mellon University between 1979 and 1990. Henrion founded Lumina Decision Systems in 1991 with Brian Arnold. Lumina continued to develop the software and apply it to environmental and public policy analysis applications. Lumina first released Analytica as a product in 1996.


References

  1. Granger Morgan and Max Henrion (1998), Analytica:A Software Tool for Uncertainty Analysis and Model Communication Archived June 30, 2007, at the Wayback Machine , Chapter 10 of Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, second edition, Cambridge University Press, New York.
  2. The SIPmath Standard Archived 2017-01-21 at the Wayback Machine
  3. Paul D. Kaplan and Sam Savage (2011), Monte Carlo, A Lightbulb for Illuminating Uncertainty Archived 2017-03-07 at the Wayback Machine , in Investments & Wealth Monitor
  4. Jun Long, Baruch Fischhoff (2000), Setting Risk Priorities: A Formal Model Risk Analysis, Risk Analysis 20(3):339–352.
  5. Ye Li and H. Keith Florig (Sept. 2006), Modeling the Operation and Maintenance Costs of a Large Scale Tidal Current Turbine Farm, Oceans (2006):1-6
  6. Jouni T Tuomisto and Marko Tainio (2005), An economic way of reducing health, environmental, and other pressures of urban traffic: a decision analysis on trip aggregation, BMC Public Health 5:123. doi : 10.1186/1471-2458-5-123
  7. Yurika Nishioka, Jonathan I. Levy, Gregory A. Norris, Andrew Wilson, Patrick Hofstetter, John D. Spengler (Oct 2002), Integrating Risk Assessment and Life Cycle Assessment: A Case Study of Insulation, Risk Analysis 22(5):1003–1017.
  8. Igor Linkov, Richard Wilson and George M., Gray (1998), Anticarcinogenic Responses in Rodent Cancer Bioassays Are Not Explained by Random Effects, Toxicological Sciences 43(1), Oxford University Press.
  9. Davis Bu, Eric Pan, Janice Walker, Julia Adler-Milstein, David Kendrick, Julie M. Hook, Caitlin M. Cusack, David W. Bates, and Blackford Middleton (2007), Benefits of Information Technology–Enabled Diabetes Management, Diabetes Care 30:1137–1142, American Diabetes Association.
  10. E. Ekaette, R.C. Lee, K-L Kelly, P. Dunscombe (Aug 2006), A Monte Carlo simulation approach to the characterization of uncertainties in cancer staging and radiation treatment decisions, Journal of the Operational Research Society 58:177–185.
  11. Lyon, Joseph L.; Alder, Stephen C.; Stone, Mary Bishop; Scholl, Alan; Reading, James C.; Holubkov, Richard; Sheng, Xiaoming; White, George L. Jr; Hegmann, Kurt T.; Anspaugh, Lynn; Hoffman, F Owen; Simon, Steven L.; Thomas, Brian; Carroll, Raymond; Meikle, A Wayne (Nov 2006),Thyroid Disease Associated With Exposure to the Nevada Nuclear Weapons Test Site Radiation: A Reevaluation Based on Corrected Dosimetry and Examination Data, Epidemiology 17(6):604–614.
  12. Negar Elmieh, Hadi Dowlatabadi, Liz Casman (Jan 2006), A model for Probabilistic Assessment of Malathion Spray Exposures (PAMSE) in British Columbia Archived September 29, 2011, at the Wayback Machine , CMU EEP.
  13. von Winterfeldt, Detlof; Eppel, Thomas; Adams, John; Neutra, Raymond; Delpizzo, Vincent (2004). "Managing Potential Health Risks from Electric Powerlines: A Decision Analysis Caught in Controversy". Risk Analysis. 24 (6): 1487–1502. Bibcode:2004RiskA..24.1487V. doi:10.1111/j.0272-4332.2004.00544.x. PMID   15660606. S2CID   34685466.
  14. Montville, Rebecca; Chen, Yuhuan; Schaffner, Donald W. (2002). "Risk assessment of hand washing efficacy using literature and experimental data". International Journal of Food Microbiology. 73 (2–3): 305–313. doi:10.1016/S0168-1605(01)00666-3. PMID   11934038.
  15. DC Kendrick, D Bu, E Pan, B Middleton (2007), Crossing the Evidence Chasm: Building Evidence Bridges from Process Changes to Clinical Outcomes, Journal of the American Medical Informatics Association, Elsevier.
  16. Cox, Louis Anthony (Tony) (2005). "Potential human health benefits of antibiotics used in food animals: A case study of virginiamycin". Environment International. 31 (4): 549–563. Bibcode:2005EnInt..31..549C. doi:10.1016/j.envint.2004.10.012. PMID   15871160.
  17. Jan Walker, Eric Pan, Douglas Johnston, Julia Adler-Milstein, David W. Bates, and Blackford Middleton (19 Jan 2005), The Value Of Health Care Information Exchange And Interoperability, Health Affairs.
  18. Doug Johnston, Eric Pan, Blackford Middleton, Finding the Value in Healthcare Information Technologies Archived July 6, 2008, at the Wayback Machine , Center for Information Technology Leadership (C!TL) whitepaper.
  19. Chrisman, L., Langley, P., Bay, S., and Pohorille, A. (Jan 2003), "Incorporating biological knowledge into evaluation of causal regulatory hypotheses", Pacific Symposium on Biocomputing (PSB).
  20. Jan Walker, Eric Pan, Douglas Johnson, Julia Adler-Milstein, David W. Bates and Blackford Middleton (2005), The Value of Health Care Information and Exchange And Interoperability" Health Affairs.
  21. Steve Lohr, Road Map to a Digital System of Health Records, New York Times, January 29, 2005
  22. C. Bloyd, J. Camp, G. Conzelmann, J. Formento, J. Molburg, J. Shannon, M. Henrion, R. Sonnenblick, K. Soo Hoo, J. Kalagnanam, S. Siegel, R. Sinha, M. Small, T. Sullivan, R. Marnicio, P. Ryan, R. Turner, D. Austin, D. Burtraw, D. Farrell, T. Green, A. Krupnick, and E. Mansur (Dec 1996), Tracking and Analysis Framework (TAF) Model Documentation and User’s Guide: An Interaction Model for Integrated Assessment of Title IV of the Clean Air Act Amendments Archived January 5, 2009, at the Wayback Machine , Decision and Information Sciences Division, Argonne National Laboratory.
  23. Max Henrion, Richard Sonnenblick, Cary Bloyd (Jan 1997), Innovations in Integrated Assessment: The Tracking and Analysis Framework (TAF) Archived January 5, 2009, at the Wayback Machine , Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, AZ
  24. Richard Sonnenblick and Max Henrion (Jan 1997), Uncertainty in the Tracking and Analysis Framework Integrated Assessment: The Value of Knowing How Little You Know Archived January 5, 2009, at the Wayback Machine , Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, Arizona.
  25. Sinha, R.; Small, M. J.; Ryan, P. F.; Sullivan, T. J.; Cosby, B. J. (1998). "Reduced-Form Modelling of Surface Water and Soil Chemistry for the Tracking and Analysis Framework". Water, Air, and Soil Pollution. 105 (3/4): 617–642. Bibcode:1998WASP..105..617S. doi:10.1023/A:1004993425759. S2CID   92758035.
  26. Dallas Burtraw and Erin Mansur (Mar 1999), The Effects of Trading and Banking in the SO2 Allowance Market Archived 2007-07-15 at the Wayback Machine , Discussion paper 99–25, Resources for the Future.
  27. Galen mcKinley, Miriam Zuk, Morten Höjer, Montserrat Avalos, Isabel González, Rodolfo Iniestra, Israel Laguna, Miguel A. Martínez, Patricia Osnaya, Luz M. Reynales, Raydel Valdés, and Julia Martínez (2005), Quantification of Local and Global Benefits from Air Pollution Control in Mexico City Archived September 29, 2011, at the Wayback Machine , Environ. Sci. Technol. 39:1954–1961.
  28. Luis A. CIFUENTES, Enzo SAUMA, Hector JORQUERA and Felipe SOTO (2000), Preliminary Estimation of the Potential Ancillary Benefits for Chile Archived 2012-04-23 at the Wayback Machine , Ancillary Benefits and Costs of Greenhouse Gas Mitigation.
  29. Marko Tainio, Jouni T Tuomisto, Otto Hänninen, Juhani Ruuskanen, Matti J Jantunen, and Juha Pekkanen (2007), Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study, Environ Health 6(24).
  30. Basson, L.; Petrie, J.G. (2007). "An integrated approach for the consideration of uncertainty in decision making supported by Life Cycle Assessment". Environmental Modelling & Software. 22 (2): 167–176. Bibcode:2007EnvMS..22..167B. doi:10.1016/j.envsoft.2005.07.026.
  31. Matthew F. Bingham, Zhimin Li, Kristy E. Mathews, Colleen M. Spagnardi, Jennifer S. Whaley, Sara G. Veale and Jason C. Kinnell (2011), An Application of Behavioral Modeling to Characterize Urban Angling Decisions and Values, North American Journal of Fisheries Management 31:257–268.
  32. Woodbury, Peter B.; Smith, James E.; Weinstein, David A.; Laurence, John A. (1998). "Assessing potential climate change effects on loblolly pine growth: A probabilistic regional modeling approach". Forest Ecology and Management. 107 (1–3): 99–116. Bibcode:1998ForEM.107...99W. doi: 10.1016/S0378-1127(97)00323-X .
  33. P.R. Richard, M. Power, M. Hammill, and W. Doidge(2003). Eastern Hudson Bay Beluga Precautionary Approach Case Study: Risk analysis models for co-management Archived April 3, 2012, at the Wayback Machine , Canadian Science Advisory Secretariat Research Document.
  34. P.R. Richard (2003), Incorporating Uncertainty in Population Assessments Archived April 3, 2012, at the Wayback Machine , Canadian Science Advisory Secretariat Research Document.
  35. O'Ryan R., Diaz M. (2008), The Use of Probabilistic Analysis to Improve Decision-Making in Environmental Regulation in a Developing Context: The Case of Arsenic Regulation in Chile, Human and Ecological Risk Assessment, Vol 14, Issue 3, pg: 623–640.
  36. Andrew Gronewold and Mark Borsuk, "A probabilistic modeling tool for assessing water quality standard compliance", submitted to EMS Oct 2008.
  37. Borsuk, Mark E.; Reichert, Peter; Peter, Armin; Schager, Eva; Burkhardt-Holm, Patricia (2006). "Assessing the decline of brown trout (Salmo trutta) in Swiss rivers using a Bayesian probability network". Ecological Modelling. 192 (1–2): 224–244. Bibcode:2006EcMod.192..224B. doi:10.1016/j.ecolmodel.2005.07.006.
  38. Borsuk, Mark E.; Stow, Craig A.; Reckhow, Kenneth H. (2004). "A Bayesian network of eutrophication models for synthesis, prediction, and uncertainty analysis". Ecological Modelling. 173 (2–3): 219–239. Bibcode:2004EcMod.173..219B. doi:10.1016/j.ecolmodel.2003.08.020.
  39. Mark E. Borsuk, Sean P. Powers, and Charles H. Peterson (2002), A survival model of the effects of bottom-water hypoxia on the population density of an estuarine clam (Macoma balthica) [ dead link ], Canadian Journal of Fisheries and Aquatic Sciences (59):1266–1274.
  40. Rebecca Montville and Donald Schaffner (Feb 2005), Monte Carlo Simulation of Pathogen Behavior during the Sprout Production Process Archived October 2, 2011, at the Wayback Machine , Applied and Environmental Microbiology 71(2):746–753.
  41. Rasmussen, S.K.J; Ross, T.; Olley, J.; McMeekin, T. (2002). "A process risk model for the shelf life of Atlantic salmon fillets". International Journal of Food Microbiology. 73 (1): 47–60. doi:10.1016/S0168-1605(01)00687-0. PMID   11885573.
  42. Groves, David G.; Lempert, Robert J. (2007). "A new analytic method for finding policy-relevant scenarios". Global Environmental Change. 17 (1): 73–85. Bibcode:2007GEC....17...73G. doi:10.1016/j.gloenvcha.2006.11.006. S2CID   510560.
  43. Senbel, Maged; McDaniels, Timothy; Dowlatabadi, Hadi (2003). "The ecological footprint: A non-monetary metric of human consumption applied to North America". Global Environmental Change. 13 (2): 83–100. Bibcode:2003GEC....13...83S. doi:10.1016/S0959-3780(03)00009-8.
  44. Dowlatabadi, H. (1998). Sensitivity of Climate Change Mitigation Estimates to Assumptions About Technical Change. Energy Economics 20: 473–93.
  45. West, J. J. and H. Dowlatabadi (1998). On assessing the economic impacts of sea level rise on developed coasts. Climate, change and risk. London, Routledge. 205–20.
  46. Leiss, W., H. Dowlatabadi, and Greg Paoli (2001). Who's Afraid of Climate Change? A guide for the perplexed. Isuma 2(4): 95–103.
  47. Morgan, M. G., M. Kandlikar, J. Risbey and H. Dowlatabadi (1999). Why conventional tools for policy analysis are often inadequate for problems of global change. Climatic Change 41: 271–81.
  48. Casman, E. A., M. G. Morgan and H. Dowlatabadi (1999). Mixed Levels of Uncertainty in Complex Policy Models. Risk Analysis 19(1): 33–42.
  49. Dowlatabadi, H. (2003). Scale and Scope In Integrated Assessment: lessons from ten years with ICAM. Scaling in Integrated Assessment. J. Rotmans and D. S. Rothman. Lisse, Swets & Zeitlinger: 55–72.
  50. Dowlatabadi, H. (2000). Bumping against a gas ceiling. Climatic Change 46(3): 391–407.
  51. Morgan, M. G. and H. Dowlatabadi (1996). Learning From Integrated Assessment of Climate Change. Climatic Change 34: 337–368.
  52. Henry Neimeier (1996), A New Paradigm For Modeling The Precision Strike Process, published in MILCOM 96.
  53. Russell F. Richards, Henry A. Neimeier, W. L. Hamm, and D. L. Alexander, "Analytical Modeling in Support of C4ISR Mission Assessment (CMA)," Third International Symposium on Command and Control Research and Technology, National Defense University, Fort McNair, Washington, DC, June 17–20, 1997, pp. 626–639.
  54. Henry Neimeier and C. McGowan (1996), "Analyzing Processes with HANQ", Proceedings of the International Council on Systems Engineering '96.
  55. Kenneth P. Kuskey and Susan K. Parker (2000), "The Architecture of CAPE Models", MITRE technical paper. See Abstract.
  56. Henry Neimeier (1994), "Analytic Queuing Network", Conference Proceedings of the 12th International Conference on the System Dynamics Society, in Stirling, Scotland.
  57. Henry Neimeier (1996), "Analytic Uncertainty Modeling Versus Discrete Event Simulation", PHALANX.
  58. Rahul Tongia, "Can broadband over powerline carrier (PLC) compete?". The author uses Analytica to model the economic viability of the introduction of a PLC service.
  59. Promises and False Promises of PowerLine Carrier (PLC) Broadband Communications – A Techno-Economic Analysis (PDF). Archived from the original on 2007-02-11. Retrieved 2011-07-08.
  60. Kanchana Wanichkorn and Marvin Sirbu (1998), The Economics of Premises Internet Telephony, CMU-EPP.
  61. E.L. Kyser, E.R. Hnatek, M.H. Roettgering (2001), The politics of accelerated stress testing, Sound and Vibration 35(3):24–29.
  62. Kevin J. Soo Hoo (June 2000), How Much Is Enough? A Risk-Management Approach to Computer Security, archived September 21, 2011, at the Wayback Machine, Working Paper, Consortium for Research on Information Security and Policy (CRISP), Stanford University.
  63. M. Steinbach and S. Giles of MITRE (2005), A Model for Joint Infrastructure Investment, archived 2011-09-24 at the Wayback Machine, AIAA-2005-7309, in AIAA 5th ATIO and 16th Lighter-Than-Air Systems Technology and Balloon Systems Conferences, Arlington VA, Sep 26–28, 2005.
  64. Bloomfield, R., Guerra, S. (2002), Process modelling to support dependability arguments, Proceedings. International Conference on Dependable Systems and Networks, pg. 113–122. DSN 2002.
  65. Christopher L Weber and Sanath K Kalidas (Fall 2004), Cost-Benefit Analysis of LEED Silver Certification for New House Residence Hall at Carnegie Mellon University, Civil Systems Investment Planning and Pricing Project, Dept. of Civil & Environmental Engineering, Carnegie Mellon University.
  66. J. McMahon, X. Liu, I. Turiel (Jun 2000), Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period, Technical Report LBNL–44450, Lawrence Berkeley Labs, Berkeley CA.
  67. Paul K. Davis (2000), Dealing with complexity: exploratory analysis enabled by multiresolution, multiperspective modeling, Proceedings of the 32nd Conference on Winter Simulation, pg. 293–302.
  68. Paul K. Davis (2000), Exploratory Analysis Enabled by Multiresolution, Multiperspective Modeling, Proceedings of the 2000 Winter Simulation Conference J. A. Joines, R. R. Barton, K. Kang, and P. A. Fishwick, eds.
  69. NASA (1994), Schedule and Cost Risk Analysis Modeling (SCRAM) System, NASA SBIR Successes.
  70. "Cubeplan case studies". Cubeplan.com. Retrieved 2011-07-12.
  71. "Novix consulting services". Novix.com. Retrieved 2011-07-12.
  72. Enrich Consulting, publications on Portfolio Management Archived July 13, 2011, at the Wayback Machine
  73. "Bicore, Inc". Bicore.nl. Retrieved 2011-07-12.
  74. "R&D evaluation tools at W.L. Gore". Lumina. Archived from the original on October 17, 2013.
  75. Speeding turnaround of the Space Shuttle, archived March 15, 2012, at the Wayback Machine, Lumina case studies
  76. Auto maker saves $250M on warranty costs, archived 2010-12-12 at the Wayback Machine, Lumina case studies
  77. Grellier, J., Ravazzani, P., Cardis, E. (2014). "Potential health impacts of residential exposures to extremely low frequency magnetic fields in Europe". Environment International. 62. Pergamon: 55–63. Bibcode:2014EnInt..62...55G. doi:10.1016/j.envint.2013.09.017. hdl:10044/1/41782. PMID 24161447.
  78. Grellier, J., White, M. P., De Bell, S., Brousse, O., Elliott, L. R., Fleming, L. E., Heaviside, C., Simpson, C., Taylor, T., Wheeler, B. W., Lovell, R. (May 2024). "Valuing the health benefits of nature-based recreational physical activity in England". Environment International. 187: 108667. Bibcode:2024EnInt.18708667G. doi:10.1016/j.envint.2024.108667. hdl:10871/135858. ISSN 0160-4120. PMID 38642505.
  79. Neil Wishbow and Max Henrion, "Demos User's Manual", Department of Engineering and Public Policy, Carnegie Mellon University, 1987.