Analytica (software)

Analytica
Developer(s): Lumina Decision Systems
Initial release: January 16, 1992
Written in: C++
Operating system: Windows
Platform: IA-32, x64
Available in: English
Type: Decision-making software
License: Commercial proprietary software
Website: analytica.com

Analytica is visual software developed by Lumina Decision Systems for creating, analyzing, and communicating quantitative decision models. [1] It combines hierarchical influence diagrams for visually creating and viewing models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency with array abstraction and automatic dependency maintenance for efficient sequencing of computation.

Hierarchical influence diagrams

Analytica models are organized as influence diagrams. Variables (and other objects) appear as nodes of various shapes on a diagram, connected by arrows that provide a visual representation of dependencies. Analytica influence diagrams may be hierarchical, in which a single module node on a diagram represents an entire sub-model.

Hierarchical influence diagrams in Analytica serve as an organizational tool. Because the visual layout of an influence diagram exploits natural human abilities for spatial reasoning and abstraction, people are able to take in more information about a model's structure and organization at a glance than is possible with less visual paradigms, such as spreadsheets and mathematical expressions. Managing the structure and organization of a large model can be a significant part of the modeling process, but it is substantially aided by the visualization of influence diagrams.

Influence diagrams also serve as a tool for communication. Once a quantitative model has been created and its final results computed, it is often the case that an understanding of how the results are obtained, and how various assumptions impact the results, is far more important than the specific numbers computed. Analytica gives users the ability to help target audiences understand these aspects of their models. The visual representation of an influence diagram quickly communicates an understanding at a level of abstraction that is normally more appropriate than detailed representations such as mathematical expressions or cell formulas. When more detail is desired, users can drill down to increasing levels of detail, aided by the visual depiction of the model's structure.

The existence of an easily understandable and transparent model supports communication and debate within an organization, and this effect is one of the primary benefits of quantitative model building. When all interested parties are able to understand a common model structure, debates and discussions will often focus more directly on specific assumptions, can cut down on "cross-talk", and therefore lead to more productive interactions within the organization. The influence diagram serves as a graphical representation that can help to make models accessible to people at different levels.

Intelligent multidimensional arrays

Analytica uses index objects to track the dimensions of multidimensional arrays. An index object has a name and a list of elements. When two multidimensional values are combined, for example in an expression such as

Profit = Revenue − Expenses

where Revenue and Expenses are each multidimensional, Analytica repeats the profit calculation over each dimension, but recognizes when the same dimension occurs in both values and treats it as a single dimension during the calculation, in a process called intelligent array abstraction. Unlike most programming languages, there is no inherent ordering of the dimensions in a multidimensional array. This avoids duplicated formulas and explicit FOR loops, both common sources of modeling errors. The simplified expressions made possible by intelligent array abstraction make the model more accessible, interpretable, and transparent.
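Analytica's array engine is proprietary, but the name-based matching idea can be sketched in a few lines of Python. In this toy illustration (not Analytica's implementation), a value is either a scalar or an array tagged with a dimension name; shared dimension names are aligned element by element, and anything else is broadcast:

```python
def combine(a, b, op):
    # Both values share a named dimension: align element by element.
    if isinstance(a, dict) and isinstance(b, dict) and a["dim"] == b["dim"]:
        return {"dim": a["dim"],
                "values": {k: combine(a["values"][k], b["values"][k], op)
                           for k in a["values"]}}
    # Only one value carries a dimension: broadcast the other over it.
    if isinstance(a, dict):
        return {"dim": a["dim"],
                "values": {k: combine(v, b, op) for k, v in a["values"].items()}}
    if isinstance(b, dict):
        return {"dim": b["dim"],
                "values": {k: combine(a, v, op) for k, v in b["values"].items()}}
    # Two scalars: apply the operator directly.
    return op(a, b)

revenue = {"dim": "Year", "values": {2024: 120.0, 2025: 140.0}}
expenses = {"dim": "Year", "values": {2024: 90.0, 2025: 100.0}}
profit = combine(revenue, expenses, lambda x, y: x - y)  # aligned by "Year"

overhead = 10.0                                          # scalar broadcasts
net = combine(profit, overhead, lambda x, y: x - y)
```

Because dimensions are matched by name rather than by position, neither expression needs to say which axis is which, mirroring the absence of inherent dimension ordering described above.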

Another consequence of intelligent array abstraction is that new dimensions can be introduced into, or removed from, an existing model without requiring changes to the model structure or to variable definitions. For example, while creating a model, the model builder might assume that a particular variable, say Discount rate, contains a single number. Later, a user might replace that single number with a table of numbers, perhaps Discount rate broken down by Country and by Economic scenario. These new divisions may reflect the fact that the effective discount rate is not the same for the international divisions of a company, and that different rates apply under different hypothetical scenarios. Analytica automatically propagates these new dimensions to any results that depend upon Discount rate, so, for example, the result for Net present value becomes multidimensional and contains these new dimensions. In essence, Analytica repeats the same calculation using the discount rate for each possible combination of Country and Economic scenario.
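The effect can be illustrated in plain Python with hypothetical numbers: redefining a scalar discount rate as a table indexed by country makes the result acquire the Country dimension, while the NPV formula itself is untouched.

```python
def npv(rate, cashflows):
    # Net present value of a cash-flow sequence at a fixed discount rate.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

cashflows = [-100.0, 60.0, 60.0]

# Scalar definition of the discount rate: scalar NPV.
scalar_npv = npv(0.05, cashflows)

# Table definition broken down by Country (illustrative rates):
# the NPV result now carries the Country dimension.
discount_rate = {"US": 0.05, "DE": 0.03}
npv_by_country = {c: npv(r, cashflows) for c, r in discount_rate.items()}
```

In Analytica the second step requires only editing the definition of the discount-rate variable; the dictionary comprehension here stands in for the automatic propagation the text describes.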

This flexibility is important when exploring tradeoffs between level of detail, computation time, available data, and the overall size or dimensionality of parametric spaces. Such adjustments are common after models have been fully constructed as a way of exploring what-if scenarios and overall relationships between variables.

Uncertainty analysis

Incorporating uncertainty into model outputs helps to provide more realistic and informative projections. Uncertain quantities in Analytica can be specified using a distribution function. When evaluated, distributions are sampled using Latin hypercube, Monte Carlo, or Sobol sampling, and the samples are propagated through the computations to the results. The resulting sample distribution and summary statistics can then be viewed directly: mean, fractile bands, probability density function (PDF), and cumulative distribution function (CDF). Analytica supports collaborative decision analysis and probability management through the use of the SIPmath™ standard. [2] [3]
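The sample-and-propagate idea can be sketched in a few lines of Python (the distributions and sample size here are illustrative assumptions, not taken from Analytica):

```python
import random
import statistics

random.seed(1)
N = 10_000

# Assumed uncertain inputs, for illustration only.
revenue = [random.normalvariate(120.0, 15.0) for _ in range(N)]
expenses = [random.uniform(80.0, 110.0) for _ in range(N)]

# Propagate the samples through the model (Profit = Revenue - Expenses).
profit = sorted(r - e for r, e in zip(revenue, expenses))

# Summary statistics of the resulting distribution.
mean = statistics.mean(profit)
p05 = profit[int(0.05 * N)]   # 5th-percentile fractile
p95 = profit[int(0.95 * N)]   # 95th-percentile fractile
```

Each downstream result is computed once per sample, so the output is itself a distribution rather than a single number; Latin hypercube or Sobol sampling would replace the draws with stratified or quasi-random ones but leave the propagation step unchanged.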

System dynamics modeling

System dynamics is an approach to simulating the behavior of complex systems over time. It deals with the effects of feedback loops and time delays on the behavior of the entire system. The Dynamic() function in Analytica allows the definition of variables with cyclic dependencies, such as feedback loops. It expands the influence diagram notation, which does not normally allow cycles. At least one link in each cycle includes a time lag, depicted as a gray influence arrow to distinguish it from standard black arrows, which carry no time lag.
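The effect of such a time-lagged definition can be sketched as an ordinary recurrence (a hypothetical growth model, written in Python for illustration): the variable's value at time t is defined in terms of its own value at t − 1, and that one-step lag is what makes the cyclic definition well-founded.

```python
def dynamic(initial, step, n_steps):
    """Evaluate a self-referential definition by unrolling it over time."""
    values = [initial]
    for _ in range(n_steps):
        values.append(step(values[-1]))  # value at t depends on value at t-1
    return values

# Example: a stock with a 10% net inflow per period.
population = dynamic(100.0, lambda prev: prev * 1.10, 3)
```

Analytica's Dynamic() performs this unrolling over a Time index automatically; the explicit loop here is only a stand-in for that mechanism.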

As a programming language

Analytica includes a general language of operators and functions for expressing mathematical relationships among variables. Users can define functions and libraries to extend the language.

Several features of Analytica as a programming language are designed to make it easy to use for quantitative modeling.

Applications of Analytica

Analytica has been used for policy analysis, business modeling, and risk analysis. [4] Areas in which Analytica has been applied include energy, [5] [6] [7] [8] [9] [10] health and pharmaceuticals, [11] [12] [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24] [25] [26] environmental risk and emissions policy analysis, [27] [28] [29] [30] [31] [32] [33] [34] [35] wildlife management, [36] [37] [38] [39] ecology, [40] [41] [42] [43] [44] [45] [46] climate change, [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] technology and defense, [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] [70] [71] [72] [73] [74] strategic financial planning, [75] [76] R&D planning and portfolio management, [77] [78] [79] financial services, aerospace, [80] manufacturing, [81] and environmental health impact assessment. [82] [83]

Editions

The Analytica software runs on Microsoft Windows operating systems. Analytica Free Edition is available for an unlimited time and allows users to build models of up to 101 user objects. Analytica Professional, Enterprise, and Optimizer are desktop editions with increasing levels of functionality. The Analytica Cloud Platform lets users share models via a server and run them in a web browser. Analytica 6.1 was released in 2021.

History

Analytica's predecessor, called Demos, [84] grew out of research on tools for policy analysis by Max Henrion as a PhD student and later professor at Carnegie Mellon University between 1979 and 1990. Henrion founded Lumina Decision Systems in 1991 with Brian Arnold. Lumina continued to develop the software and apply it to environmental and public policy analysis applications. Lumina first released Analytica as a product in 1996.

Related Research Articles

Real options valuation, also often termed real options analysis, applies option valuation techniques to capital budgeting decisions. A real option itself, is the right—but not the obligation—to undertake certain business initiatives, such as deferring, abandoning, expanding, staging, or contracting a capital investment project. For example, real options valuation could examine the opportunity to invest in the expansion of a firm's factory and the alternative option to sell the factory.

Risk assessment determines possible mishaps, their likelihood and consequences, and the tolerances for such events. The results of this process may be expressed in a quantitative or qualitative fashion. Risk assessment is an inherent part of a broader risk management strategy to help reduce any potential risk-related consequences.

Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.

Decision analysis (DA) is the discipline comprising the philosophy, methodology, and professional practice necessary to address important decisions in a formal manner. Decision analysis includes many procedures, methods, and tools for identifying, clearly representing, and formally assessing important aspects of a decision; for prescribing a recommended course of action by applying the maximum expected-utility axiom to a well-formed representation of the decision; and for translating the formal representation of a decision and its corresponding recommendation into insight for the decision maker, and other corporate and non-corporate stakeholders.

In mathematical finance, a Monte Carlo option model uses Monte Carlo methods to calculate the value of an option with multiple sources of uncertainty or with complicated features. The first application to option pricing was by Phelim Boyle in 1977. In 1996, M. Broadie and P. Glasserman showed how to price Asian options by Monte Carlo. An important development was the introduction in 1996 by Carriere of Monte Carlo methods for options with early exercise features.

Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., will lead to different results that can only be predicted in a statistical sense.

Ecosystem model

An ecosystem model is an abstract, usually mathematical, representation of an ecological system, which is studied to better understand the real system.

Fuzzy cognitive map

A fuzzy cognitive map (FCM) is a cognitive map within which the relations between the elements of a "mental landscape" can be used to compute the "strength of impact" of these elements. Robert Axelrod introduced cognitive maps as a formal way of representing social scientific knowledge and modeling decision making in social and political systems; Bart Kosko later introduced fuzzy cognitive maps, adding the computational element.

Quantification of Margins and Uncertainty (QMU) is a decision support methodology for complex technical decisions. QMU focuses on the identification, characterization, and analysis of performance thresholds and their associated margins for engineering systems that are evaluated under conditions of uncertainty, particularly when portions of those results are generated using computational modeling and simulation. QMU has traditionally been applied to complex systems where comprehensive experimental test data is not readily available and cannot be easily generated for either end-to-end system execution or for specific subsystems of interest. Examples of systems where QMU has been applied include nuclear weapons performance, qualification, and stockpile assessment. QMU focuses on characterizing in detail the various sources of uncertainty that exist in a model, thus allowing the uncertainty in the system response output variables to be well quantified. These sources are frequently described in terms of probability distributions to account for the stochastic nature of complex engineering systems. The characterization of uncertainty supports comparisons of design margins for key system performance metrics to the uncertainty associated with their calculation by the model. QMU supports risk-informed decision-making processes where computational simulation results provide one of several inputs to the decision-making authority. There is currently no standardized methodology across the simulation community for conducting QMU; the term is applied to a variety of different modeling and simulation techniques that focus on rigorously quantifying model uncertainty in order to support comparison to design margins.

Hydrological model

A hydrologic model is a simplification of a real-world system that aids in understanding, predicting, and managing water resources. Both the flow and quality of water are commonly studied using hydrologic models.

Risk

In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. The international standard definition of risk for common understanding in different applications is "effect of uncertainty on objectives".

Robust decision-making (RDM) is an iterative decision analytics framework that aims to help identify potential robust strategies, characterize the vulnerabilities of such strategies, and evaluate the tradeoffs among them. RDM focuses on informing decisions under conditions of what is called "deep uncertainty", that is, conditions where the parties to a decision do not know or do not agree on the system models relating actions to consequences or the prior probability distributions for the key input parameters to those models.

Probability box

A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.

In decision theory and quantitative policy analysis, the expected value of including uncertainty (EVIU) is the expected difference in the value of a decision based on a probabilistic analysis versus a decision based on an analysis that ignores uncertainty.

Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, and constrain cumulative probability distributions.

P-boxes and probability bounds analysis have been used in many applications spanning many disciplines in engineering and environmental science.

Alternatives assessment or alternatives analysis is a problem-solving approach used in environmental design, technology, and policy. It aims to minimize environmental harm by comparing multiple potential solutions in the context of a specific problem, design goal, or policy objective. It is intended to inform decision-making in situations with many possible courses of action, a wide range of variables to consider, and significant degrees of uncertainty. Alternatives assessment was originally developed as a robust way to guide precautionary action and avoid paralysis by analysis; authors such as O'Brien have presented alternatives assessment as an approach that is complementary to risk assessment, the dominant decision-making approach in environmental policy. Likewise, Ashford has described the similar concept of technology options analysis as a way to generate innovative solutions to the problems of industrial pollution more effectively than through risk-based regulation.

The discipline of probability management communicates and calculates uncertainties as data structures that obey both the laws of arithmetic and probability, while preserving statistical coherence. The simplest approach is to use vector arrays of simulated or historical realizations and metadata called Stochastic Information Packets (SIPs). A set of SIPs that preserves statistical relationships between variables is said to be coherent and is referred to as a Stochastic Library Unit with Relationships Preserved (SLURP). SIPs and SLURPs allow stochastic simulations to communicate with one another. For example, see Analytica, Oracle Crystal Ball, Frontline Solvers, and Autobox.

Techno-economic assessment or techno-economic analysis is a method of analyzing the economic performance of an industrial process, product, or service. It typically uses software modeling to estimate capital cost, operating cost, and revenue based on technical and financial input parameters. One desired outcome is to summarize results in a concise and visually coherent form, using visualization tools such as tornado diagrams and sensitivity analysis graphs.

M. Granger Morgan

M. Granger Morgan is an American scientist, academic, and engineer who is the Hamerschlag University Professor of Engineering at Carnegie Mellon University. Over his career, Morgan has led the development of the area of engineering and public policy.

References

  1. Granger Morgan and Max Henrion (1998), Analytica: A Software Tool for Uncertainty Analysis and Model Communication Archived June 30, 2007, at the Wayback Machine, Chapter 10 of Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, second edition, Cambridge University Press, New York.
  2. The SIPmath Standard Archived 2017-01-21 at the Wayback Machine
  3. Paul D. Kaplan and Sam Savage (2011), Monte Carlo, A Lightbulb for Illuminating Uncertainty Archived 2017-03-07 at the Wayback Machine , in Investments & Wealth Monitor
  4. Jun Long, Baruch Fischhoff (2000), Setting Risk Priorities: A Formal Model, Risk Analysis 20(3):339–352.
  5. Stadler M., Marnay C., Azevedo I.L., Komiyama R., Lai J. (2009), The Open Source Stochastic Building Simulation Tool SLBM and Its Capabilities to Capture Uncertainty of Policymaking in the U.S. Building Sector Archived September 27, 2011, at the Wayback Machine
  6. Ye Li and H. Keith Florig (Sept. 2006), Modeling the Operation and Maintenance Costs of a Large Scale Tidal Current Turbine Farm, Oceans (2006):1-6
  7. L.F. Miller, Brian Thomas, J. McConn, J. Hou, J. Preston, T. Anderson, and M. Humberstone (2007), Uncertainty Analysis Methods for Equilibrium Fuel Cycles, ANS Summer Abstract.
  8. Gregory A. Norris and Peter Yost (Fall 2001), Journal of Industrial Ecology 5(4):15–28, MIT Press Journals.
  9. Jouni T Tuomisto and Marko Tainio (2005), An economic way of reducing health, environmental, and other pressures of urban traffic: a decision analysis on trip aggregation, BMC Public Health 5:123. doi : 10.1186/1471-2458-5-123
  10. Yurika Nishioka, Jonathan I. Levy, Gregory A. Norris, Andrew Wilson, Patrick Hofstetter, John D. Spengler (Oct 2002), Integrating Risk Assessment and Life Cycle Assessment: A Case Study of Insulation, Risk Analysis 22(5):1003–1017.
  11. Igor Linkov, Richard Wilson and George M., Gray (1998), Anticarcinogenic Responses in Rodent Cancer Bioassays Are Not Explained by Random Effects, Toxicological Sciences 43(1), Oxford University Press.
  12. M. Loane and R. Wootton (Oct 2001), A simulation model for analysing patient activity in dermatology, Journal of Telemedicine and Telecare 7(1):23–25(3), Royal Society of Medicine Press.
  13. Davis Bu, Eric Pan, Janice Walker, Julia Adler-Milstein, David Kendrick, Julie M. Hook, Caitlin M. Cusack, David W. Bates, and Blackford Middleton (2007), Benefits of Information Technology–Enabled Diabetes Management, Diabetes Care 30:1137–1142, American Diabetes Association.
  14. Julia Adler-Milstein, Davis Bu, Eric Pan, Janice Walker, David Kendrick, Julie M. Hook, David W. Bates, Blackford Middleton. The Cost of Information Technology-Enabled Diabetes Management, Disease Management. June 1, 2007, 10(3): 115–128. doi : 10.1089/dis.2007.103640.
  15. E. Ekaette, R.C. Lee, K-L Kelly, P. Dunscombe (Aug 2006), A Monte Carlo simulation approach to the characterization of uncertainties in cancer staging and radiation treatment decisions, Journal of the Operational Research Society 58:177–185.
  16. Lyon, Joseph L.; Alder, Stephen C.; Stone, Mary Bishop; Scholl, Alan; Reading, James C.; Holubkov, Richard; Sheng, Xiaoming; White, George L. Jr; Hegmann, Kurt T.; Anspaugh, Lynn; Hoffman, F Owen; Simon, Steven L.; Thomas, Brian; Carroll, Raymond; Meikle, A Wayne (Nov 2006),Thyroid Disease Associated With Exposure to the Nevada Nuclear Weapons Test Site Radiation: A Reevaluation Based on Corrected Dosimetry and Examination Data, Epidemiology 17(6):604–614.
  17. Negar Elmieh, Hadi Dowlatabadi, Liz Casman (Jan 2006), A model for Probabilistic Assessment of Malathion Spray Exposures (PAMSE) in British Columbia Archived September 29, 2011, at the Wayback Machine , CMU EEP.
  18. von Winterfeldt, Detlof; Eppel, Thomas; Adams, John; Neutra, Raymond; Delpizzo, Vincent (2004). "Managing Potential Health Risks from Electric Powerlines: A Decision Analysis Caught in Controversy". Risk Analysis. 24 (6): 1487–1502. Bibcode:2004RiskA..24.1487V. doi:10.1111/j.0272-4332.2004.00544.x. PMID   15660606. S2CID   34685466.
  19. Montville, Rebecca; Chen, Yuhuan; Schaffner, Donald W. (2002). "Risk assessment of hand washing efficacy using literature and experimental data". International Journal of Food Microbiology. 73 (2–3): 305–313. doi:10.1016/S0168-1605(01)00666-3. PMID   11934038.
  20. DC Kendrick, D Bu, E Pan, B Middleton (2007), Crossing the Evidence Chasm: Building Evidence Bridges from Process Changes to Clinical Outcomes, Journal of the American Medical Informatics Association, Elsevier.
  21. Cox, Louis Anthony (Tony) (2005). "Potential human health benefits of antibiotics used in food animals: A case study of virginiamycin". Environment International. 31 (4): 549–563. Bibcode:2005EnInt..31..549C. doi:10.1016/j.envint.2004.10.012. PMID   15871160.
  22. Jan Walker, Eric Pan, Douglas Johnston, Julia Adler-Milstein, David W. Bates, and Blackford Middleton (19 Jan 2005), The Value Of Health Care Information Exchange And Interoperability, Health Affairs.
  23. Doug Johnston, Eric Pan, Blackford Middleton, Finding the Value in Healthcare Information Technologies Archived July 6, 2008, at the Wayback Machine , Center for Information Technology Leadership (C!TL) whitepaper.
  24. Chrisman, L., Langley, P., Bay, S., and Pohorille, A. (Jan 2003), "Incorporating biological knowledge into evaluation of causal regulatory hypotheses", Pacific Symposium on Biocomputing (PSB).
  25. Jan Walker, Eric Pan, Douglas Johnson, Julia Adler-Milstein, David W. Bates and Blackford Middleton (2005), The Value of Health Care Information Exchange and Interoperability, Health Affairs.
  26. Steve Lohr, Road Map to a Digital System of Health Records, New York Times, January 29, 2005
  27. C. Bloyd, J. Camp, G. Conzelmann, J. Formento, J. Molburg, J. Shannon, M. Henrion, R. Sonnenblick, K. Soo Hoo, J. Kalagnanam, S. Siegel, R. Sinha, M. Small, T. Sullivan, R. Marnicio, P. Ryan, R. Turner, D. Austin, D. Burtraw, D. Farrell, T. Green, A. Krupnick, and E. Mansur (Dec 1996), Tracking and Analysis Framework (TAF) Model Documentation and User’s Guide: An Interaction Model for Integrated Assessment of Title IV of the Clean Air Act Amendments Archived January 5, 2009, at the Wayback Machine , Decision and Information Sciences Division, Argonne National Laboratory.
  28. Max Henrion, Richard Sonnenblick, Cary Bloyd (Jan 1997), Innovations in Integrated Assessment: The Tracking and Analysis Framework (TAF) Archived January 5, 2009, at the Wayback Machine , Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, AZ
  29. Richard Sonnenblick and Max Henrion (Jan 1997), Uncertainty in the Tracking and Analysis Framework Integrated Assessment: The Value of Knowing How Little You Know Archived January 5, 2009, at the Wayback Machine , Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, Arizona.
  30. Sinha, R.; Small, M. J.; Ryan, P. F.; Sullivan, T. J.; Cosby, B. J. (1998). "Reduced-Form Modelling of Surface Water and Soil Chemistry for the Tracking and Analysis Framework". Water, Air, and Soil Pollution. 105 (3/4): 617–642. Bibcode:1998WASP..105..617S. doi:10.1023/A:1004993425759. S2CID   92758035.
  31. Dallas Burtraw and Erin Mansur (Mar 1999), The Effects of Trading and Banking in the SO2 Allowance Market Archived 2007-07-15 at the Wayback Machine , Discussion paper 99–25, Resources for the Future.
  32. Galen McKinley, Miriam Zuk, Morten Höjer, Montserrat Avalos, Isabel González, Rodolfo Iniestra, Israel Laguna, Miguel A. Martínez, Patricia Osnaya, Luz M. Reynales, Raydel Valdés, and Julia Martínez (2005), Quantification of Local and Global Benefits from Air Pollution Control in Mexico City Archived September 29, 2011, at the Wayback Machine, Environ. Sci. Technol. 39:1954–1961.
  33. Luis A. Cifuentes, Enzo Sauma, Hector Jorquera and Felipe Soto (2000), Preliminary Estimation of the Potential Ancillary Benefits for Chile Archived 2012-04-23 at the Wayback Machine, Ancillary Benefits and Costs of Greenhouse Gas Mitigation.
  34. Marko Tainio, Jouni T Tuomisto, Otto Hänninen, Juhani Ruuskanen, Matti J Jantunen, and Juha Pekkanen (2007), Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study, Environ Health 6(24).
  35. Basson, L.; Petrie, J.G. (2007). "An integrated approach for the consideration of uncertainty in decision making supported by Life Cycle Assessment". Environmental Modelling & Software. 22 (2): 167–176. Bibcode:2007EnvMS..22..167B. doi:10.1016/j.envsoft.2005.07.026.
  36. Matthew F. Bingham, Zhimin Li, Kristy E. Mathews, Colleen M. Spagnardi, Jennifer S. Whaley, Sara G. Veale and Jason C. Kinnell (2011), An Application of Behavioral Modeling to Characterize Urban Angling Decisions and Values, North American Journal of Fisheries Management 31:257–268.
  37. Woodbury, Peter B.; Smith, James E.; Weinstein, David A.; Laurence, John A. (1998). "Assessing potential climate change effects on loblolly pine growth: A probabilistic regional modeling approach". Forest Ecology and Management. 107 (1–3): 99–116. Bibcode:1998ForEM.107...99W. doi: 10.1016/S0378-1127(97)00323-X .
  38. P.R. Richard, M. Power, M. Hammill, and W. Doidge (2003). Eastern Hudson Bay Beluga Precautionary Approach Case Study: Risk analysis models for co-management Archived April 3, 2012, at the Wayback Machine, Canadian Science Advisory Secretariat Research Document.
  39. P.R. Richard (2003), Incorporating Uncertainty in Population Assessments Archived April 3, 2012, at the Wayback Machine , Canadian Science Advisory Secretariat Research Document.
  40. O'Ryan R., Diaz M. (2008), The Use of Probabilistic Analysis to Improve Decision-Making in Environmental Regulation in a Developing Context: The Case of Arsenic Regulation in Chile, Human and Ecological Risk Assessment, Vol 14, Issue 3, pg: 623–640.
  41. Andrew Gronewold and Mark Borsuk, "A probabilistic modeling tool for assessing water quality standard compliance", submitted to EMS Oct 2008.
  42. Borsuk, Mark E.; Reichert, Peter; Peter, Armin; Schager, Eva; Burkhardt-Holm, Patricia (2006). "Assessing the decline of brown trout (Salmo trutta) in Swiss rivers using a Bayesian probability network". Ecological Modelling. 192 (1–2): 224–244. Bibcode:2006EcMod.192..224B. doi:10.1016/j.ecolmodel.2005.07.006.
  43. Borsuk, Mark E.; Stow, Craig A.; Reckhow, Kenneth H. (2004). "A Bayesian network of eutrophication models for synthesis, prediction, and uncertainty analysis". Ecological Modelling. 173 (2–3): 219–239. Bibcode:2004EcMod.173..219B. doi:10.1016/j.ecolmodel.2003.08.020.
  44. Mark E. Borsuk, Sean P. Powers, and Charles H. Peterson (2002), A survival model of the effects of bottom-water hypoxia on the population density of an estuarine clam (Macoma balthica) [ dead link ], Canadian Journal of Fisheries and Aquatic Sciences (59):1266–1274.
  45. Rebecca Montville and Donald Schaffner (Feb 2005), Monte Carlo Simulation of Pathogen Behavior during the Sprout Production Process Archived October 2, 2011, at the Wayback Machine , Applied and Environmental Microbiology 71(2):746–753.
  46. Rasmussen, S.K.J; Ross, T.; Olley, J.; McMeekin, T. (2002). "A process risk model for the shelf life of Atlantic salmon fillets". International Journal of Food Microbiology. 73 (1): 47–60. doi:10.1016/S0168-1605(01)00687-0. PMID   11885573.
  47. Groves, David G.; Lempert, Robert J. (2007). "A new analytic method for finding policy-relevant scenarios". Global Environmental Change. 17 (1): 73–85. Bibcode:2007GEC....17...73G. doi:10.1016/j.gloenvcha.2006.11.006. S2CID   510560.
  48. Senbel, Maged; McDaniels, Timothy; Dowlatabadi, Hadi (2003). "The ecological footprint: A non-monetary metric of human consumption applied to North America". Global Environmental Change. 13 (2): 83–100. Bibcode:2003GEC....13...83S. doi:10.1016/S0959-3780(03)00009-8.
  49. Dowlatabadi, H. (1998). Sensitivity of Climate Change Mitigation Estimates to Assumptions About Technical Change. Energy Economics 20: 473–93.
  50. West, J. J. and H. Dowlatabadi (1998). On assessing the economic impacts of sea level rise on developed coasts. Climate, change and risk. London, Routledge. 205–20.
  51. Leiss, W., H. Dowlatabadi, and Greg Paoli (2001). Who's Afraid of Climate Change? A guide for the perplexed. Isuma 2(4): 95–103.
  52. Morgan, M. G., M. Kandlikar, J. Risbey and H. Dowlatabadi (1999). Why conventional tools for policy analysis are often inadequate for problems of global change. Climatic Change 41: 271–81.
  53. Casman, E. A., M. G. Morgan and H. Dowlatabadi (1999). Mixed Levels of Uncertainty in Complex Policy Models. Risk Analysis 19(1): 33–42.
  54. Dowlatabadi, H. (2003). Scale and Scope In Integrated Assessment: lessons from ten years with ICAM. Scaling in Integrated Assessment. J. Rotmans and D. S. Rothman. Lisse, Swets & Zeitlinger: 55–72.
  55. Dowlatabadi, H. (2000). Bumping against a gas ceiling. Climatic Change 46(3): 391–407.
  56. Morgan, M. G. and H. Dowlatabadi (1996). Learning From Integrated Assessment of Climate Change. Climatic Change 34: 337–368.
  57. Henry Neimeier (1996), A New Paradigm For Modeling The Precision Strike Process, in Proceedings of MILCOM '96.
  58. Russell F. Richards, Henry A. Neimeier, W. L. Hamm, and D. L. Alexander, "Analytical Modeling in Support of C4ISR Mission Assessment (CMA)," Third International Symposium on Command and Control Research and Technology, National Defense University, Fort McNair, Washington, DC, June 17–20, 1997, pp. 626–639.
  59. Henry Neimeier and C. McGowan (1996), "Analyzing Processes with HANQ", Proceedings of the International Council on Systems Engineering '96.
  60. Kenneth P. Kuskey and Susan K. Parker (2000), "The Architecture of CAPE Models", MITRE technical paper.
  61. Henry Neimeier (1994), "Analytic Queuing Network", Proceedings of the 12th International Conference of the System Dynamics Society, Stirling, Scotland.
  62. Henry Neimeier (1996), "Analytic Uncertainty Modeling Versus Discrete Event Simulation", PHALANX.
  63. Rahul Tongia, "Can broadband over powerline carrier (PLC) compete?". The author uses Analytica to model the economic viability of the introduction of a PLC service.
  64. Promises and False Promises of PowerLine Carrier (PLC) Broadband Communications – A Techno-Economic Analysis (PDF). Archived from the original on 2007-02-11. Retrieved 2011-07-08.
  65. Kanchana Wanichkorn and Marvin Sirbu (1998), The Economics of Premises Internet Telephony, CMU-EPP.
  66. E.L. Kyser, E.R. Hnatek, M.H. Roettgering (2001), The politics of accelerated stress testing, Sound and Vibration 35(3):24–29.
  67. Kevin J. Soo Hoo (June 2000), How Much Is Enough? A Risk-Management Approach to Computer Security, archived September 21, 2011, at the Wayback Machine, Working Paper, Consortium for Research on Information Security and Policy (CRISP), Stanford University.
  68. M. Steinbach and S. Giles of MITRE (2005), A Model for Joint Infrastructure Investment, archived 2011-09-24 at the Wayback Machine, AIAA-2005-7309, in AIAA 5th ATIO and 16th Lighter-than-Air Systems Technology and Balloon Systems Conferences, Arlington VA, Sep 26–28, 2005.
  69. Bloomfield, R., Guerra, S. (2002), Process modelling to support dependability arguments, Proceedings of the International Conference on Dependable Systems and Networks (DSN 2002), pp. 113–122.
  70. Christopher L Weber and Sanath K Kalidas (Fall 2004), Cost-Benefit Analysis of LEED Silver Certification for New House Residence Hall at Carnegie Mellon University, Civil Systems Investment Planning and Pricing Project, Dept. of Civil & Environmental Engineering, Carnegie Mellon University.
  71. J. McMahon, X. Liu, I. Turiel (Jun 2000), Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period, Technical Report LBNL–44450, Lawrence Berkeley Labs, Berkeley CA.
  72. Paul K. Davis (2000), Dealing with complexity: exploratory analysis enabled by multiresolution, multiperspective modeling, Proceedings of the 32nd Conference on Winter Simulation, pp. 293–302.
  73. Paul K. Davis (2000), Exploratory Analysis Enabled by Multiresolution, Multiperspective Modeling, Proceedings of the 2000 Winter Simulation Conference J. A. Joines, R. R. Barton, K. Kang, and P. A. Fishwick, eds.
  74. NASA (1994), Schedule and Cost Risk Analysis Modeling (SCRAM) System, NASA SBIR Successes.
  75. "Cubeplan case studies". Cubeplan.com. Retrieved 2011-07-12.
  76. "Novix consulting services". Novix.com. Retrieved 2011-07-12.
  77. Enrich Consulting, publications on Portfolio Management, archived July 13, 2011, at the Wayback Machine.
  78. "Bicore, Inc". Bicore.nl. Retrieved 2011-07-12.
  79. "R&D evaluation tools at W.L. Gore". Lumina. Archived from the original on October 17, 2013.
  80. Speeding turnaround of the Space Shuttle, archived March 15, 2012, at the Wayback Machine, Lumina case studies.
  81. Auto maker saves $250M on warranty costs, archived 2010-12-12 at the Wayback Machine, Lumina case studies.
  82. Grellier, J., Ravazzani, P., Cardis, E. (2014). "Potential health impacts of residential exposures to extremely low frequency magnetic fields in Europe". Environment International. 62. Pergamon: 55–63. Bibcode:2014EnInt..62...55G. doi:10.1016/j.envint.2013.09.017. hdl:10044/1/41782. PMID 24161447.
  83. Grellier, J., White, M. P., De Bell, S., Brousse, O., Elliott, L. R., Fleming, L. E., Heaviside, C., Simpson, C., Taylor, T., Wheeler, B. W., Lovell, R. (May 2024). "Valuing the health benefits of nature-based recreational physical activity in England". Environment International. 187: 108667. Bibcode:2024EnInt.18708667G. doi:10.1016/j.envint.2024.108667. ISSN 0160-4120. PMID 38642505.
  84. Neil Wishbow and Max Henrion, "Demos User's Manual", Department of Engineering and Public Policy, Carnegie Mellon University, 1987.