Sensitivity auditing

Sensitivity auditing is an extension of sensitivity analysis for use in policy-relevant modelling studies. [1] Its use is recommended, among others in the European Commission Impact Assessment Guidelines [2] and by the European science academies, [3] when a sensitivity analysis (SA) of a model-based study is meant to demonstrate the robustness of the evidence provided by the model in a context where the inference feeds into a policy or decision-making process.

Approach

In settings where scientific work feeds into policy, the framing of the analysis, its institutional context, and the motivations of its author may become highly relevant, and a pure SA, with its focus on quantified uncertainty, may be insufficient. The emphasis on framing may, among other things, derive from the relevance of the policy study to different constituencies that are characterized by different norms and values, and hence by a different story about 'what the problem is' and foremost about 'who is telling the story'. Most often the framing includes implicit assumptions, which may range from the political (e.g. which group needs to be protected) to the technical (e.g. which variable can be treated as a constant).

In order to take these concerns into due consideration, sensitivity auditing extends the instruments of sensitivity analysis to provide an assessment of the entire knowledge- and model-generating process. It takes inspiration from NUSAP, [4] a method used to communicate the quality of quantitative information through the generation of 'pedigrees' of numbers. Likewise, sensitivity auditing has been developed to provide pedigrees of models and model-based inferences. Sensitivity auditing is especially suitable in an adversarial context, where not only the nature of the evidence, but also the degree of certainty and uncertainty associated with the evidence, is the subject of partisan interests. These are the settings considered in post-normal science [5] or in Mode 2 [6] science. Post-normal science (PNS) is a concept developed by Silvio Funtowicz and Jerome Ravetz, [5] [7] [8] which proposes a methodology of inquiry that is appropriate when "facts are uncertain, values in dispute, stakes high and decisions urgent" (Funtowicz and Ravetz, 1992: [8] 251–273). Mode 2 science, coined in 1994 by Gibbons et al., refers to a mode of production of scientific knowledge that is context-driven, problem-focused and interdisciplinary.

Sensitivity auditing consists of a seven-point checklist:

1. Use Math Wisely: Ask if complex math is being used when simpler math could do the job. Check if the model is being stretched beyond its intended use.

2. Look for Assumptions: Find out what assumptions were made in the study, and see if they were clearly stated or hidden.

3. Avoid Garbage In, Garbage Out: Check if the data used in the model were manipulated to make the results look more certain than they really are, or if they were made overly uncertain to avoid regulation.

4. Prepare for Criticism: It's better to find problems in your study before others do. Do robust checks for uncertainty and sensitivity before publishing.

5. Be Transparent: Don't keep your model a secret. Make it clear and understandable to the public.

6. Focus on the Right Problem: Ensure your model is addressing the correct issue and not just solving a problem that isn't really there.

7. Do Thorough Analyses: Conduct in-depth tests to measure uncertainty and sensitivity using the best methods available.
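Point 7, the in-depth uncertainty and sensitivity tests, can be sketched in miniature. The toy model, input ranges and coefficients below are hypothetical, chosen only to show the mechanics of a Monte Carlo uncertainty run followed by a crude, correlation-based sensitivity measure:

```python
import random
import statistics

def model(x1, x2):
    # Hypothetical stand-in for a policy-relevant model (illustrative only).
    return 3.0 * x1 + 0.5 * x2

def pearson(xs, ys):
    # Sample Pearson correlation coefficient.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
n = 10_000
x1s = [random.uniform(0.0, 1.0) for _ in range(n)]  # uncertain input 1
x2s = [random.uniform(0.0, 1.0) for _ in range(n)]  # uncertain input 2
ys = [model(a, b) for a, b in zip(x1s, x2s)]

# Uncertainty analysis: how spread out is the output under input uncertainty?
print(f"output mean = {statistics.fmean(ys):.2f}, stdev = {statistics.stdev(ys):.2f}")

# Sensitivity analysis (crude, valid for near-linear models): the squared
# correlation approximates the share of output variance driven by each input.
r2 = {}
for name, xs in (("x1", x1s), ("x2", x2s)):
    r2[name] = pearson(xs, ys) ** 2
    print(f"{name}: share of output variance ~ {r2[name]:.2f}")
```

For the hypothetical weights above, x1 dominates the output variance. A real audit would complement such a run with variance-based methods (e.g. Sobol' indices) and, crucially, with the qualitative checks in points 1–6, which no numerical test can replace.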

Questions addressed by sensitivity auditing

These rules are meant to help an analyst anticipate criticism, in particular of model-based inference feeding into an impact assessment, and the questions and objections the modeller is likely to receive.

Sensitivity auditing in the European Commission Guidelines

Sensitivity auditing is described in the European Commission Guidelines for impact assessment. [2] Relevant excerpts (p. 392):

"[… ]where there is a major disagreement among stakeholders about the nature of the problem, … then sensitivity auditing is more suitable but sensitivity analysis is still advisable as one of the steps of sensitivity auditing."
"Sensitivity auditing, […] is a wider consideration of the effect of all types of uncertainty, including structural assumptions embedded in the model, and subjective decisions taken in the framing of the problem."
"The ultimate aim is to communicate openly and honestly the extent to which particular models can be used to support policy decisions and what their limitations are."
"In general sensitivity auditing stresses the idea of honestly communicating the extent to which model results can be trusted, taking into account as much as possible all forms of potential uncertainty, and to anticipate criticism by third parties."

SAPEA report

SAPEA, the consortium of European academies providing science advice for policy, describes sensitivity auditing in detail in its 2019 report "Making sense of science for policy under conditions of complexity and uncertainty". [3]

Related Research Articles

Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.

Public awareness of science (PAwS) is everything relating to the awareness, attitudes, behaviors, opinions, and activities that comprise the relations between the general public or lay society as a whole and scientific knowledge and organization. This concept is also known as public understanding of science (PUS), or more recently, public engagement with science and technology (PEST). It is a comparatively new approach to the task of exploring the multitude of relations and linkages that science, technology, and innovation have with the general public. While early work in the discipline focused on increasing or augmenting the public's knowledge of scientific topics, in line with the information deficit model of science communication, the deficit model has largely been abandoned by science communication researchers. Instead, there is an increasing emphasis on understanding how the public chooses to use scientific knowledge and on the development of interfaces to mediate between expert and lay understandings of an issue. Newer frameworks of communicating science include the dialogue and the participation models. The dialogue model aims to create spaces for conversations between scientists and non-scientists, while the participation model aims to include non-scientists in the process of science.


Post-normal science (PNS) was developed in the 1990s by Silvio Funtowicz and Jerome R. Ravetz. It is a problem-solving strategy appropriate when "facts [are] uncertain, values in dispute, stakes high and decisions urgent", conditions often present in policy-relevant research. In those situations, PNS recommends suspending temporarily the traditional scientific ideal of truth, concentrating on quality as assessed by internal and extended peer communities.

Info-gap decision theory seeks to optimize robustness to failure under severe uncertainty, in particular applying sensitivity analysis of the stability radius type to perturbations in the value of a given estimate of the parameter of interest. It has some connections with Wald's maximin model; some authors distinguish them, others consider them instances of the same principle.

The rational planning model is a model of the planning process involving a number of rational actions or steps; Taylor (1998) outlines five such steps.

In science, engineering, and research, expert elicitation is the synthesis of opinions of authorities of a subject where there is uncertainty due to insufficient data or when such data is unattainable because of physical constraints or lack of resources. Expert elicitation is essentially a scientific consensus methodology. It is often used in the study of rare events. Expert elicitation allows for parametrization, an "educated guess", for the respective topic under study. Expert elicitation generally quantifies uncertainty.


Jerome (Jerry) Ravetz is a philosopher of science. He is best known for his books analysing scientific knowledge from a social and ethical perspective, focussing on issues of quality. He is the co-author of the NUSAP notational system and of Post-normal science. He is currently an Associate Fellow at the Institute for Science, Innovation and Society, University of Oxford.

Robust decision-making (RDM) is an iterative decision analytics framework that aims to help identify potential robust strategies, characterize the vulnerabilities of such strategies, and evaluate the tradeoffs among them. RDM focuses on informing decisions under conditions of what is called "deep uncertainty", that is, conditions where the parties to a decision do not know or do not agree on the system models relating actions to consequences or the prior probability distributions for the key input parameters to those models.

In statistics, robust Bayesian analysis, also called Bayesian sensitivity analysis, is a type of sensitivity analysis applied to the outcome from Bayesian inference or Bayesian optimal decisions.

Analytica is a visual software developed by Lumina Decision Systems for creating, analyzing and communicating quantitative decision models. It combines hierarchical influence diagrams for visual creation and view of models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency, array abstraction, and automatic dependency maintenance for efficient sequencing of computation.

NUSAP is a notational system for the management and communication of uncertainty in science for policy, based on five categories for characterizing any quantitative statement: Numeral, Unit, Spread, Assessment and Pedigree. NUSAP was introduced by Silvio Funtowicz and Jerome Ravetz in the 1990 book Uncertainty and Quality in Science for Policy. See also van der Sluijs et al. 2005.


Uncertainty and Quality in Science for Policy is a 1990 book by Silvio Funtowicz and Jerome Ravetz, in which the authors explain the notational system NUSAP and apply it to several examples from the environmental sciences. The work is considered foundational to the development of post-normal science.


Silvio O. Funtowicz is a philosopher of science active in the field of science and technology studies. He created the NUSAP, a notational system for characterising uncertainty and quality in quantitative expressions, and together with Jerome R. Ravetz he introduced the concept of post-normal science. He is currently a guest researcher at the Centre for the Study of the Sciences and the Humanities (SVT), University of Bergen (Norway).


Science on the Verge is a 2016 book by a group of eight scholars working in the tradition of post-normal science. The book analyzes the main features and possible causes of the present crisis of science.


The No Nonsense Guide to Science is a 2006 book on post-normal science (PNS). It was written by the American-born British historian and philosopher of science Jerome Ravetz.

Quantitative storytelling (QST) is a systematic approach to exploring the many frames potentially legitimate in a scientific study or controversy. QST assumes that, in an interconnected society, multiple frameworks and worldviews are legitimately upheld by different entities and social actors. QST looks critically at models used in evidence-based policy. Such models are often reductionist in that tractability is achieved at the expense of suppressing available evidence. QST suggests corrective approaches to this practice.

Sensitivity analysis studies the relation between the uncertainty in a model-based inference and the uncertainties in the model assumptions. Sensitivity analysis can play an important role in epidemiology, for example in assessing the influence of unmeasured confounding on the causal conclusions of a study. It is also important in mathematical modelling studies of epidemics.

Sensitivity analysis studies the relationship between the output of a model and its input variables or assumptions. Historically, the need for sensitivity analysis in modelling, and many of its applications, originated in environmental science and ecology.


Andrea Saltelli is an Italian scholar specializing in quantification using statistical and sociological tools. He has extended the theory of sensitivity analysis to sensitivity auditing, focusing on physical chemistry, environmental statistics, impact assessment and science for policy. He is currently Counsellor at the UPF Barcelona School of Management.

The concept of extended peer community belongs to the sociology of science, and in particular to the use of science in the solution of social, political or ecological problems. It was first introduced in the 1990s by Silvio Funtowicz and Jerome R. Ravetz in the context of what would become post-normal science. An extended peer community is intended by these authors as a space where both credentialed experts from different disciplines and lay stakeholders can discuss and deliberate.

References

  1. Saltelli, Andrea, Ângela Guimarães Pereira, Jeroen P. van der Sluijs, and Silvio Funtowicz. 2013. "What Do I Make of Your Latinorum? Sensitivity Auditing of Mathematical Modelling". International Journal of Foresight and Innovation Policy 9 (2/3/4): 213–234. https://doi.org/10.1504/IJFIP.2013.058610.
  2. European Commission. 2021. "Better Regulation Toolbox". November 25.
  3. Science Advice for Policy by European Academies (SAPEA). 2019. Making Sense of Science for Policy under Conditions of Complexity and Uncertainty. Berlin.
  4. Van der Sluijs, J. P., M. Craye, S. Funtowicz, P. Kloprogge, J. Ravetz, and J. Risbey. 2005. "Combining Quantitative and Qualitative Measures of Uncertainty in Model Based Environmental Assessment: The NUSAP System". Risk Analysis 25 (2): 481–492.
  5. Funtowicz, S. O., and J. R. Ravetz. 1993. "Science for the Post-Normal Age". Futures 25 (7): 739–755.
  6. Gibbons, Michael, Camille Limoges, Helga Nowotny, Simon Schwartzman, Peter Scott, and Martin Trow. 1994. The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies. London: Sage. ISBN 0-8039-7794-8.
  7. Funtowicz, S. O., and Jerome R. Ravetz. 1991. "A New Scientific Methodology for Global Environmental Issues". In Ecological Economics: The Science and Management of Sustainability, edited by Robert Costanza, 137–152. New York: Columbia University Press.
  8. Funtowicz, S. O., and J. R. Ravetz. 1992. "Three Types of Risk Assessment and the Emergence of Post-Normal Science". In Social Theories of Risk, edited by S. Krimsky and D. Golding, 251–273. Westport, CT: Greenwood.