Evidential reasoning approach


In decision theory, the evidential reasoning approach (ER) is a generic evidence-based multi-criteria decision analysis (MCDA) approach for dealing with problems having both quantitative and qualitative criteria under various uncertainties including ignorance and randomness. It has been used to support various decision analysis, assessment and evaluation activities such as environmental impact assessment[1] and organizational self-assessment[2] based on a range of quality models.


Overview

The evidential reasoning approach has been developed on the basis of decision theory, in particular utility theory,[3] artificial intelligence, in particular the theory of evidence,[4] statistical analysis and computer technology. It uses a belief structure to model an assessment with uncertainty, a belief decision matrix to represent an MCDA problem under uncertainty, evidential reasoning algorithms[5] to aggregate criteria and generate distributed assessments, and the concepts of belief and plausibility functions to generate a utility interval that measures the degree of ignorance. A conventional decision matrix used for modelling an MCDA problem is a special case of a belief decision matrix.[6][7]
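
As a concrete illustration, the following is a minimal sketch of the recursive ER aggregation algorithm described by Yang and Xu,[5] assuming a single level of assessment criteria. The grade set, criterion weights and belief degrees are invented for illustration only; they are not data from the literature.

```python
# Minimal sketch of the recursive evidential reasoning (ER) aggregation
# algorithm (Yang and Xu, 2002). The grade set, criterion weights and
# belief degrees below are illustrative assumptions.

def er_aggregate(weights, beliefs):
    """Combine belief distributions given on several criteria into one
    distributed assessment.

    weights -- criterion weights summing to 1
    beliefs -- one list of belief degrees per criterion, each of length N
               (a list may sum to less than 1; the remainder is ignorance)
    Returns (combined_beliefs, unassigned_belief).
    """
    n_grades = len(beliefs[0])

    # Basic probability masses for the first criterion.
    m = [weights[0] * b for b in beliefs[0]]
    m_bar = 1.0 - weights[0]                        # held back by the weight
    m_tilde = weights[0] * (1.0 - sum(beliefs[0]))  # caused by incompleteness

    # Recursively fold in the remaining criteria.
    for w, bel in zip(weights[1:], beliefs[1:]):
        mi = [w * b for b in bel]
        mi_bar = 1.0 - w
        mi_tilde = w * (1.0 - sum(bel))
        mi_h = mi_bar + mi_tilde
        m_h = m_bar + m_tilde

        # Normalising factor discards the conflict between different grades.
        conflict = sum(m[t] * mi[j]
                       for t in range(n_grades)
                       for j in range(n_grades) if j != t)
        k = 1.0 / (1.0 - conflict)

        m = [k * (m[n] * mi[n] + m[n] * mi_h + m_h * mi[n])
             for n in range(n_grades)]
        m_tilde = k * (m_tilde * mi_tilde + m_tilde * mi_bar + m_bar * mi_tilde)
        m_bar = k * (m_bar * mi_bar)

    # Final normalisation removes the mass held back for the weights.
    combined = [mn / (1.0 - m_bar) for mn in m]
    unassigned = m_tilde / (1.0 - m_bar)
    return combined, unassigned


if __name__ == "__main__":
    grades = ["poor", "average", "good", "excellent"]
    weights = [0.4, 0.3, 0.3]
    beliefs = [
        [0.0, 0.1, 0.6, 0.3],   # complete assessment on criterion 1
        [0.0, 0.0, 0.5, 0.3],   # 0.2 of the belief is unassigned (ignorance)
        [0.1, 0.2, 0.7, 0.0],   # complete assessment on criterion 3
    ]
    combined, ignorance = er_aggregate(weights, beliefs)
    for g, b in zip(grades, combined):
        print(f"{g}: {b:.3f}")
    print(f"unassigned: {ignorance:.3f}")
```

The belief that remains unassigned after aggregation reflects the degree of ignorance; assigning it entirely to the least preferred or the most preferred grade yields the lower and upper bounds of the utility interval mentioned above.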

The use of belief decision matrices for MCDA problem modelling in the ER approach results in the following features:

  1. An assessment of an option can be more reliably and realistically represented by a belief decision matrix than by a conventional decision matrix.
  2. It accepts data of different formats with various types of uncertainties as inputs, such as single numerical values, probability distributions, and subjective judgments with belief degrees, as illustrated in the sketch after this list.
  3. It allows all available information embedded in different data formats, including qualitative and incomplete data, to be maximally incorporated in assessment and decision making processes.
  4. It allows assessment outcomes to be represented more informatively.
  5. It reduces the uncertainty in human and AI decisions.[8]
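
The sketch below illustrates the second feature: a precise numerical score and an incomplete subjective judgement are both converted into belief structures over a common set of assessment grades. The interpolation rule used for numerical values follows the equivalence transformations described in the ER literature; the grade names, grade values and sample inputs are assumptions made purely for illustration.

```python
# Illustrative sketch: mapping heterogeneous inputs onto a common set of
# assessment grades as belief structures. Grade names, grade values and
# sample inputs are assumptions for illustration only.

GRADES = ["poor", "average", "good", "excellent"]
GRADE_VALUES = [0.0, 0.4, 0.7, 1.0]   # assumed numerical value of each grade


def from_number(x):
    """Turn a single numerical score in [0, 1] into belief degrees by
    interpolating between the two adjacent grades."""
    beliefs = [0.0] * len(GRADES)
    if x <= GRADE_VALUES[0]:
        beliefs[0] = 1.0
        return beliefs
    for n in range(len(GRADES) - 1):
        lo, hi = GRADE_VALUES[n], GRADE_VALUES[n + 1]
        if lo <= x <= hi:
            beliefs[n] = (hi - x) / (hi - lo)
            beliefs[n + 1] = 1.0 - beliefs[n]
            return beliefs
    beliefs[-1] = 1.0
    return beliefs


def from_judgement(judgement):
    """Turn a subjective judgement such as {'good': 0.6, 'excellent': 0.3}
    into belief degrees; any missing belief remains unassigned."""
    return [judgement.get(g, 0.0) for g in GRADES]


if __name__ == "__main__":
    print(from_number(0.85))                                  # precise score
    print(from_judgement({"good": 0.6, "excellent": 0.3}))    # incomplete judgement
```

A probability distribution defined directly over the grades maps onto belief degrees without transformation. Once every input is expressed as a belief structure over the same grades, the aggregation sketch shown earlier can be applied.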



Related Research Articles

Dempster–Shafer theory

The theory of belief functions, also referred to as evidence theory or Dempster–Shafer theory (DST), is a general framework for reasoning with uncertainty, with understood connections to other frameworks such as probability, possibility and imprecise probability theories. First introduced by Arthur P. Dempster in the context of statistical inference, the theory was later developed by Glenn Shafer into a general framework for modeling epistemic uncertainty—a mathematical theory of evidence. The theory allows one to combine evidence from different sources and arrive at a degree of belief that takes into account all the available evidence.
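
To illustrate how evidence from two sources can be combined, the sketch below applies Dempster's rule of combination to two mass functions over a small frame of discernment; the frame and the mass values are invented for illustration.

```python
# Minimal sketch of Dempster's rule of combination. The frame {a, b, c}
# and the two mass functions are invented for illustration only.
from itertools import product


def combine(m1, m2):
    """Combine two mass functions given as dicts mapping frozensets to masses."""
    combined = {}
    conflict = 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2          # mass falling on the empty set
    # Dempster's normalisation: rescale by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}


if __name__ == "__main__":
    A, B, C = frozenset("a"), frozenset("b"), frozenset("c")
    theta = A | B | C                    # the whole frame of discernment
    m1 = {A: 0.6, theta: 0.4}            # evidence mostly supporting a
    m2 = {A | B: 0.7, theta: 0.3}        # evidence supporting a-or-b
    for s, v in combine(m1, m2).items():
        print(set(s), round(v, 3))
```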

Multiple-criteria decision analysis

Multiple-criteria decision-making (MCDM) or multiple-criteria decision analysis (MCDA) is a sub-discipline of operations research that explicitly evaluates multiple conflicting criteria in decision making. It is also known as multiple attribute utility theory, multiple attribute value theory, multiple attribute preference theory, and multi-objective decision analysis.

Imprecise probability generalizes probability theory to allow for partial probability specifications, and is applicable when information is scarce, vague, or conflicting, in which case a unique probability distribution may be hard to identify. The theory thereby aims to represent the available knowledge more accurately. Imprecision is also useful in expert elicitation, since experts may only be able to provide bounds or intervals rather than precise probabilities.

Info-gap decision theory seeks to optimize robustness to failure under severe uncertainty, in particular applying sensitivity analysis of the stability radius type to perturbations in the value of a given estimate of the parameter of interest. It has some connections with Wald's maximin model; some authors distinguish them, others consider them instances of the same principle.


Analysis of competing hypotheses

The analysis of competing hypotheses (ACH) is a methodology for evaluating multiple competing hypotheses for observed data. It was developed by Richards (Dick) J. Heuer, Jr., a 45-year veteran of the Central Intelligence Agency, in the 1970s for use by the Agency. ACH is used by analysts in various fields who make judgments that entail a high risk of error in reasoning. ACH aims to help an analyst overcome, or at least minimize, some of the cognitive limitations that make prescient intelligence analysis so difficult to achieve.

A belief structure is a distributed assessment in which performance is described over a set of assessment grades, each associated with a degree of belief; for example, a product may be assessed as "good" with a belief degree of 0.7 and "excellent" with a belief degree of 0.2, leaving 0.1 of the belief unassigned.

A decision matrix is a list of values in rows and columns that allows an analyst to systematically identify, analyze, and rate the performance of relationships between sets of values and information. Elements of a decision matrix show decisions based on certain decision criteria. The matrix is useful for looking at large masses of decision factors and assessing each factor's relative significance by weighting them by importance.

Grey relational analysis (GRA) was developed by Deng Julong of Huazhong University of Science and Technology. It is one of the most widely used models of grey system theory. GRA uses a specific concept of information. It defines situations with no information as black, and those with perfect information as white. However, neither of these idealized situations ever occurs in real world problems. In fact, situations between these extremes, which contain partial information, are described as being grey, hazy or fuzzy. A variant of GRA model, Taguchi-based GRA model, is a popular optimization method in manufacturing engineering.

Risk

In simple terms, risk is the possibility of something bad happening. Risk involves uncertainty about the effects/implications of an activity with respect to something that humans value, often focusing on negative, undesirable consequences. Many different definitions have been proposed. One international standard definition of risk is the "effect of uncertainty on objectives".

The scenario approach or scenario optimization approach is a technique for obtaining solutions to robust optimization and chance-constrained optimization problems based on a sample of the constraints. It also relates to inductive reasoning in modeling and decision-making. The technique has existed for decades as a heuristic approach and has more recently been given a systematic theoretical foundation.

Robust decision-making (RDM) is an iterative decision analytics framework that aims to help identify potential robust strategies, characterize the vulnerabilities of such strategies, and evaluate the tradeoffs among them. RDM focuses on informing decisions under conditions of what is called "deep uncertainty", that is, conditions where the parties to a decision do not know or do not agree on the system models relating actions to consequences or the prior probability distributions for the key input parameters to those models.

The weighted product model (WPM) is a popular multi-criteria decision analysis (MCDA) / multi-criteria decision making (MCDM) method. It is similar to the weighted sum model (WSM) in that it produces a simple score, but has the very important advantage of overcoming the issue of 'adding apples and pears' i.e. adding together quantities measured in different units. While there are various ways of normalizing the data beforehand, the results of the weighted sum model differ according to which normalization is chosen. The weighted product approach does not require any normalization because it uses multiplication instead of addition to aggregate the data.
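
The comparison the weighted product model performs can be sketched as follows; the criterion weights and performance values are invented for illustration, and all criteria are assumed to be benefit criteria.

```python
# Illustrative sketch of the weighted product model (WPM): two alternatives
# are compared through a product of performance ratios raised to the
# criterion weights, so no normalisation of units is needed. All numbers
# are invented for illustration.

def wpm_ratio(a, b, weights):
    """Return the WPM preference ratio P(A/B); a value greater than 1
    means A is preferred (assuming all criteria are benefit criteria)."""
    ratio = 1.0
    for av, bv, w in zip(a, b, weights):
        ratio *= (av / bv) ** w
    return ratio


if __name__ == "__main__":
    weights = [0.2, 0.5, 0.3]
    alt_a = [25, 20, 15]     # performance of A on three criteria
    alt_b = [10, 30, 20]     # performance of B on the same criteria
    print(wpm_ratio(alt_a, alt_b, weights))   # > 1 prefer A, < 1 prefer B
```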

The decision-making paradox is a phenomenon related to decision-making and the quest for determining reliable decision-making methods. It was first described by Triantaphyllou, and has been recognized in the related literature as a fundamental paradox in multi-criteria decision analysis (MCDA), multi-criteria decision making (MCDM) and decision analysis since then.

In multiple criteria decision aiding (MCDA), multicriteria classification involves problems where a finite set of alternative actions should be assigned into a predefined set of preferentially ordered categories (classes). For example, credit analysts classify loan applications into risk categories, customers rate products and classify them into attractiveness groups, candidates for a job position are evaluated and their applications are approved or rejected, technical systems are prioritized for inspection on the basis of their failure risk, clinicians classify patients according to the extent to which they have a complex disease or not, etc.

DecideIT is a decision-making software for the Microsoft Windows operating system. It is based on multi-criteria decision making (MCDM) and the multi-attribute value theory (MAVT). It supports both value tree analysis for multi-attribute decision problems as well as decision tree analysis for evaluating decisions under risk and can combine these structures in a common model.

D-Sight is a company that specializes in decision support software and associated services in the domains of project prioritization, supplier selection and collaborative decision-making. It was founded in 2010 as a spin-off from the Université Libre de Bruxelles (ULB). Their headquarters are located in Brussels, Belgium.

Intelligent Decision System (IDS) is a software package for multiple criteria decision analysis. It can handle hybrid types of uncertainty, including probability uncertainty, missing data, subjective judgements, interval data, and any combination of those types of uncertainty. It uses belief functions for problem modelling and the evidential reasoning approach for attribute aggregation. The outcomes of the analysis include not only a ranking of alternative courses of action based on average scores, but also the aggregated performance distribution of each alternative, supporting informed and transparent decision making.

Problem structuring methods

Problem structuring methods (PSMs) are a group of techniques used to model or to map the nature or structure of a situation or state of affairs that some people want to change. PSMs are usually used by a group of people in collaboration to create a consensus about, or at least to facilitate negotiations about, what needs to change.

Valerie Belton, commonly known as Val Belton, is a retired professor of management science at the University of Strathclyde. She has worked on the design and application of multi-criteria decision making (MCDM) approaches for over 30 years and co-authored a book in the field, Multiple Criteria Decision Analysis: An Integrated Approach, published in 2002. She has sought to integrate multi-criteria decision analysis with problem structuring techniques, system dynamics, and other analytical approaches. She has written a number of scholarly articles and served as an editor of the Journal of Multi-Criteria Decision Analysis.

References

  1. Wang Y.M.; Yang J.B.; Xu D.L. (2006). "Environmental Impact Assessment Using the Evidential Reasoning Approach". European Journal of Operational Research. 174 (3): 1885–1913. doi:10.1016/j.ejor.2004.09.059.
  2. Siow C.H.R.; Yang J.B.; Dale B.G. (2001). "A new modelling framework for organisational self-assessment: development and application". Quality Management Journal. 8 (4): 34–47. doi:10.1080/10686967.2001.11918982.
  3. Keeney, R.L.; Raiffa, H. (1976). Decisions with Multiple Objectives. Cambridge University Press. ISBN 978-0-521-43883-4.
  4. Shafer, G.A. (1976). A Mathematical Theory of Evidence. Princeton University Press. ISBN 978-0-691-08175-5.
  5. Yang J.B.; Xu D.L. (2002). "On the evidential reasoning algorithm for multiple attribute decision analysis under uncertainty". IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans. 32 (3): 289–304. doi:10.1109/TSMCA.2002.802746.
  6. Xu D.L.; Yang J.B.; Wang Y.M. (2006). "The ER approach for multi-attribute decision analysis under interval uncertainties". European Journal of Operational Research. 174 (3): 1914–43. doi:10.1016/j.ejor.2005.02.064.
  7. Yang J.B.; Xu D.L. (2013). "Evidential Reasoning Rule for Evidence Combination". Artificial Intelligence. 205: 1–29. doi:10.1016/j.artint.2013.09.003.
  8. Sachan, S.; Almaghrabi, F.; Yang, J.B.; Xu, D.L. (2024). "Human-AI collaboration to mitigate decision noise in financial underwriting: A study on FinTech innovation in a lending firm". International Review of Financial Analysis. 93: 103149. doi:10.1016/j.irfa.2024.103149.