Belief merging, also called belief fusion or propositional belief merging, is a process in which an individual agent aggregates possibly conflicting pieces of information, expressed in logical formulae, into a consistent knowledge base. Applications include combining conflicting sensor information received by the same agent (see sensor fusion) and combining multiple databases to build an expert system. [1] [2] [3] [4] It also has applications in multi-agent systems.
In the combination approach, we take the union of the knowledge bases (each a finite set of logical formulas). If the union is consistent, we are done. Otherwise, we select some maximal consistent subset of it, as the sketch below illustrates. Baral, Kraus, Minker and Subrahmanian [5] [2] present algorithms for combining knowledge bases consisting of first-order theories and for resolving inconsistencies among them. Subrahmanian [3] presents a uniform theoretical framework, based on annotated logics, for combining multiple knowledge bases which may have inconsistencies, uncertainties, and nonmonotonic modes of negation.
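As a concrete illustration of the combination approach, here is a minimal sketch in Python (all helper names are invented for the example): consistency is checked by brute-force model enumeration, which is only feasible for small variable sets, and the greedy pass returns one maximal (with respect to set inclusion, not cardinality) consistent subset of the union:

```python
from itertools import product

def models(formula, variables):
    """Enumerate assignments (dicts) over `variables` satisfying `formula`,
    where `formula` is a Python callable taking an assignment dict."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            yield assignment

def consistent(formulas, variables):
    """A set of formulas is consistent iff some assignment satisfies all of them."""
    conjunction = lambda a: all(f(a) for f in formulas)
    return next(models(conjunction, variables), None) is not None

def maximal_consistent_subset(union, variables):
    """Greedily grow a consistent subset; since the chosen set only grows,
    every skipped formula stays inconsistent with it, so the result is
    maximal w.r.t. set inclusion (though generally not unique -- the
    combination approach must pick one)."""
    chosen = []
    for f in union:
        if consistent(chosen + [f], variables):
            chosen.append(f)
    return chosen

# Example: K1 = {a}, K2 = {not a, b} -- the union is inconsistent.
variables = ["a", "b"]
union = [lambda m: m["a"], lambda m: not m["a"], lambda m: m["b"]]
print(len(maximal_consistent_subset(union, variables)))  # keeps 2 of the 3 formulas
```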
In the arbitration approach, the assumption is that all sources of information (both old and new) are equally reliable, so the resulting base should contain as much as possible of both sources. [6] [7]
The merging approach was presented by Konieczny and Perez. [8] There are several differences between combination operators and merging operators. [9]
Konieczny and Perez [10] [11] [12] extended their framework to merging under a set of exogenously imposed constraints that have to be satisfied by the combined database. Their framework is now the standard framework for belief merging. [13] In their framework, a merging operator is a function f that takes as input a vector of n consistent (satisfiable) propositional formulas, P=(p1,...,pn), representing e.g. claims made by n different experts, and another formula c, representing constraints. It should satisfy the following postulates:
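The postulates are usually labelled (IC0) through (IC8) in the literature. The following reconstruction follows the standard presentation, with notation adapted to this article (Δc(P) denotes the merge of the profile P under constraint c, and ⊔ denotes profile concatenation); consult [10] for the authoritative statement:

$$
\begin{aligned}
&(\mathrm{IC0})\ \ \Delta_c(P) \models c\\
&(\mathrm{IC1})\ \ \text{if } c \text{ is consistent, then } \Delta_c(P) \text{ is consistent}\\
&(\mathrm{IC2})\ \ \text{if } \textstyle\bigwedge P \wedge c \text{ is consistent, then } \Delta_c(P) \equiv \textstyle\bigwedge P \wedge c\\
&(\mathrm{IC3})\ \ \text{if } P_1 \equiv P_2 \text{ (elementwise, up to reordering) and } c_1 \equiv c_2, \text{ then } \Delta_{c_1}(P_1) \equiv \Delta_{c_2}(P_2)\\
&(\mathrm{IC4})\ \ \text{if } p_1 \models c \text{ and } p_2 \models c, \text{ then } \Delta_c((p_1,p_2)) \wedge p_1 \text{ is consistent iff } \Delta_c((p_1,p_2)) \wedge p_2 \text{ is}\\
&(\mathrm{IC5})\ \ \Delta_c(P_1) \wedge \Delta_c(P_2) \models \Delta_c(P_1 \sqcup P_2)\\
&(\mathrm{IC6})\ \ \text{if } \Delta_c(P_1) \wedge \Delta_c(P_2) \text{ is consistent, then } \Delta_c(P_1 \sqcup P_2) \models \Delta_c(P_1) \wedge \Delta_c(P_2)\\
&(\mathrm{IC7})\ \ \Delta_{c_1}(P) \wedge c_2 \models \Delta_{c_1 \wedge c_2}(P)\\
&(\mathrm{IC8})\ \ \text{if } \Delta_{c_1}(P) \wedge c_2 \text{ is consistent, then } \Delta_{c_1 \wedge c_2}(P) \models \Delta_{c_1}(P)
\end{aligned}
$$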
They present several operators that satisfy all these postulates, for example distance-based operators: among the models of the constraint c, select those that minimize an aggregation (such as the sum) of the distances to the models of the input formulas.
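A minimal brute-force sketch of such a sum-based operator in Python (illustrative names; the distance from a model to a base is the minimum Hamming distance to any model of that base, often called the Dalal distance; runtime is exponential in the number of variables):

```python
from itertools import product

def hamming_to_base(model, base_models):
    """Distance from a candidate model to a base = minimum Hamming distance
    to any model of that base."""
    return min(sum(model[v] != m[v] for v in model) for m in base_models)

def merge_sum(bases_models, constraint_models):
    """Sum-based merging operator: return the constraint models minimizing
    the total distance to the profile of bases."""
    def score(model):
        return sum(hamming_to_base(model, b) for b in bases_models)
    best = min(map(score, constraint_models))
    return [m for m in constraint_models if score(m) == best]

# Two variables a, b; base 1 asserts a AND b, base 2 asserts NOT a;
# the constraint c is a tautology (every model is allowed).
all_models = [dict(zip("ab", v)) for v in product([False, True], repeat=2)]
b1 = [m for m in all_models if m["a"] and m["b"]]
b2 = [m for m in all_models if not m["a"]]
print(merge_sum([b1, b2], all_models))  # the models where b is true
```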
Konieczny, Lang and Marquis [14] present the DA2 framework, which generalizes the merging framework. They prove that, in this framework, query entailment from merged bases is only at the first level of the polynomial hierarchy.
Belief merging is somewhat related to social choice, in which opinions of different citizens have to be combined into a single "social" opinion. Meyer, Ghose and Chopra [15] relate belief-merging to social choice, elections and preference aggregation.
Chopra, Ghose and Meyer [16] relate belief-merging to strategyproofness. They show that Arrow's impossibility theorem and the Gibbard–Satterthwaite theorem do not hold in their belief-merging framework.
Everaere, Konieczny and Marquis [17] study belief-merging operators in settings in which the different information sources are strategic, and may try to change their stated beliefs in order to influence the outcome. They study strategyproof merging operators.
Haret and Wallner [18] show that most aggregation procedures are manipulable, and study the computational complexity of finding a manipulation.
Haret, Pfandler and Woltran [19] consider some classic social choice axioms in the context of belief merging.
Haret, Lackner, Pfandler and Wallner [20] study belief-merging operators that satisfy fairness properties, similar to justified representation. To illustrate, suppose three experts support propositions x1, x2, x3, x4 and oppose propositions y1, y2, y3, y4, whereas a fourth expert opposes propositions x1, x2, x3, x4 and supports propositions y1, y2, y3, y4. Then, under a purely utilitarian operator such as sum-based merging, the merged outcome coincides exactly with the majority view and the fourth expert is ignored entirely; fairness properties in the spirit of justified representation are meant to rule out such outcomes, as the sketch below illustrates.
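The following brute-force computation (reusing the sum-of-Hamming-distances operator sketched earlier; all names are illustrative) confirms that the unique optimal outcome is exactly the majority model:

```python
from itertools import product

VARS = ["x1", "x2", "x3", "x4", "y1", "y2", "y3", "y4"]

majority = dict(x1=True, x2=True, x3=True, x4=True,
                y1=False, y2=False, y3=False, y4=False)
minority = {v: not majority[v] for v in VARS}
experts = [majority, majority, majority, minority]  # each expert: one model

def hamming(m1, m2):
    return sum(m1[v] != m2[v] for v in VARS)

def sum_score(model):
    return sum(hamming(model, e) for e in experts)

# Flipping any variable away from the majority view costs 3 instead of 1,
# so the majority model is the unique minimizer (total cost 8).
best = min(sum_score(dict(zip(VARS, vals)))
           for vals in product([False, True], repeat=8))
winners = [dict(zip(VARS, vals)) for vals in product([False, True], repeat=8)
           if sum_score(dict(zip(VARS, vals))) == best]
print(len(winners), winners[0] == majority)  # 1 True: only the majority view survives
```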
Multiwinner voting can be seen as a special case of belief merging with constraints, where the constraints encode the size of the committee. [21, Sub.6.7]
The formal methods developed for belief merging have been applied in other areas of social epistemology.
In logic and computer science, the Boolean satisfiability problem (sometimes called propositional satisfiability problem and abbreviated SATISFIABILITY, SAT or B-SAT) is the problem of determining if there exists an interpretation that satisfies a given Boolean formula. In other words, it asks whether the variables of a given Boolean formula can be consistently replaced by the values TRUE or FALSE in such a way that the formula evaluates to TRUE. If this is the case, the formula is called satisfiable. On the other hand, if no such assignment exists, the function expressed by the formula is FALSE for all possible variable assignments and the formula is unsatisfiable. For example, the formula "a AND NOT b" is satisfiable because one can find the values a = TRUE and b = FALSE, which make (a AND NOT b) = TRUE. In contrast, "a AND NOT a" is unsatisfiable.
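A brute-force satisfiability check makes the definition concrete (illustrative Python; it enumerates all 2^n assignments, which is fine for tiny formulas even though the general problem is NP-complete):

```python
from itertools import product

def satisfiable(formula, variables):
    """Return a satisfying assignment of `formula` (a callable on
    assignment dicts), or None if the formula is unsatisfiable."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            return assignment
    return None

print(satisfiable(lambda m: m["a"] and not m["b"], ["a", "b"]))  # {'a': True, 'b': False}
print(satisfiable(lambda m: m["a"] and not m["a"], ["a"]))       # None
```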
Logic programming is a programming, database and knowledge representation paradigm based on formal logic. A logic program is a set of sentences in logical form, representing knowledge about some problem domain. Computation is performed by applying logical reasoning to that knowledge, to solve problems in the domain. Major logic programming language families include Prolog, Answer Set Programming (ASP) and Datalog. In all of these languages, rules are written in the form of clauses:

A :- B1, ..., Bn.

and are read as logical implications: A if B1 and ... and Bn. Here A is called the head of the rule and B1, ..., Bn its body.
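For propositional rules of this shape (Horn clauses), the core computational idea can be sketched by naive forward chaining in Python (a toy illustration, not how production Prolog, ASP or Datalog engines are implemented):

```python
def forward_chain(rules, facts):
    """rules: list of (head, body) pairs, body a list of atoms;
    facts: set of atoms. Fire every rule whose body is satisfied,
    repeating until a fixpoint is reached."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return derived

# c :- a, b.   d :- c.
rules = [("c", ["a", "b"]), ("d", ["c"])]
print(forward_chain(rules, {"a", "b"}))  # {'a', 'b', 'c', 'd'}
```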
In epistemology, the regress argument is the argument that any proposition requires a justification. However, any justification itself requires support. This means that any proposition whatsoever can be endlessly (infinitely) questioned, resulting in infinite regress. It is a problem in epistemology and in any general situation where a statement has to be justified.
Reification is the process by which an abstract idea about a computer program is turned into an explicit data model or other object created in a programming language. A computable/addressable object—a resource—is created in a system as a proxy for a non-computable/addressable object. By means of reification, something that was previously implicit, unexpressed, and possibly inexpressible is explicitly formulated and made available to conceptual manipulation. Informally, reification is often referred to as "making something a first-class citizen" within the scope of a particular system. Some aspect of a system can be reified at language design time, which is related to reflection in programming languages. It can be applied as a stepwise refinement at system design time. Reification is one of the most frequently used techniques of conceptual analysis and knowledge representation.
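A small Python illustration (the class and field names are invented for the example): an arithmetic operation that would normally be implicit in evaluation is reified as an explicit, first-class object that can be inspected, stored, or transformed before being executed:

```python
from dataclasses import dataclass

@dataclass
class BinaryOp:
    """Reified operation: the computation is now a first-class value."""
    op: str
    left: float
    right: float

    def evaluate(self) -> float:
        return {"add": self.left + self.right,
                "mul": self.left * self.right}[self.op]

call = BinaryOp("add", 2, 3)     # the call is data, not (yet) a computation
print(call.op, call.evaluate())  # we can inspect it *and* run it: add 5
```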
In geometry, an incidence relation is a heterogeneous relation that captures the idea being expressed when phrases such as "a point lies on a line" or "a line is contained in a plane" are used. The most basic incidence relation is that between a point, P, and a line, l, sometimes denoted P I l. If P I l the pair (P, l) is called a flag. There are many expressions used in common language to describe incidence (for example, a line passes through a point, a point lies in a plane, etc.) but the term "incidence" is preferred because it does not have the additional connotations that these other terms have, and it can be used in a symmetric manner. Statements such as "line l1 intersects line l2" are also statements about incidence relations, but in this case, it is because this is a shorthand way of saying that "there exists a point P that is incident with both line l1 and line l2". When one type of object can be thought of as a set of the other type of object (viz., a plane is a set of points) then an incidence relation may be viewed as containment.
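When one type of object is modelled as a set of the other, as in the last sentence, incidence reduces to membership and intersection to a non-empty set intersection. A toy Python sketch (point and line names invented):

```python
# Points are labels; lines are frozensets of points (incidence = membership).
P, Q, R = "P", "Q", "R"
l1 = frozenset({P, Q})
l2 = frozenset({P, R})

def incident(point, line):
    return point in line

def intersects(a, b):
    """'a intersects b' abbreviates: some point is incident with both lines."""
    return bool(a & b)

print(incident(P, l1), intersects(l1, l2))  # True True  (flag: (P, l1))
```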
Truthmaker theory is "the branch of metaphysics that explores the relationships between what is true and what exists". The basic intuition behind truthmaker theory is that truth depends on being. For example, a perceptual experience of a green tree may be said to be true because there actually is a green tree. But if there were no tree there, it would be false. So the experience by itself does not ensure its truth or falsehood, it depends on something else. Expressed more generally, truthmaker theory is the thesis that "the truth of truthbearers depends on the existence of truthmakers". A perceptual experience is the truthbearer in the example above. Various representational entities, like beliefs, thoughts or assertions can act as truthbearers. Truthmaker theorists are divided about what type of entity plays the role of truthmaker; popular candidates include states of affairs and tropes.
In mathematical logic, a superintuitionistic logic is a propositional logic extending intuitionistic logic. Classical logic is the strongest consistent superintuitionistic logic; thus, consistent superintuitionistic logics are called intermediate logics.
Belief revision is the process of changing beliefs to take into account a new piece of information. The logical formalization of belief revision is researched in philosophy, in databases, and in artificial intelligence for the design of rational agents.
The closed-world assumption (CWA), in a formal system of logic used for knowledge representation, is the presumption that a statement that is true is also known to be true. Therefore, conversely, what is not currently known to be true is false. The same name also refers to a logical formalization of this assumption by Raymond Reiter. The opposite of the closed-world assumption is the open-world assumption (OWA), stating that lack of knowledge does not imply falsity. The choice between CWA and OWA determines the actual semantics of a conceptual expression, even when the notation for the concepts is the same. A successful formalization of natural language semantics usually cannot avoid making explicit whether the underlying logic assumes CWA or OWA.
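The contrast can be made concrete with a toy query function in Python (illustrative only): under CWA an atom absent from the knowledge base is reported false, while under OWA it is merely unknown:

```python
facts = {"bird(tweety)", "cat(tom)"}

def query_cwa(atom):
    # Closed world: not known to be true => false.
    return atom in facts

def query_owa(atom):
    # Open world: not known to be true => unknown, not false.
    return True if atom in facts else "unknown"

print(query_cwa("dog(rex)"))  # False
print(query_owa("dog(rex)"))  # unknown
```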
Symbolic Link (SYLK) is a Microsoft file format typically used to exchange data between applications, specifically spreadsheets. SYLK files conventionally have a .slk suffix. Composed of only displayable ANSI characters, it can be easily created and processed by other applications, such as databases.
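To give the flavor, a minimal SYLK file describing a one-row sheet with a text cell and a numeric cell might look roughly as follows (record details reconstructed from common usage, so treat this as an approximation rather than a normative example):

```
ID;P
C;Y1;X1;K"Name"
C;Y1;X2;K42
E
```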
A "production system" is a computer program typically used to provide some form of artificial intelligence, which consists primarily of a set of rules about behavior, but it also includes the mechanism necessary to follow those rules as the system responds to states of the world. Those rules, termed productions, are a basic representation found useful in automated planning, expert systems and action selection.
In computer science and recursion theory, the McCarthy Formalism (1963) of computer scientist John McCarthy clarifies the notion of recursive functions by use of the IF-THEN-ELSE construction common to computer science, together with four of the operators of primitive recursive functions: zero, successor, equality of numbers and composition. The conditional operator replaces both primitive recursion and the mu-operator.
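The flavor can be sketched in Python: using only zero, successor, an equality test, composition and an IF-THEN-ELSE conditional with self-reference, one can define, e.g., addition on the naturals (the built-in decrement below stands in for a predecessor function that would itself be defined within the formalism):

```python
def zero(_): return 0
def succ(n): return n + 1
def eq(m, n): return m == n

def add(m, n):
    # IF m = 0 THEN n ELSE succ(add(m - 1, n)):
    # the conditional expression carries all the control flow.
    return n if eq(m, zero(m)) else succ(add(m - 1, n))

print(add(3, 4))  # 7
```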
Imperative logic is the field of logic concerned with imperatives. In contrast to declaratives, it is not clear whether imperatives denote propositions or more generally what role truth and falsity play in their semantics. Thus, there is almost no consensus on any aspect of imperative logic.
In artificial intelligence, action description language (ADL) is a language for specifying actions in automated planning and scheduling systems, in particular for robots. It is considered an advancement of STRIPS. Edwin Pednault proposed this language in 1987. It is an example of an action language.
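To give the flavor in a simplified, STRIPS-like rendering (Python; ADL proper additionally supports features such as negative preconditions and conditional effects, which this sketch omits):

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A planning action: preconditions plus add/delete effects on a state,
    where a state is a frozenset of ground atoms."""
    name: str
    preconditions: frozenset
    add_effects: frozenset
    del_effects: frozenset

    def applicable(self, state: frozenset) -> bool:
        return self.preconditions <= state

    def apply(self, state: frozenset) -> frozenset:
        return (state - self.del_effects) | self.add_effects

move = Action("move(a,b)",
              preconditions=frozenset({"at(a)", "path(a,b)"}),
              add_effects=frozenset({"at(b)"}),
              del_effects=frozenset({"at(a)"}))
s = frozenset({"at(a)", "path(a,b)"})
print(move.apply(s) if move.applicable(s) else "not applicable")
```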
The PROPT MATLAB Optimal Control Software is a platform for solving applied optimal control and parameter estimation problems.
In probability theory and statistics, there are several relationships among probability distributions. These relations can be categorized into several groups.
Concepts are an extension to the templates feature provided by the C++ programming language. Concepts are named Boolean predicates on template parameters, evaluated at compile time. A concept may be associated with a template, in which case it serves as a constraint: it limits the set of arguments that are accepted as template parameters.
LOOP is a simple register language that precisely captures the primitive recursive functions. The language is derived from the counter-machine model. Like the counter machines, the LOOP language comprises a set of one or more unbounded registers, each of which can hold a single non-negative integer. A few arithmetic instructions operate on the registers. The only control flow instruction is 'LOOP x DO ... END'. It causes the instructions within its scope to be repeated x times.
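A tiny interpreter sketch in Python (the program encoding is invented): note that the loop bound is read once on entry, so the body cannot change the number of iterations; this restriction is what confines LOOP to the primitive recursive functions:

```python
def run(program, regs):
    """program: list of instructions over a register dict `regs`.
    ('inc', x) | ('zero', x) | ('copy', x, y)   # y := x
    ('loop', x, body)                           # repeat body regs[x] times"""
    for instr in program:
        if instr[0] == "inc":
            regs[instr[1]] += 1
        elif instr[0] == "zero":
            regs[instr[1]] = 0
        elif instr[0] == "copy":
            regs[instr[2]] = regs[instr[1]]
        elif instr[0] == "loop":
            for _ in range(regs[instr[1]]):  # bound fixed on entry
                run(instr[2], regs)
    return regs

# Addition r0 := r1 + r2, as:  r0 := r1; LOOP r2 DO r0 := r0 + 1 END
add = [("copy", 1, 0), ("loop", 2, [("inc", 0)])]
print(run(add, {0: 0, 1: 3, 2: 4})[0])  # 7
```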
Probabilistic Soft Logic (PSL) is a statistical relational learning (SRL) framework for modeling probabilistic and relational domains. It is applicable to a variety of machine learning problems, such as collective classification, entity resolution, link prediction, and ontology alignment. PSL combines two tools: first-order logic, with its ability to succinctly represent complex phenomena, and probabilistic graphical models, which capture the uncertainty and incompleteness inherent in real-world knowledge. More specifically, PSL uses "soft" logic as its logical component and Markov random fields as its statistical model. PSL provides sophisticated inference techniques for finding the most likely answer (i.e. the maximum a posteriori (MAP) state). The "softening" of the logical formulas makes inference a polynomial time operation rather than an NP-hard operation.
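The "soft" logical component can be illustrated directly: PSL interprets formulas over truth values in [0, 1] using the Łukasiewicz relaxation, and each ground rule's distance to satisfaction becomes a hinge-loss potential in the Markov random field. A self-contained numeric sketch (predicate names invented):

```python
def l_and(a, b):  # Lukasiewicz conjunction
    return max(0.0, a + b - 1.0)

def l_or(a, b):   # Lukasiewicz disjunction
    return min(1.0, a + b)

def l_not(a):
    return 1.0 - a

def distance_to_satisfaction(body, head):
    """For a rule body -> head: 0 when head >= body, a linear hinge otherwise."""
    return max(0.0, body - head)

# Rule: Friends(a,b) AND Votes(a,p) -> Votes(b,p), with soft truth values.
friends, votes_a, votes_b = 0.9, 0.8, 0.4
body = l_and(friends, votes_a)                  # 0.7
print(distance_to_satisfaction(body, votes_b))  # 0.3 -> contributes a hinge loss
```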
Dynamic epistemic logic (DEL) is a logical framework dealing with knowledge and information change. Typically, DEL focuses on situations involving multiple agents and studies how their knowledge changes when events occur. These events can change factual properties of the actual world: for example, a red card is painted blue. They can also bring about changes of knowledge without changing factual properties of the world: for example, a card is revealed publicly to be red. Originally, DEL focused on epistemic events. We only present in this entry some of the basic ideas of the original DEL framework; more details about DEL in general can be found in the references.
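One basic DEL operation, the public announcement, can be made concrete: announcing a true formula deletes the worlds where it fails and restricts every agent's indistinguishability relation accordingly (toy Kripke-model code in Python, with invented world and agent names):

```python
# A Kripke model: worlds carry facts; each agent has an indistinguishability relation.
worlds = {"w1": {"red": True}, "w2": {"red": False}}
relation = {"alice": {("w1", "w1"), ("w1", "w2"), ("w2", "w1"), ("w2", "w2")}}

def announce(worlds, relation, formula):
    """Public announcement: keep only the worlds satisfying `formula`,
    and restrict each agent's relation to the surviving worlds."""
    kept = {w: f for w, f in worlds.items() if formula(f)}
    new_rel = {a: {(u, v) for (u, v) in r if u in kept and v in kept}
               for a, r in relation.items()}
    return kept, new_rel

# "The card is revealed publicly to be red": alice's uncertainty disappears.
worlds2, rel2 = announce(worlds, relation, lambda f: f["red"])
print(list(worlds2), rel2["alice"])  # ['w1'] {('w1', 'w1')}
```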