Inquisitive semantics

Inquisitive semantics is a framework in logic and natural language semantics. In inquisitive semantics, the semantic content of a sentence captures both the information that the sentence conveys and the issue that it raises. The framework provides a foundation for the linguistic analysis of statements and questions.[1][2] It was originally developed by Ivano Ciardelli, Jeroen Groenendijk, Salvador Mascarenhas, and Floris Roelofsen.[3][4][5][6][7]

Basic notions

The essential notion in inquisitive semantics is that of an inquisitive proposition. An information state (or classical proposition) is a set of possible worlds, and an inquisitive proposition is a nonempty, downward-closed set of information states.

Inquisitive propositions encode informational content via the region of logical space that their information states cover. For instance, the inquisitive proposition {{w}, ∅} encodes the information that w is the actual world. The inquisitive proposition {{w}, {v}, ∅} encodes that the actual world is either w or v.

An inquisitive proposition encodes inquisitive content via its maximal elements, known as alternatives. For instance, the inquisitive proposition P = {{w}, {v}, ∅} has two alternatives, namely {w} and {v}. Thus, it raises the issue of whether the actual world is w or v while conveying the information that it must be one or the other. The inquisitive proposition Q = {{w, v}, {w}, {v}, ∅} encodes the same information but does not raise an issue, since it contains only one alternative, namely {w, v}.

The informational content of an inquisitive proposition can be isolated by pooling its constituent information states, as shown below.

  info(P) = ⋃P
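
These notions lend themselves to a small computational illustration. The Python sketch below is informal: the helper names downward_close, info, and alternatives are chosen here for readability and are not the literature's notation. It represents information states as frozensets of worlds and inquisitive propositions as downward-closed sets of such states, and reproduces the propositions P and Q discussed above.

    from itertools import combinations

    def downward_close(states):
        """Return the smallest downward-closed set containing the given information states."""
        closed = set()
        for s in states:
            s = frozenset(s)
            closed |= {frozenset(c) for r in range(len(s) + 1)
                       for c in combinations(s, r)}
        return closed

    def info(prop):
        """Informational content: the union of all information states in the proposition."""
        return frozenset().union(*prop)

    def alternatives(prop):
        """Maximal information states (alternatives) of the proposition."""
        return {s for s in prop if not any(s < t for t in prop)}

    # The propositions discussed above: P raises an issue, Q does not.
    P = downward_close([{"w"}, {"v"}])   # {{w}, {v}, {}}
    Q = downward_close([{"w", "v"}])     # {{w, v}, {w}, {v}, {}}

    print(sorted(map(sorted, alternatives(P))))  # [['v'], ['w']]  -- two alternatives
    print(sorted(map(sorted, alternatives(Q))))  # [['v', 'w']]    -- a single alternative
    print(info(P) == info(Q))                    # True: same informational content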

Inquisitive propositions can be used to provide a semantics for the connectives of propositional logic since they form a Heyting algebra when ordered by the subset relation. For instance, every proposition P has a pseudocomplement P*, which amounts to {s ⊆ W | s ∩ t = ∅ for all t ∈ P}, and any two propositions P and Q have a relative pseudocomplement P ⇒ Q, which amounts to {s ⊆ W | for every t ⊆ s, if t ∈ P then t ∈ Q}. Similarly, any two propositions have a meet and a join, which amount to P ∩ Q and P ∪ Q respectively. Thus inquisitive propositions can be assigned to formulas of a propositional language as shown below.

Given a model M = ⟨W, V⟩, where W is a set of possible worlds and V is a valuation function:

  [[p]] = {s ⊆ W | V(w, p) = 1 for all w ∈ s}
  [[¬φ]] = [[φ]]*
  [[φ ∧ ψ]] = [[φ]] ∩ [[ψ]]
  [[φ ∨ ψ]] = [[φ]] ∪ [[ψ]]
  [[φ → ψ]] = [[φ]] ⇒ [[ψ]]
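
As an informal illustration of these clauses, the sketch below implements each connective as an operation on downward-closed sets of states: negation as the pseudocomplement, conjunction as the meet, disjunction as the join, and implication as the relative pseudocomplement. The toy model and the helper names (atom, neg, conj, disj, impl) are assumptions made here for the example, not part of the framework.

    from itertools import combinations

    def powerset(s):
        """All subsets of s, as frozensets."""
        s = list(s)
        return {frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)}

    def atom(p, W, V):
        """[[p]]: every state whose worlds all make the letter p true under V."""
        return {s for s in powerset(W) if all(V(w, p) for w in s)}

    def neg(P, W):
        """[[not phi]] = [[phi]]*: states disjoint from every state in P."""
        return {s for s in powerset(W) if all(not (s & t) for t in P)}

    def conj(P, Q):
        """[[phi and psi]]: the meet, i.e. intersection."""
        return P & Q

    def disj(P, Q):
        """[[phi or psi]]: the join, i.e. union."""
        return P | Q

    def impl(P, Q, W):
        """[[phi -> psi]]: the relative pseudocomplement of P relative to Q."""
        return {s for s in powerset(W)
                if all(t in Q for t in powerset(s) if t in P)}

    # Tiny model: two worlds, with the letter "p" true only in w1.
    W = {"w1", "w2"}
    V = lambda w, p: w == "w1"

    P_p = atom("p", W, V)
    print(sorted(map(sorted, P_p)))                      # [[], ['w1']]
    print(sorted(map(sorted, neg(P_p, W))))              # [[], ['w2']]
    print(sorted(map(sorted, disj(P_p, neg(P_p, W)))))   # [[], ['w1'], ['w2']] -- the polar question ?p
    print(conj(P_p, neg(P_p, W)) == {frozenset()})       # True: p and not-p is the absurd proposition
    print(impl(P_p, neg(P_p, W), W) == neg(P_p, W))      # True: p -> not-p is equivalent to not-p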

The operators ! and ? are used as abbreviations in the manner shown below.

  !φ := ¬¬φ
  ?φ := φ ∨ ¬φ

Conceptually, the !-operator can be thought of as cancelling the issues raised by whatever it applies to while leaving its informational content untouched. For any formula φ, the inquisitive proposition [[!φ]] expresses the same information as [[φ]], but it may differ in that it raises no nontrivial issues. For example, if [[φ]] is the inquisitive proposition P from a few paragraphs ago, then [[!φ]] is the inquisitive proposition Q.

The ?-operator trivializes the information expressed by whatever it applies to, while converting information states that would establish that its issue is unresolvable into states that resolve it. This is rather abstract, so consider an example. Imagine that logical space consists of four possible worlds, w1, w2, w3, and w4, and consider a formula φ such that [[φ]] contains {w1}, {w2}, and of course ∅. This proposition conveys that the actual world is either w1 or w2 and raises the issue of which of those worlds it actually is. Therefore, the issue it raises would not be resolved if we learned that the actual world is in the information state {w3, w4}. Rather, learning this would show that the issue raised by our toy proposition is unresolvable. As a result, the proposition [[?φ]] contains all the states of [[φ]], along with {w3, w4} and all of its subsets.
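
Both operators can be checked on the examples above with a short, self-contained sketch; the helper names bang and query are again illustrative choices, not standard notation. Applying ! to the proposition P from earlier yields Q, and applying ? to the four-world toy proposition adds {w3, w4} and its subsets to the states that resolve the issue.

    from itertools import combinations

    def powerset(s):
        """All subsets of s, as frozensets."""
        s = list(s)
        return {frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)}

    def pseudocomplement(P, W):
        """P*: all states disjoint from every state in P (interprets negation)."""
        return {s for s in powerset(W) if all(not (s & t) for t in P)}

    def bang(P, W):
        """[[!phi]] = [[not not phi]]: cancels issues, keeps informational content."""
        return pseudocomplement(pseudocomplement(P, W), W)

    def query(P, W):
        """[[?phi]] = [[phi or not phi]]: trivializes information; states outside info(P) now resolve the issue."""
        return P | pseudocomplement(P, W)

    # The !-example: P (two alternatives) is mapped onto Q (one alternative).
    W2 = {"w", "v"}
    P = {frozenset(), frozenset({"w"}), frozenset({"v"})}
    Q = powerset(W2)
    print(bang(P, W2) == Q)                      # True

    # The ?-example: four worlds, [[phi]] has the alternatives {w1} and {w2}.
    W4 = {"w1", "w2", "w3", "w4"}
    phi = {frozenset(), frozenset({"w1"}), frozenset({"w2"})}
    q_phi = query(phi, W4)
    print(phi <= q_phi)                          # True: every state of [[phi]] is kept
    print(frozenset({"w3", "w4"}) in q_phi)      # True: the "unresolvable" state now resolves ?phi
    print(frozenset({"w3"}) in q_phi)            # True: as do its subsets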

Related Research Articles

First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.

Propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes zeroth-order logic. It deals with propositions and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives. Propositions that contain no logical connectives are called atomic propositions.

In traditional logic, a contradiction occurs when a proposition conflicts either with itself or established fact. It is often used as a tool to detect disingenuous beliefs and bias. Illustrating a general tendency in applied logic, Aristotle's law of noncontradiction states that "It is impossible that the same thing can at the same time both belong and not belong to the same object and in the same respect."

Intuitionistic logic, sometimes more generally called constructive logic, refers to systems of symbolic logic that differ from the systems used for classical logic by more closely mirroring the notion of constructive proof. In particular, systems of intuitionistic logic do not include the law of the excluded middle and double negation elimination, which are fundamental inference rules in classical logic.

In propositional logic, double negation is the theorem that states that "If a statement is true, then it is not the case that the statement is not true." This is expressed by saying that a proposition A is logically equivalent to not (not-A), or by the formula A ≡ ~(~A) where the sign ≡ expresses logical equivalence and the sign ~ expresses negation.

Computation tree logic (CTL) is a branching-time logic, meaning that its model of time is a tree-like structure in which the future is not determined; there are different paths in the future, any one of which might be an actual path that is realized. It is used in formal verification of software or hardware artifacts, typically by software applications known as model checkers, which determine if a given artifact possesses safety or liveness properties. For example, CTL can specify that when some initial condition is satisfied, then all possible executions of a program avoid some undesirable condition. In this example, the safety property could be verified by a model checker that explores all possible transitions out of program states satisfying the initial condition and ensures that all such executions satisfy the property. Computation tree logic belongs to a class of temporal logics that includes linear temporal logic (LTL). Although there are properties expressible only in CTL and properties expressible only in LTL, all properties expressible in either logic can also be expressed in CTL*.

Independence-friendly logic (IF logic) is an extension of classical first-order logic (FOL) by means of slashed quantifiers of the form (∃v/V) and (∀v/V), where V is a finite set of variables. The intended reading of (∃v/V) is "there is a v which is functionally independent from the variables in V". IF logic allows one to express more general patterns of dependence between variables than those which are implicit in first-order logic. This greater level of generality leads to an actual increase in expressive power; the set of IF sentences can characterize the same classes of structures as existential second-order logic. For example, IF logic can express branching quantifier sentences, such as a sentence expressing infinity in the empty signature; this cannot be done in FOL, which cannot, in general, express patterns of dependency in which each existentially quantified variable depends on only some of the universally quantified variables that precede it. IF logic is more general than branching quantifiers, for example in that it can express dependencies that are not transitive, such as the quantifier prefix ∀x∃y(∃z/{x}), in which y may depend on x and z may depend on y, while z does not depend on x.

In mathematical logic, Craig's interpolation theorem is a result about the relationship between different logical theories. Roughly stated, the theorem says that if a formula φ implies a formula ψ, and the two have at least one atomic variable symbol in common, then there is a formula ρ, called an interpolant, such that every non-logical symbol in ρ occurs both in φ and ψ, φ implies ρ, and ρ implies ψ. The theorem was first proved for first-order logic by William Craig in 1957. Variants of the theorem hold for other logics, such as propositional logic. A stronger form of Craig's interpolation theorem for first-order logic was proved by Roger Lyndon in 1959; the overall result is sometimes called the Craig–Lyndon theorem.

In propositional logic, transposition is a valid rule of replacement that permits one to switch the antecedent with the consequent of a conditional statement in a logical proof if they are also both negated. It is the inference from the truth of "A implies B" to the truth of "Not-B implies not-A", and conversely. It is very closely related to the rule of inference modus tollens. It is the rule that (A → B) entails (¬B → ¬A), and vice versa.

Epistemic modal logic is a subfield of modal logic that is concerned with reasoning about knowledge. While epistemology has a long philosophical tradition dating back to Ancient Greece, epistemic logic is a much more recent development with applications in many fields, including philosophy, theoretical computer science, artificial intelligence, economics and linguistics. While philosophers since Aristotle have discussed modal logic, and Medieval philosophers such as Avicenna, Ockham, and Duns Scotus developed many of their observations, it was C. I. Lewis who created the first symbolic and systematic approach to the topic, in 1912. It continued to mature as a field, reaching its modern form in 1963 with the work of Kripke.

The term "information algebra" refers to mathematical techniques of information processing. Classical information theory goes back to Claude Shannon. It is a theory of information transmission, looking at communication and storage. However, classical information theory does not take into account that information typically comes from different sources and therefore has to be combined, nor that one often wants to extract from a piece of information only those parts that are relevant to specific questions.

In theoretical computer science, the modal μ-calculus is an extension of propositional modal logic by adding the least fixed point operator μ and the greatest fixed point operator ν, thus a fixed-point logic.

In logic, especially mathematical logic, a Hilbert system, sometimes called Hilbert calculus, Hilbert-style deductive system or Hilbert–Ackermann system, is a type of system of formal deduction attributed to Gottlob Frege and David Hilbert. These deductive systems are most often studied for first-order logic, but are of interest for other logics as well.

Dynamic semantics is a framework in logic and natural language semantics that treats the meaning of a sentence as its potential to update a context. In static semantics, knowing the meaning of a sentence amounts to knowing when it is true; in dynamic semantics, knowing the meaning of a sentence means knowing "the change it brings about in the information state of anyone who accepts the news conveyed by it." In dynamic semantics, sentences are mapped to functions called context change potentials, which take an input context and return an output context. Dynamic semantics was originally developed by Irene Heim and Hans Kamp in 1981 to model anaphora, but has since been applied widely to phenomena including presupposition, plurals, questions, discourse relations, and modality.

In modal logic, standard translation is a way of transforming formulas of modal logic into formulas of first-order logic which capture the meaning of the modal formulas. Standard translation is defined inductively on the structure of the formula. In short, atomic formulas are mapped onto unary predicates and the objects in the first-order language are the accessible worlds. The logical connectives from propositional logic remain untouched and the modal operators are transformed into first-order formulas according to their semantics.

In mathematical logic, a redundant proof is a proof that has a subset that is a shorter proof of the same result. That is, a proof ψ of a formula is considered redundant if there exists another proof ψ′ of the same formula such that ψ′ ⊆ ψ and |ψ′| < |ψ|, where |ψ| is the number of nodes in ψ.

Dynamic epistemic logic (DEL) is a logical framework dealing with knowledge and information change. Typically, DEL focuses on situations involving multiple agents and studies how their knowledge changes when events occur. These events can change factual properties of the actual world: for example, a red card is painted blue. They can also bring about changes of knowledge without changing factual properties of the world: for example, a card is publicly revealed to be red. Originally, DEL focused on epistemic events.

Metric temporal logic (MTL) is a special case of temporal logic. It is an extension of temporal logic in which temporal operators are replaced by time-constrained versions like until, next, since and previous operators. It is a linear-time logic that assumes both the interleaving and fictitious-clock abstractions. It is defined over a point-based weakly-monotonic integer-time semantics. For MTL, the exact complexity of the satisfiability problems is known and independent of interval-based or point-based, synchronous or asynchronous interpretation: EXPSPACE-complete.

In formal semantics, a Hurford disjunction is a disjunction in which one of the disjuncts entails the other. The concept was first identified by British linguist James Hurford. The sentence "Mary is in the Netherlands or she is in Amsterdam" is an example of a Hurford disjunction since one cannot be in Amsterdam without being in the Netherlands. Other examples are shown below:

  1. #Tamina saw a Beatle or Paul McCartney.
  2. #The number I'm thinking of is divisible by 4 or it's even.
  3. #Is Wilbur a pig or an animal?

References

  1. "What is inquisitive semantics?". Institute for Logic, Language and Computation, University of Amsterdam.
  2. Ciardelli, Ivano; Groenendijk, Jeroen; Roelofsen, Floris (2019). Inquisitive Semantics (PDF). Oxford University Press.
  3. Ciardelli, I. (2009). "Inquisitive semantics and intermediate logics" (PDF). Master's thesis, ILLC, University of Amsterdam.
  4. Ciardelli, Ivano; Roelofsen, Floris (2009). "Generalized inquisitive logic: completeness via intuitionistic Kripke models" (PDF). Proceedings of the 12th Conference on Theoretical Aspects of Rationality and Knowledge. ACM: 71–80.
  5. Groenendijk, Jeroen (2009). "Inquisitive semantics: Two possibilities for disjunction" (PDF). Proceedings of the 7th International Tbilisi Symposium on Language, Logic, and Computation. Springer: 80–94.
  6. Groenendijk, Jeroen; Roelofsen, Floris (2009). "Inquisitive semantics and pragmatics" (PDF). Proceedings of the ILCLI International Workshop on Semantics, Pragmatics and Rhetoric: 41–72.
  7. Mascarenhas, Salvador (2009). "Inquisitive semantics and logic" (PDF). Master's thesis, ILLC, University of Amsterdam.
