Dynamic semantics

Dynamic semantics is a framework in logic and natural language semantics that treats the meaning of a sentence as its potential to update a context. In static semantics, knowing the meaning of a sentence amounts to knowing when it is true; in dynamic semantics, knowing the meaning of a sentence means knowing "the change it brings about in the information state of anyone who accepts the news conveyed by it." [1] In dynamic semantics, sentences are mapped to functions called context change potentials, which take an input context and return an output context. Dynamic semantics was originally developed by Irene Heim and Hans Kamp in 1981 to model anaphora, but has since been applied widely to phenomena including presupposition, plurals, questions, discourse relations, and modality. [2]

Dynamics of anaphora

The first systems of dynamic semantics were the closely related File Change Semantics and discourse representation theory, developed simultaneously and independently by Irene Heim and Hans Kamp. These systems were intended to capture donkey anaphora, which resists an elegant compositional treatment in classic approaches to semantics such as Montague grammar. [2] [3] Donkey anaphora is exemplified by the infamous donkey sentences, first noticed by the medieval logician Walter Burley and brought to modern attention by Peter Geach. [4] [5]

Donkey sentence (relative clause): Every farmer who owns a donkey beats it.
Donkey sentence (conditional): If a farmer owns a donkey, he beats it.

To capture the empirically observed truth conditions of such sentences in first-order logic, one would need to translate the indefinite noun phrase "a donkey" as a universal quantifier scoping over the variable corresponding to the pronoun "it".

FOL translation of donkey sentence: $\forall x\, \forall y\, ((\text{farmer}(x) \wedge \text{donkey}(y) \wedge \text{own}(x,y)) \rightarrow \text{beat}(x,y))$

While this translation captures (or approximates) the truth conditions of the natural language sentences, its relationship to the syntactic form of the sentence is puzzling in two ways. First, indefinites in non-donkey contexts normally express existential rather than universal quantification. Second, the syntactic position of the donkey pronoun would not normally allow it to be bound by the indefinite.

To explain these peculiarities, Heim and Kamp proposed that natural language indefinites are special in that they introduce a new discourse referent that remains available outside the syntactic scope of the operator that introduced it. To cash this idea out, they proposed their respective formal systems that capture donkey anaphora because they validate Egli's theorem and its corollary. [6]

Egli's theorem: $(\exists x\, \varphi) \wedge \psi \;\equiv\; \exists x\, (\varphi \wedge \psi)$
Egli's corollary: $(\exists x\, \varphi) \rightarrow \psi \;\equiv\; \forall x\, (\varphi \rightarrow \psi)$
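
For illustration (a standard application of the corollary, not a quotation from the sources above), it is Egli's corollary, applied once per indefinite, that turns the existentials contributed by "a farmer" and "a donkey" in the conditional donkey sentence into the universal quantification of the first-order translation:

$(\exists x\, \exists y\, (\text{farmer}(x) \wedge \text{donkey}(y) \wedge \text{own}(x,y))) \rightarrow \text{beat}(x,y) \;\equiv\; \forall x\, \forall y\, ((\text{farmer}(x) \wedge \text{donkey}(y) \wedge \text{own}(x,y)) \rightarrow \text{beat}(x,y))$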

Update semantics

Update semantics is a framework within dynamic semantics that was developed by Frank Veltman. [1] [7] In update semantics, each formula $\varphi$ is mapped to a function that takes and returns a discourse context. Thus, if $C$ is a context, then $C[\varphi]$ is the context one gets by updating $C$ with $\varphi$. Systems of update semantics vary both in how they define a context and in the semantic entries they assign to formulas. The simplest update systems are intersective ones, which simply lift static systems into the dynamic framework. However, update semantics includes systems more expressive than what can be defined in the static framework. In particular, it allows information-sensitive semantic entries, in which the information contributed by updating with some formula can depend on the information already present in the context. [8] This property of update semantics has led to its widespread application to presuppositions, modals, and conditionals.
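
The basic picture can be sketched in a few lines of Python (an illustration only; the representation of worlds and the helper names below are assumptions, not drawn from the cited literature): contexts are sets of possible worlds, each formula denotes a context change potential, and a discourse updates the context by applying these functions in sequence.

```python
from typing import Callable, FrozenSet, Set

World = FrozenSet[str]            # a world = the set of atomic sentences true at it
Context = Set[World]              # a context = the set of worlds still "live"
Update = Callable[[Context], Context]

def atom(p: str) -> Update:
    """Context change potential of an atomic sentence: keep the worlds where p holds."""
    return lambda c: {w for w in c if p in w}

def discourse(*updates: Update) -> Update:
    """A discourse updates the context with each sentence in turn."""
    def run(c: Context) -> Context:
        for u in updates:
            c = u(c)
        return c
    return run

w1, w2, w3 = frozenset({"rain"}), frozenset({"rain", "wind"}), frozenset()
context: Context = {w1, w2, w3}
print(discourse(atom("rain"))(context))   # the rainless world w3 is ruled out
```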

Intersective update

An update with $\varphi$ is called intersective if it amounts to taking the intersection of the input context with the proposition denoted by $\varphi$, i.e. if $C[\varphi] = C \cap \llbracket\varphi\rrbracket$. Crucially, this definition assumes that there is a single fixed proposition $\llbracket\varphi\rrbracket$ that $\varphi$ always denotes, regardless of the context. [8]

Intersective update was proposed by Robert Stalnaker in 1978 as a way of formalizing the speech act of assertion. [9] [8] In Stalnaker's original system, a context (or context set) is defined as a set of possible worlds representing the information in the common ground of a conversation. For instance, if $C = \{w_1, w_2, w_3\}$, this represents a scenario where the information agreed upon by all participants in the conversation indicates that the actual world must be either $w_1$, $w_2$, or $w_3$. If $\llbracket\varphi\rrbracket = \{w_1, w_2\}$, then updating $C$ with $\varphi$ would return a new context $C[\varphi] = \{w_1, w_2\}$. Thus, an assertion of $\varphi$ would be understood as an attempt to rule out the possibility that the actual world is $w_3$.

From a formal perspective, intersective update can be taken as a recipe for lifting one's preferred static semantics to dynamic semantics. For instance, if we take classical propositional semantics as our starting point, this recipe delivers the following intersective update semantics, where $\llbracket p \rrbracket$ is the set of worlds at which the atomic sentence $p$ is classically true. [8]

$C[p] = C \cap \llbracket p \rrbracket$ for atomic $p$
$C[\neg\varphi] = C \setminus C[\varphi]$
$C[\varphi \wedge \psi] = C[\varphi] \cap C[\psi]$
$C[\varphi \vee \psi] = C[\varphi] \cup C[\psi]$
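
To make the recipe concrete, here is a compact Python sketch (illustrative only; the formula encoding and function names are assumptions rather than part of any cited system): the static, classical denotation of a formula is computed first, and the dynamic update is then defined as intersection with that denotation.

```python
ALL = [frozenset(s) for s in ({"rain", "wind"}, {"rain"}, {"wind"}, set())]

def denotes(formula):
    """Classical (static) denotation: the set of worlds where the formula is true."""
    op = formula[0]
    if op == "atom":
        return {w for w in ALL if formula[1] in w}
    if op == "not":
        return set(ALL) - denotes(formula[1])
    if op == "and":
        return denotes(formula[1]) & denotes(formula[2])
    if op == "or":
        return denotes(formula[1]) | denotes(formula[2])
    raise ValueError(op)

def update(context, formula):
    """Intersective lift: C[phi] is C intersected with the proposition phi denotes."""
    return context & denotes(formula)

c = set(ALL)
print(update(c, ("and", ("atom", "rain"), ("not", ("atom", "wind")))))
# only the world where it rains and there is no wind survives
```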

The notion of intersectivity can be decomposed into the two properties known as eliminativity and distributivity. Eliminativity says that an update can only ever remove worlds from the context—it can't add them: $C[\varphi] \subseteq C$. Distributivity says that updating $C$ with $\varphi$ is equivalent to updating each singleton subset of $C$ with $\varphi$ and then pooling the results: $C[\varphi] = \bigcup_{w \in C} \{w\}[\varphi]$. [8]

Intersectivity amounts to the conjunction of these two properties, as proven by Johan van Benthem. [8] [10]
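
The following toy check (illustrative Python over a three-world model; the helper names are assumptions) makes the characterization concrete: a genuinely intersective update passes all three tests, while a "test"-style update is eliminative but not distributive, and accordingly not intersective.

```python
from itertools import combinations

worlds = {"w1", "w2", "w3"}

def contexts():
    """Every subset of the toy set of worlds."""
    return [set(c) for r in range(len(worlds) + 1) for c in combinations(worlds, r)]

def eliminative(update):
    return all(update(c) <= c for c in contexts())

def distributive(update):
    def pooled(c):
        out = set()
        for w in c:
            out |= update({w})
        return out
    return all(update(c) == pooled(c) for c in contexts())

def intersective(update):
    prop = update(set(worlds))                        # the candidate fixed proposition
    return all(update(c) == (c & prop) for c in contexts())

intersect_upd = lambda c: c & {"w1", "w2"}            # intersects with a fixed proposition
test_upd = lambda c: c if c & {"w1", "w2"} else set() # passes or wipes out the whole context

print(eliminative(intersect_upd), distributive(intersect_upd), intersective(intersect_upd))
print(eliminative(test_upd), distributive(test_upd), intersective(test_upd))
# True True True / True False False
```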

The test semantics for modals

The framework of update semantics is more general than static semantics because it is not limited to intersective meanings. Nonintersective meanings are theoretically useful because they contribute different information depending on what information is already present in the context. For instance, if $\varphi$ is intersective, then it will update any input context with the exact same information, namely the information encoded by the proposition $\llbracket\varphi\rrbracket$. On the other hand, if $\varphi$ is nonintersective, it could contribute one piece of information when it updates some contexts, but completely different information when it updates other contexts. [8]

Many natural language expressions have been argued to have nonintersective meanings. The nonintersectivity of epistemic modals can be seen in the infelicity of epistemic contradictions. [11] [8]

Epistemic contradiction: #It's raining and it might not be raining.

These sentences have been argued to be bona fide logical contradictions, unlike superficially similar examples such as Moore sentences, which can be given a pragmatic explanation. [12] [8]

Epistemic contradiction principle: $\varphi \wedge \Diamond\neg\varphi \models \bot$

These sentences cannot be analysed as logical contradictions within purely intersective frameworks such as the relational semantics for modal logic. The Epistemic Contradiction Principle only holds on the class of relational frames such that $wRv$ only if $w = v$, i.e. frames where each world can access at most itself. However, such frames also validate an entailment from $\Diamond\varphi$ to $\varphi$. Thus, accounting for the infelicity of epistemic contradictions within a classical semantics for modals would bring along the unwelcome prediction that "It might be raining" entails "It is raining". [12] [8] Update semantics skirts this problem by providing a nonintersective denotation for modals. When given such a denotation, the formula $\Diamond\neg\varphi$ can update input contexts differently depending on whether they already contain the information that $\varphi$ provides. The most widely adopted semantic entry for modals in update semantics is the test semantics proposed by Frank Veltman. [1]

On this semantics, $C[\Diamond\varphi]$ returns $C$ itself if $C[\varphi] \neq \emptyset$ and returns $\emptyset$ otherwise: $\Diamond\varphi$ tests whether the input context could be updated with $\varphi$ without getting trivialized, i.e. without returning the empty set. If the input context passes the test, it remains unchanged. If it fails the test, the update trivializes the context by returning the empty set. This semantics can handle epistemic contradictions because no matter the input context, updating with $\varphi$ will always output a context that fails the test imposed by $\Diamond\neg\varphi$. [8] [13]
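
The test behaviour, and the way it accounts for epistemic contradictions, can be sketched in Python (an illustration under the same assumptions as the earlier sketches: worlds are sets of atomic sentences, contexts are sets of worlds, and the helper names are not Veltman's):

```python
from typing import Callable, FrozenSet, Set

World = FrozenSet[str]
Context = Set[World]
Update = Callable[[Context], Context]

def atom(p: str) -> Update:
    return lambda c: {w for w in c if p in w}

def neg(u: Update) -> Update:
    """Negation: keep exactly the worlds the update would eliminate."""
    return lambda c: c - u(c)

def might(u: Update) -> Update:
    """Test semantics: return c unchanged if c can be updated with u non-trivially, else the empty set."""
    return lambda c: c if u(c) else set()

def conj(u: Update, v: Update) -> Update:
    """Conjunction as consecutive update with the conjuncts, left to right."""
    return lambda c: v(u(c))

rainy, dry = frozenset({"rain"}), frozenset()
c: Context = {rainy, dry}

# "It might not be raining, and it's raining": the test passes, then the dry world is removed.
print(conj(might(neg(atom("rain"))), atom("rain"))(c))   # {rainy}

# "It's raining and it might not be raining": after the first conjunct removes the
# rainless worlds, the test must fail, so the context is trivialized.
print(conj(atom("rain"), might(neg(atom("rain"))))(c))   # set()
```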

Notes

  1. Veltman, Frank (1996). "Defaults in Update Semantics" (PDF). Journal of Philosophical Logic. 25 (3). doi:10.1007/BF00248150. S2CID 19377671.
  2. Nouwen, Rick; Brasoveanu, Adrian; van Eijck, Jan; Visser, Albert (2016). "Dynamic Semantics". In Zalta, Edward (ed.). The Stanford Encyclopedia of Philosophy. Retrieved 2020-08-11.
  3. Geurts, Bart; Beaver, David; Maier, Emar (2020). "Discourse Representation Theory". In Zalta, Edward (ed.). The Stanford Encyclopedia of Philosophy. Retrieved 2020-08-11.
  4. Peter Geach (1962). Reference and Generality: An Examination of Some Medieval and Modern Theories.
  5. King, Jeffrey; Lewis, Karen (2018). "Anaphora". In Zalta, Edward (ed.). The Stanford Encyclopedia of Philosophy. Retrieved 2020-08-11.
  6. Dekker, Paul (2001). "On If And Only If". In Hastings, R; Jackson, B; Zvolenszky, Z (eds.). Proceedings of SALT XI. Semantics and Linguistic Theory. Vol. 11. Linguistic Society of America.
  7. Goldstein, Simon (2019). "Generalized Update Semantics" (PDF). Mind. 128 (511): 795–835. doi:10.1093/mind/fzy076.
  8. Goldstein, Simon (2017). "Introduction". Informative Dynamic Semantics (PhD). Rutgers University.
  9. Stalnaker, Robert (1978). "Assertion". In Cole, Peter (ed.). Pragmatics. Brill. pp. 315–332. doi:10.1163/9789004368873_001.
  10. van Benthem, Johan (1986). Essays in logical semantics. Dordrecht: Reidel.
  11. Yalcin, Seth (2007). "Epistemic Modals" (PDF). Mind. 116 (464): 983–1026. doi:10.1093/mind/fzm983.
  12. Yalcin, Seth (2007). "Epistemic Modals" (PDF). Mind. 116 (464): 983–1026. doi:10.1093/mind/fzm983.
  13. For a complete derivation of the Epistemic Contradiction Principle within update semantics, see for instance Goldstein (2016), p. 13. This derivation crucially depends on a particular definition of entailment, as well as an intersective semantic entry for $\varphi$ and a treatment of $\wedge$ as updating consecutively with the conjuncts in their linear order.
