Formal semantics (natural language)

Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.

Overview

Formal semantics studies the denotations of natural language expressions. High-level concerns include compositionality, reference, and the nature of meaning. Key topic areas include scope, modality, binding, tense, and aspect. Semantics is distinct from pragmatics, which encompasses those aspects of meaning that arise from interaction and communicative intent.

Formal semantics is an interdisciplinary field, often viewed as a subfield of both linguistics and philosophy, while also incorporating work from computer science, mathematical logic, and cognitive psychology. Within philosophy, formal semanticists typically adopt a Platonistic ontology and an externalist view of meaning. [1] Within linguistics, it is more common to view formal semantics as part of the study of linguistic cognition. As a result, philosophers put more of an emphasis on conceptual issues while linguists are more likely to focus on the syntax–semantics interface and crosslinguistic variation. [2] [3]

Central concepts

Truth conditions

The fundamental question of formal semantics is what speakers know when they know how to interpret expressions of a language. A common assumption is that knowing the meaning of a sentence requires knowing its truth conditions, or, in other words, knowing what the world would have to be like for the sentence to be true. For instance, to know the meaning of the English sentence "Nancy smokes", one has to know that it is true when the person Nancy performs the action of smoking. [1] [4]
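
The idea that a sentence's meaning is its truth conditions is often modeled by treating a proposition as a function from possible worlds to truth values. The following is a minimal sketch of that modeling choice, assuming a hypothetical toy model with two worlds; the worlds and the facts stipulated about Nancy are purely illustrative and not drawn from any particular published analysis.

  -- A toy model with two possible worlds (an illustrative assumption).
  data World = W1 | W2 deriving (Eq, Show)

  -- A sentence meaning (proposition) modeled as its truth conditions:
  -- a function saying, for each world, whether the sentence is true there.
  type Prop = World -> Bool

  -- Stipulated fact of the toy model: Nancy smokes in W1 but not in W2.
  nancySmokes :: Prop
  nancySmokes W1 = True
  nancySmokes W2 = False

  main :: IO ()
  main = print (map nancySmokes [W1, W2])  -- [True,False]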

However, many current approaches to formal semantics posit that there is more to meaning than truth conditions. [5] In the formal semantic framework of inquisitive semantics, knowing the meaning of a sentence also requires knowing what issues (i.e. questions) it raises. For instance, "Nancy smokes, but does she drink?" conveys the same truth-conditional information as the previous example but also raises the issue of whether Nancy drinks. [6] Other approaches generalize the concept of truth conditionality or treat it as epiphenomenal. For instance, in dynamic semantics, knowing the meaning of a sentence amounts to knowing how it updates a context. [7] Pietroski, meanwhile, treats meanings as instructions to build concepts. [8]
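
The dynamic view can be illustrated by modeling a context as the set of worlds still compatible with what has been established, and a sentence meaning as a function that updates that set. The sketch below is a simplified illustration under those assumptions; the worlds, the assert function, and the stipulated facts are hypothetical.

  -- Three possible worlds (an illustrative assumption).
  data World = W1 | W2 | W3 deriving (Eq, Show)

  type Prop    = World -> Bool
  type Context = [World]            -- worlds compatible with the common ground
  type Update  = Context -> Context -- a "context change potential"

  -- Asserting a proposition discards the worlds where it is false.
  assert :: Prop -> Update
  assert p = filter p

  nancySmokes :: Prop
  nancySmokes w = w /= W3           -- stipulation: she smokes in W1 and W2 only

  main :: IO ()
  main = print (assert nancySmokes [W1, W2, W3])  -- [W1,W2]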

Compositionality

The Principle of Compositionality is the fundamental assumption in formal semantics. This principle states that the denotation of a complex expression is determined by the denotations of its parts along with their mode of composition. For instance, the denotation of the English sentence "Nancy smokes" is determined by the denotation of "Nancy", the denotation of "smokes", and whatever semantic operations combine the meanings of subjects with the meanings of predicates. In a simplified semantic analysis, this idea would be formalized by positing that "Nancy" denotes Nancy herself, while "smokes" denotes a function which takes some individual x as an argument and returns the truth value "true" if x indeed smokes. Assuming that the words "Nancy" and "smokes" are semantically composed via function application, this analysis would predict that the sentence as a whole is true if Nancy indeed smokes. [9] [10] [11]
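
The simplified analysis just described translates almost directly into a typed functional language: the proper name denotes an individual, the intransitive verb denotes a function from individuals to truth values, and the two combine by function application. The following sketch assumes a hypothetical two-individual toy model; the names and stipulated facts are illustrative only.

  -- A toy domain of individuals (an illustrative assumption).
  data Individual = Nancy | Paulina deriving (Eq, Show)

  -- "Nancy" denotes an individual.
  nancy :: Individual
  nancy = Nancy

  -- "smokes" denotes a function from individuals to truth values.
  smokes :: Individual -> Bool
  smokes Nancy   = True     -- stipulated facts of the toy model
  smokes Paulina = False

  -- The denotation of "Nancy smokes" is computed by applying the
  -- predicate's denotation to the subject's denotation.
  nancySmokes :: Bool
  nancySmokes = smokes nancy

  main :: IO ()
  main = print nancySmokes  -- True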

Phenomena


Scope

Scope can be thought of as the semantic order of operations. For instance, in the sentence "Paulina doesn't drink beer but she does drink wine," the proposition that Paulina drinks beer occurs within the scope of negation, but the proposition that Paulina drinks wine does not. One of the major concerns of research in formal semantics is the relationship between operators' syntactic positions and their semantic scope. This relationship is not transparent, since the scope of an operator need not directly correspond to its surface position and a single surface form can be semantically ambiguous between different scope construals. Some theories of scope posit a level of syntactic structure called logical form, in which an item's syntactic position corresponds to its semantic scope. Other theories compute scope relations in the semantics itself, using formal tools such as type shifters, monads, and continuations. [12] [13] [14] [15]
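
One way to see how a single surface form can receive different scope construals is to compute both readings of a doubly quantified sentence such as "Every student read a book". The sketch below does this in a hypothetical toy model; the individuals and the read' relation are stipulated for illustration, and the two Boolean expressions differ only in which quantifier takes wider scope.

  -- A toy domain of individuals (an illustrative assumption).
  data Individual = Al | Bo | War | Peace deriving (Eq, Show)

  students, books :: [Individual]
  students = [Al, Bo]
  books    = [War, Peace]

  -- Stipulated reading facts: read' student book.
  read' :: Individual -> Individual -> Bool
  read' Al War   = True
  read' Bo Peace = True
  read' _  _     = False

  -- "Every student read a book"
  surfaceScope :: Bool  -- every > a: each student read some (possibly different) book
  surfaceScope = all (\s -> any (\b -> read' s b) books) students

  inverseScope :: Bool  -- a > every: one particular book was read by every student
  inverseScope = any (\b -> all (\s -> read' s b) students) books

  main :: IO ()
  main = print (surfaceScope, inverseScope)  -- (True,False)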

Binding

Binding is the phenomenon in which anaphoric elements such as pronouns are grammatically associated with their antecedents. For instance, in the English sentence "Mary saw herself", the anaphor "herself" is bound by its antecedent "Mary". Binding can be licensed or blocked in certain contexts or syntactic configurations, e.g. the pronoun "her" cannot be bound by "Mary" in the English sentence "Mary saw her". While all languages have binding, restrictions on it vary even among closely related languages. Binding was a major component of the government and binding theory paradigm.

Modality

Modality is the phenomenon whereby language is used to discuss potentially non-actual scenarios. For instance, while a non-modal sentence such as "Nancy smoked" makes a claim about the actual world, modalized sentences such as "Nancy might have smoked" or "If Nancy smoked, I'll be sad" make claims about alternative scenarios. The most intensely studied expressions include modal auxiliaries such as "could", "should", or "must"; modal adverbs such as "possibly" or "necessarily"; and modal adjectives such as "conceivable" and "probable". However, modal components have been identified in the meanings of a wide range of natural language expressions, including counterfactuals, propositional attitudes, evidentials, habituals, and generics. The standard treatment of linguistic modality was proposed by Angelika Kratzer in the 1970s, building on an earlier tradition of work in modal logic. [16] [17] [18]
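
In possible-worlds treatments of modality, of which the Kratzer-style analysis mentioned above is the best-known version, "might" and "must" are analyzed as existential and universal quantification over a set of accessible worlds. The following is a heavily simplified sketch of that idea (a bare modal base, with no ordering source); the worlds, the accessibility relation, and the stipulated facts are hypothetical.

  -- A toy set of possible worlds (an illustrative assumption).
  data World = W0 | W1 | W2 deriving (Eq, Show)

  type Prop = World -> Bool

  -- The modal base: which worlds count as live possibilities from w.
  accessible :: World -> [World]
  accessible _ = [W0, W1, W2]   -- stipulation: every world is accessible

  nancySmoked :: Prop
  nancySmoked w = w == W1       -- stipulated fact of the toy model

  -- "might p": p holds in at least one accessible world.
  might :: Prop -> Prop
  might p w = any p (accessible w)

  -- "must p": p holds in every accessible world.
  must :: Prop -> Prop
  must p w = all p (accessible w)

  main :: IO ()
  main = print (might nancySmoked W0, must nancySmoked W0)  -- (True,False)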

History

Formal semantics emerged as a major area of research in the early 1970s, with the pioneering work of the philosopher and logician Richard Montague. Montague proposed a formal system now known as Montague grammar which consisted of a novel syntactic formalism for English, a logical system called Intensional Logic, and a set of homomorphic translation rules linking the two. In retrospect, Montague Grammar has been compared to a Rube Goldberg machine, but it was regarded as earth-shattering when first proposed, and many of its fundamental insights survive in the various semantic models which have superseded it. [19] [20] [21]

Barbara Partee is one of the founders of the field and one of its major contributors.

Montague Grammar was a major advance because it showed that natural languages could be treated as interpreted formal languages. Before Montague, many linguists had doubted that this was possible, and logicians of that era tended to view logic as a replacement for natural language rather than a tool for analyzing it. [21] Montague's work was published during the Linguistics Wars, and many linguists were initially puzzled by it. While linguists wanted a restrictive theory that could only model phenomena that occur in human languages, Montague sought a flexible framework that characterized the concept of meaning at its most general. At one conference, Montague told Barbara Partee that she was "the only linguist who it is not the case that I can't talk to". [21]

Formal semantics grew into a major subfield of linguistics in the late 1970s and early 1980s, due to the seminal work of Barbara Partee. Partee developed a linguistically plausible system which incorporated the key insights of both Montague Grammar and transformational grammar. Early research in linguistic formal semantics used Partee's system to achieve a wealth of empirical and conceptual results. [21] Later work by Irene Heim, Angelika Kratzer, Tanya Reinhart, Robert May, and others built on Partee's work to further reconcile it with the generative approach to syntax. The resulting framework is known as the Heim and Kratzer system, after the authors of the textbook Semantics in Generative Grammar which first codified and popularized it. The Heim and Kratzer system differs from earlier approaches in that it incorporates a level of syntactic representation called logical form which undergoes semantic interpretation. Thus, this system often includes syntactic representations and operations which were introduced by translation rules in Montague's system. [22] [21] However, others such as Gerald Gazdar proposed models of the syntax–semantics interface which stayed closer to Montague's, providing a system of interpretation in which denotations could be computed on the basis of surface structures. These approaches live on in frameworks such as categorial grammar and combinatory categorial grammar. [23] [21]

Cognitive semantics emerged as a reaction against formal semantics, but there have recently been several attempts at reconciling the two positions. [24]

Related Research Articles

In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.

In linguistics and philosophy, the denotation of an expression is its literal meaning. For instance, the English word "warm" denotes the property of having high temperature. Denotation is contrasted with other aspects of meaning including connotation. For instance, the word "warm" may evoke calmness or coziness, but these associations are not part of the word's denotation. Similarly, an expression's denotation is separate from pragmatic inferences it may trigger. For instance, describing something as "warm" often implicates that it is not hot, but this is once again not part of the word's denotation.

Montague grammar is an approach to natural language semantics, named after the American logician Richard Montague. It is based on mathematical logic, especially higher-order predicate logic and the lambda calculus, and makes use of intensional logic, via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.

In semantics, mathematical logic and related disciplines, the principle of compositionality is the principle that the meaning of a complex expression is determined by the meanings of its constituent expressions and the rules used to combine them. The principle is also called Frege's principle, because Gottlob Frege is widely credited for the first modern formulation of it. However, the principle has never been explicitly stated by Frege, and arguably it was already assumed by George Boole decades before Frege's work.

Irene Roswitha Heim is a linguist and a leading specialist in semantics. She was a professor at the University of Texas at Austin and UCLA before moving to the Massachusetts Institute of Technology in 1989, where she is Professor Emerita of Linguistics. She served as Head of the Linguistics Section of the Department of Linguistics and Philosophy.

The term predicate is used in two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other defines it as only the main content verb or associated predicative expression of a clause. Thus, by the first definition, the predicate of the sentence Frank likes cake is likes cake, while by the second definition, it is only the content verb likes, and Frank and cake are the arguments of this predicate. The conflict between these two definitions can lead to confusion.

Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".

Barbara Hall Partee is a Distinguished University Professor Emerita of Linguistics and Philosophy at the University of Massachusetts Amherst (UMass).

In generative grammar and related approaches, the logical form (LF) of a linguistic expression is the variant of its syntactic structure which undergoes semantic interpretation. It is distinguished from phonetic form, the structure which corresponds to a sentence's pronunciation. These separate representations are postulated in order to explain the ways in which an expression's meaning can be partially independent of its pronunciation, e.g. scope ambiguities.

Angelika Kratzer is a professor emerita of linguistics in the Department of Linguistics at the University of Massachusetts Amherst.

In situation theory, situation semantics attempts to provide a solid theoretical foundation for reasoning about common-sense and real-world situations, typically in the context of theoretical linguistics, theoretical philosophy, or applied natural language processing.

In semantics, donkey sentences are sentences that contain a pronoun with clear meaning but whose syntactic role in the sentence poses challenges to linguists. Such sentences defy straightforward attempts to generate their formal language equivalents. The difficulty is with understanding how English speakers parse such sentences.

In logic and linguistics, an expression is syncategorematic if it lacks a denotation but can nonetheless affect the denotation of a larger expression which contains it. Syncategorematic expressions are contrasted with categorematic expressions, which have their own denotations.

Dynamic semantics is a framework in logic and natural language semantics that treats the meaning of a sentence as its potential to update a context. In static semantics, knowing the meaning of a sentence amounts to knowing when it is true; in dynamic semantics, knowing the meaning of a sentence means knowing "the change it brings about in the information state of anyone who accepts the news conveyed by it." In dynamic semantics, sentences are mapped to functions called context change potentials, which take an input context and return an output context. Dynamic semantics was originally developed by Irene Heim and Hans Kamp in 1981 to model anaphora, but has since been applied widely to phenomena including presupposition, plurals, questions, discourse relations, and modality.

In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar which are especially designed to produce grammatically correct strings of words; or the likes of Functional Discourse Grammar which builds on predicate logic.

In linguistics, an expression is semantically ambiguous when it can have multiple meanings. The more meanings a word has, the higher its degree of ambiguity. Like other kinds of ambiguity, semantic ambiguities are often resolved by context or by prosody. Comprehension of a sentence containing a semantically ambiguous word is strongly influenced by the general structure of the sentence. The language itself is sometimes a contributing factor, in the sense that the level of ambiguity in a context can change depending on whether or not a language boundary is crossed.

In formal semantics, the scope of a semantic operator is the semantic object to which it applies. For instance, in the sentence "Paulina doesn't drink beer but she does drink wine," the proposition that Paulina drinks beer occurs within the scope of negation, but the proposition that Paulina drinks wine does not. Scope can be thought of as the semantic order of operations.

In formal semantics, a type shifter is an interpretation rule that changes an expression's semantic type. For instance, the English expression "John" might ordinarily denote John himself, but a type shifting rule called Lift can raise its denotation to a function which takes a property and returns "true" if John himself has that property. Lift can be seen as mapping an individual onto the principal ultrafilter that it generates.

  1. Without type shifting: ⟦John⟧ = j (an individual)
  2. Type shifting with Lift: LIFT(⟦John⟧) = λP. P(j) (a function from properties to truth values)
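
Written as a typed functional program, Lift is simply the operation that maps an individual to the function that applies a property to that individual. The sketch below is an illustration under hypothetical assumptions about the individuals and the smokes predicate; the names are not drawn from any particular analysis.

  -- A toy domain of individuals (an illustrative assumption).
  data Individual = John | Mary deriving (Eq, Show)

  type Property = Individual -> Bool

  -- Without type shifting, "John" simply denotes the individual John.
  john :: Individual
  john = John

  -- Lift maps an individual x to the set of properties true of x
  -- (the principal ultrafilter that x generates).
  lift :: Individual -> (Property -> Bool)
  lift x = \p -> p x

  smokes :: Property
  smokes John = True    -- stipulated toy facts
  smokes Mary = False

  main :: IO ()
  main = print (lift john smokes)  -- True, since John smokes in the toy model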

In linguistics, the syntax–semantics interface is the interaction between syntax and semantics. Its study encompasses phenomena that pertain to both syntax and semantics, with the goal of explaining correlations between form and meaning. Specific topics include scope, binding, and lexical semantic properties such as verbal aspect and nominal individuation, semantic macroroles, and unaccusativity.

References

  1. Lewis, David (December 1970). "General Semantics". Synthese. 22 (1/2): 18–67. doi:10.1007/BF00413598. S2CID 14877324.
  2. Seth Yalcin (2014). "Semantics and metasemantics in the context of generative grammar". In Alexis Burgess; Brett Sherman (eds.). Metasemantics: new essays on the foundations of meaning. Oxford University Press. ISBN   9780199669592.
  3. Borg, Emma (2004). Minimal semantics. Oxford University Press. ISBN   978-0199206926.
  4. Irene Heim; Angelika Kratzer (1998). Semantics in generative grammar. Wiley-Blackwell. ISBN   978-0-631-19713-3.
  5. Stefano Predelli (2013). Meaning without truth. Oxford Scholarship. ISBN   9780199695638.
  6. Ciardelli, Ivano; Groenendijk, Jeroen; Roelofsen, Floris (2019). Inquisitive Semantics (PDF). Oxford University Press.
  7. Veltman, Frank (1996). "Defaults in Update Semantics" (PDF). Journal of Philosophical Logic. 25 (3). doi:10.1007/BF00248150. S2CID   19377671.
  8. Paul Pietroski (2018). Conjoining meanings. Oxford University Press. ISBN   9780198812722.
  9. Irene Heim; Angelika Kratzer (1998). Semantics in generative grammar. Wiley-Blackwell. pp. 2–3, 14–22. ISBN   978-0-631-19713-3.
  10. Kroeger, Paul (2019). Analyzing Meaning. Language Science Press. pp. 217–219. ISBN   978-3-96110-136-8.
  11. Coppock, Elizabeth; Champollion, Lucas (2019). Invitation to Formal Semantics (PDF). Manuscript. p. 42.
  12. Heim, Irene; Kratzer, Angelika (1998). Semantics in Generative Grammar. Oxford: Wiley Blackwell. pp. 194–198.
  13. Ruys, Eddy; Winter, Yoad (2011). "Quantifier scope in formal linguistics." (PDF). In Gabbay, Dov; Guenthner, Franz (eds.). Handbook of Philosophical Logic (2 ed.). Dordrecht: Springer. pp. 159–225. doi:10.1007/978-94-007-0479-4_3. ISBN   978-94-007-0478-7.
  14. Barker, Chris (2015). "Scope" (PDF). In Lappin, Shalom; Fox, Chris (eds.). Handbook of Contemporary Semantics (2 ed.). Wiley Blackwell. Section 4.3. doi:10.1002/9781118882139.ch2. ISBN   9781118882139.
  15. Szabolcsi, Anna (2010). Quantification. Cambridge University Press. p. 92.
  16. Portner, Paul (2009). Modality. Oxford: Oxford University Press. ISBN   978-0-19-929242-4.
  17. Kaufmann, S.; Condoravdi, C.; Harizanov, V. (2006). "Formal approaches to modality". In Frawley, W. (ed.). The Expression of Modality. Berlin, New York: Mouton de Gruyter.
  18. Starr, Will (2019). "Supplement to "Counterfactuals": Indicative and Subjunctive Conditionals". In Zalta, Edward N. (ed.). The Stanford Encyclopedia of Philosophy.
  19. Barwise, Jon; Cooper, Robin (1981). "Generalized quantifiers and natural language". In Kulas, J; Fetzer, J.H.; Rankin, T.L. (eds.). Philosophy, Language, and Artificial Intelligence. Studies in Cognitive Systems. Vol. 2. Springer. pp. 241–301. doi:10.1007/978-94-009-2727-8_10. ISBN   978-94-010-7726-2. S2CID   62189594.
  20. For a very readable and succinct overview of how formal semantics found its way into linguistics, see The formal approach to meaning: Formal semantics and its recent developments by Barbara Abbott. In: Journal of Foreign Languages (Shanghai), 119:1 (January 1999), 2–20.
  21. Partee, Barbara (2011). "Formal semantics: Origins, issues, early impact". The Baltic International Yearbook of Cognition, Logic and Communication. 6. CiteSeerX 10.1.1.826.5720.
  22. Crnič, Luka; Pesetsky, David; Sauerland, Uli (2014). "Introduction: Biographical Notes" (PDF). In Crnič, Luka; Sauerland, Uli (eds.). The art and craft of semantics: A Festschrift for Irene Heim.
  23. Michael Moortgat (1988). Categorial investigations: logical and linguistic aspects of the Lambek calculus. Walter de Gruyter. ISBN   978-90-6765-387-9 . Retrieved 5 April 2011.
  24. Hamm, Fritz; Kamp, Hans; Lambalgen, Michiel van (2006-09-01). "There is no opposition between Formal and Cognitive Semantics". Theoretical Linguistics. 32 (1): 1–40. CiteSeerX   10.1.1.80.6574 . doi:10.1515/tl.2006.001. ISSN   1613-4060. S2CID   17691054.

Further reading