Type shifter

In formal semantics, a type shifter is an interpretation rule which changes an expression's semantic type. For instance, while the English expression "John" might ordinarily denote John himself, a type shifting rule called Lift can raise its denotation to a function which takes a property and returns "true" if John himself has that property. Lift can be seen as mapping an individual onto the principal ultrafilter which it generates. [1] [2] [3]

  1. Without type shifting: ⟦John⟧ = John (an individual, type e)
  2. Type shifting with Lift: ⟦John⟧ = λP. P(John) (a generalized quantifier, type ⟨⟨e,t⟩,t⟩)
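The effect of Lift can be sketched in a few lines of Haskell. This is a minimal illustration rather than anything drawn from the cited sources: the type Entity, the individual constant john, and the property smokes are invented stand-ins for the semantic types e, ⟨e,t⟩, and ⟨⟨e,t⟩,t⟩.

```haskell
-- A toy model of semantic types: entities and truth values.
data Entity = John | Mary deriving (Eq, Show)

type T        = Bool           -- type t: truth values
type Property = Entity -> T    -- type <e,t>: properties of individuals
type GQ       = Property -> T  -- type <<e,t>,t>: generalized quantifiers

-- Without type shifting, "John" denotes the individual himself.
john :: Entity
john = John

-- Lift shifts an individual to a function from properties to truth values:
-- the generalized quantifier that holds of exactly the properties the
-- individual has (the principal ultrafilter it generates).
lift :: Entity -> GQ
lift x = \p -> p x

-- An invented property, for demonstration.
smokes :: Property
smokes x = x == John

main :: IO ()
main = print (lift john smokes)  -- True
```

On this encoding, lift john is the characteristic function of the set of John's properties, i.e. the principal ultrafilter generated by John.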

Type shifters were proposed by Barbara Partee and Mats Rooth in 1983 to allow for systematic type ambiguity. Work of this period assumed that syntactic categories corresponded directly with semantic types, so researchers had to "generalize to the worst case" whenever particular uses of an expression from a given category required an especially high type. Moreover, Partee argued that the evidence in fact supported treating expressions as having different types in different contexts. Thus, she and Rooth proposed type shifting as a principled mechanism for generating this ambiguity. [1] [2] [3]

Type shifters remain a standard tool in formal semantic work, particularly in categorial grammar and related frameworks. They have also been used to interpret quantifiers in object position and to capture scope ambiguities, serving in this regard as an alternative to syntactic operations such as quantifier raising, which are used in mainstream generative approaches to semantics. [4] [5] Type shifters have likewise been used to generate and compose alternative sets without fully adopting an alternative-based semantics. [6] [7]
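As an illustration of the quantifiers-in-object-position point, here is a hedged Haskell sketch in the same style as above. The shifter raiseObject is a simplified stand-in, in the spirit of argument-raising analyses, and is not taken from the cited sources; the lexicon (read', books, everyBook) is invented for the example.

```haskell
-- Toy semantic types, repeated so the sketch is self-contained.
data Entity = John | WarAndPeace | Ulysses deriving (Eq, Show)

type T        = Bool
type Property = Entity -> T
type GQ       = Property -> T

-- A transitive verb of type <e,<e,t>>: object argument first, then subject.
-- (The lexicon here is a made-up illustration.)
read' :: Entity -> Entity -> T
read' obj subj = (subj, obj) `elem` [(John, WarAndPeace), (John, Ulysses)]

books :: [Entity]
books = [WarAndPeace, Ulysses]

-- "every book": a generalized quantifier of type <<e,t>,t>.
everyBook :: GQ
everyBook p = all p books

-- Argument raising: shift the verb so its object slot takes a generalized
-- quantifier rather than an individual, letting "read every book" compose
-- in situ, without syntactic quantifier raising.
raiseObject :: (Entity -> Entity -> T) -> (GQ -> Entity -> T)
raiseObject verb q subj = q (\obj -> verb obj subj)

main :: IO ()
main = print (raiseObject read' everyBook John)  -- True: John read every book
```

Because the shift applies in the semantics, the object quantifier takes scope over the verb with no movement in the syntax.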

Notes

  1. Partee, Barbara; Rooth, Mats (1983). "Generalized conjunction and type ambiguity" (PDF). In Portner, Paul; Partee, Barbara (eds.). Formal semantics: The essential readings. Wiley. pp. 334–356. doi:10.1002/9780470758335.ch14. ISBN 9780470758335.
  2. Partee, Barbara (1983). "Noun phrase interpretation and type shifting principles" (PDF). In Portner, Paul; Partee, Barbara (eds.). Formal semantics: The essential readings. Wiley. pp. 357–381. doi:10.1002/9780470758335.ch15. ISBN 9780470758335.
  3. Heim, Irene; Kratzer, Angelika (1998). Semantics in Generative Grammar. Oxford: Wiley Blackwell. Chapter 4.
  4. Heim, Irene; Kratzer, Angelika (1998). Semantics in Generative Grammar. Oxford: Wiley Blackwell. pp. 184–188.
  5. Jacobson, Pauline (2014). Compositional semantics: An introduction to the syntax/semantics interface. Oxford University Press. ISBN 9780199677153.
  6. Charlow, Simon (2015). "Alternatives via scope" (PDF). Unpublished course notes.
  7. Charlow, Simon (2020). "The scope of alternatives: Indefiniteness and islands". Linguistics and Philosophy. 43 (4): 427–472. doi:10.1007/s10988-019-09278-3. S2CID 254749307.
