Principle of compositionality

In semantics, mathematical logic, and related disciplines, the principle of compositionality is the principle that the meaning of a complex expression is determined by the meanings of its constituent expressions and the rules used to combine them. The principle is also called Frege's principle, because Gottlob Frege is widely credited with its first modern formulation. However, Frege never explicitly stated the principle, [1] and it was arguably already assumed by George Boole [2] decades before Frege's work.

The principle of compositionality (also known as semantic compositionalism) is highly debated in linguistics. Among its most challenging problems are the issues of contextuality, the non-compositionality of idiomatic expressions, and the non-compositionality of quotations. [3]

History

Discussion of compositionality began to appear at the beginning of the 19th century, when it was debated whether compositionality or contextuality was more fundamental in language; compositionality was usually preferred. [4] Gottlob Frege never adhered to the principle of compositionality as it is known today (he endorsed the context principle instead), and the first to formulate it explicitly was Rudolf Carnap in 1947. [4]

Overview

A common formulation [4] of the principle of compositionality comes from Barbara Partee: "The meaning of a compound expression is a function of the meanings of its parts and of the way they are syntactically combined." [5]

It is possible to distinguish different levels of compositionality. Strong compositionality holds when the meaning of a compound expression is determined by the meanings of its immediate parts together with a top-level syntactic function that describes their combination. Weak compositionality holds when the meaning of a compound expression is determined by the meanings of its parts together with their complete syntactic combination. [6] [7] There can also be further gradations between these two extremes, obtained by allowing not only the meanings of the immediate parts but also the meanings of the second-highest parts (third-highest parts, fourth-highest parts, and so on), together with functions that describe their respective combinations. [7]
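
Schematically, strong compositionality can be stated as a homomorphism condition on the meaning function. The rendering below is only illustrative (a common textbook-style schema rather than a quotation from the sources cited here): if μ is the meaning function and α is a syntactic rule combining the immediate parts e1, …, en, then there must be a corresponding semantic operation rα such that

```latex
% Illustrative schema of (strong) compositionality; the notation is an
% assumption of this sketch, not drawn verbatim from the cited sources.
\[
  \mu(\alpha(e_1, \dots, e_n)) = r_\alpha(\mu(e_1), \dots, \mu(e_n))
\]
```

The weaker notions described above relax this by letting the meaning of the whole depend on all of the expression's parts and its complete syntactic structure, rather than only on the immediate parts and the top-level rule.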

On the sentence level, the principle claims that the rules of composition are what remains once the lexical parts of a meaningful sentence are removed. The sentence "Socrates was a man", for example, becomes "S was a M" once the meaningful lexical items ("Socrates" and "man") are taken away. The task of finding the rules of composition then becomes a matter of describing the connection between S and M.

Among the most prominent linguistic problems that challenge the principle of compositionality are the issues of contextuality, the non-compositionality of idiomatic expressions, and the non-compositionality of quotations. [3]

The principle is frequently taken to mean that every operation of the syntax should be associated with an operation of the semantics that acts on the meanings of the constituents combined by the syntactic operation. As a guideline for constructing semantic theories, this is generally taken, as in the influential work on the philosophy of language by Donald Davidson, to mean that every construct of the syntax should be associated by a clause of the T-schema with an operator in the semantics that specifies how the meaning of the whole expression is built from the constituents combined by the syntactic rule. In some general mathematical theories (especially those in the tradition of Montague grammar), this guideline is taken to mean that the interpretation of a language is essentially given by a homomorphism between an algebra of syntactic representations and an algebra of semantic objects.
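
To make the homomorphism picture concrete, the following minimal sketch pairs each syntactic rule of a toy fragment with exactly one semantic operation. The lexicon, rule names, and set-theoretic denotations are illustrative assumptions of this sketch, not a fragment taken from Montague's or Davidson's own work.

```python
# Toy compositional fragment: interpretation is a homomorphism from
# syntax trees to semantic values, because the meaning of each tree is
# computed only from the meanings of its immediate parts and the
# semantic operation paired with its top-level syntactic rule.
# All names and denotations below are illustrative only.

# Lexicon: proper names denote individuals, predicates denote sets
# of individuals (their extensions).
LEXICON = {
    "Socrates": "socrates",
    "Plato": "plato",
    "was a man": {"socrates", "plato"},
    "wrote dialogues": {"plato"},
}

def combine_subject_predicate(subject, predicate):
    """Semantic operation for the rule S -> NP VP: the sentence is true
    iff the subject's denotation is in the predicate's extension."""
    return subject in predicate

# Exactly one semantic operation per syntactic rule.
SEMANTIC_OPS = {"S": combine_subject_predicate}

def interpret(tree):
    """Compute the meaning of an expression from the meanings of its parts."""
    if isinstance(tree, str):            # lexical item: look up its meaning
        return LEXICON[tree]
    rule, *parts = tree                  # e.g. ("S", "Socrates", "was a man")
    return SEMANTIC_OPS[rule](*(interpret(part) for part in parts))

print(interpret(("S", "Socrates", "was a man")))        # True
print(interpret(("S", "Socrates", "wrote dialogues")))  # False
```

Stripping out the two lexical look-ups leaves exactly the combination rule, mirroring the "S was a M" schema above: the connection between S and M is the semantic operation paired with the rule S → NP VP.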

The principle of compositionality also exists in a similar form in the semantics of programming languages, where the meaning of a compound program expression is taken to be determined by the meanings of its subexpressions and the way they are combined.
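
As a minimal sketch of what this means in practice (the toy expression language below is an assumption of the example, not any particular language's production semantics), the evaluator that follows is compositional because the value of a compound expression depends only on the values of its subexpressions and the operator that combines them, never on their internal syntactic form.

```python
# Compositional, denotational-style evaluator for a tiny arithmetic
# language. The meaning of ("add", e1, e2) is a function of the
# meanings of e1 and e2 alone; their internal structure is never inspected.

OPERATIONS = {
    "add": lambda x, y: x + y,
    "mul": lambda x, y: x * y,
    "neg": lambda x: -x,
}

def meaning(expr):
    """Map an expression tree to its denotation (here, a number)."""
    if isinstance(expr, (int, float)):   # literals denote themselves
        return expr
    op, *args = expr
    return OPERATIONS[op](*(meaning(arg) for arg in args))

# (2 + 3) * (-4)  ==  -20
print(meaning(("mul", ("add", 2, 3), ("neg", 4))))
```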

Critiques

The principle of compositionality has been the subject of intense debate. Indeed, there is no general agreement as to how the principle is to be interpreted, although there have been several attempts to provide formal definitions of it. [8]

Scholars are also divided as to whether the principle should be regarded as a factual claim, open to empirical testing; an analytic truth, obvious from the nature of language and meaning; or a methodological principle to guide the development of theories of syntax and semantics. The principle has been attacked in all three spheres, although so far none of the criticisms brought against it have been generally regarded as compelling. Most proponents of the principle, however, make certain exceptions for idiomatic expressions in natural language. [8]

The principle of compositionality usually holds when only syntactic factors contribute to the increased complexity of sentence processing, while it becomes more problematic and questionable when the increase in complexity is due to sentence or discourse context, semantic memory, or sensory cues. [9] Among the phenomena that are problematic for traditional theories of compositionality is logical metonymy, which has been studied at least since the mid-1990s by the linguists James Pustejovsky and Ray Jackendoff. [10] [11] [12] Logical metonymies are sentences like "John began the book", where the verb "to begin" requires (subcategorizes for) an event as its argument, but an object (the book) appears instead, forcing the hearer to interpret the sentence by inferring an implicit event ("reading", "writing", or another prototypical action performed on a book). [10] The problem for compositionality is that the meaning of reading or writing is not present in the words of the sentence: it appears neither in "begin" nor in "book".

Further, in the context of the philosophy of language, the principle of compositionality does not explain all of meaning. For example, sarcasm cannot be inferred purely on the basis of words and their composition, yet a phrase used sarcastically means something quite different from the same phrase uttered straightforwardly. Thus, some theorists argue that the principle has to be revised to take into account linguistic and extralinguistic context, which includes the tone of voice used, common ground between the speakers, the intentions of the speaker, and so on. [8]

Notes

  1. Pelletier, Francis Jeffry (2001). "Did Frege Believe Frege's Principle?". Journal of Logic, Language and Information. 10 (1): 87–114. doi:10.1023/A:1026594023292.
  2. Boole, George (1854). An Investigation of the Laws of Thought: On Which Are Founded the Mathematical Theories of Logic and Probabilities. Walton and Maberly.
  3. Pelletier (2016), section "12 This Chapter".
  4. Janssen, Theo (2012). "Compositionality: Its Historic Context". The Oxford Handbook of Compositionality: 19–46. doi:10.1093/oxfordhb/9780199541072.013.0001.
  5. Partee, Barbara (1984). "Compositionality". Varieties of Formal Semantics. 3: 281–311.
  6. Coopmans, Cas W.; Kaushik, Karthikeya; Martin, Andrea E. (2023). "Hierarchical structure in language and action: A formal comparison". Psychological Review. 130 (4): 935–952. doi:10.1037/rev0000429. ISSN 1939-1471.
  7. Pagin, Peter; Westerståhl, Dag (2010). "Compositionality I: Definitions and Variants". Philosophy Compass. 5 (3): 250–264. doi:10.1111/j.1747-9991.2009.00228.x.
  8. Szabó, Zoltán Gendler (2012). "Compositionality". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy. First published 8 April 2004; substantive revision 7 December 2012.
  9. Baggio et al. (2012), Conclusions.
  10. Chersoni, E.; Lenci, A.; Blache, P. (2017). "Logical metonymy in a distributional model of sentence comprehension". Sixth Joint Conference on Lexical and Computational Semantics (*SEM 2017): 168–177.
  11. Pustejovsky, James (1995). The Generative Lexicon. Cambridge, MA: The MIT Press.
  12. Jackendoff, Ray (1997). The Architecture of the Language Faculty. Cambridge, MA: The MIT Press.
