Generative semantics

Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".

Generative semantics is no longer practiced under that name, though many of its central ideas have blossomed in the cognitive linguistics tradition. It is also regarded as a key part of the intellectual heritage of head-driven phrase structure grammar (HPSG) and construction grammar, and some of its insights live on in mainstream generative grammar. Pieter Seuren has developed a semantic syntax that is very close in spirit to the original generative semantics framework, which he played a role in developing.[1][2]

Interpretive or generative?

The controversy surrounding generative semantics stemmed in part from the competition between two fundamentally different approaches to semantics within transformational generative syntax. In the 1960s, work in the generative tradition assumed that semantics was interpretive in the sense that the meaning of a sentence was computed on the basis of its syntactic structure rather than the other way around. In these approaches, syntactic structures were generated by rules stated in terms of syntactic structure alone, with no reference to meaning. Once generated, these structures would serve as the input to a semantic computation which would output a denotation. This approach captured the relationship between syntactic and semantic patterns, while allowing the syntax to work independently of the semantics, as Chomsky and others had argued for on the basis of empirical observations such as the famous "colorless green ideas sleep furiously" sentence.

The generative semantics framework took the opposite view, positing that syntactic structures are computed on the basis of meanings. In this approach, meanings were generated directly by the grammar as deep structures and were subsequently transformed into recognizable sentences by transformations. This approach necessitated more complex underlying structures than those proposed by Chomsky, and thus more complex transformations. Despite this additional complexity, the approach was appealing in several respects. First, it offered a powerful mechanism for explaining synonymy. In his initial work in generative syntax, Chomsky motivated transformations using active/passive pairs such as "I hit John" and "John was hit by me", which have different surface forms despite their identical truth conditions. Generative semanticists wanted to account for all cases of synonymy in a similar fashion, which proved to be a challenge given the tools available at the time. In a famous example of this style of analysis, McCawley treated the verb "kill" as derived from an underlying structure built from the semantic predicates CAUSE, BECOME, NOT, and ALIVE, so that "kill x" and "cause x to become not alive" share a single deep structure. Second, the theory had a pleasingly intuitive structure: the form of a sentence was quite literally derived from its meaning via transformations. To some, interpretive semantics seemed rather "clunky" and ad hoc in comparison, especially before the development of trace theory.

Despite its opposition to generative grammar, the generative semantics project operated largely in Chomskyan terms. Most importantly, the generative semanticists, following Chomsky, were opposed to behaviorism and accepted his idea that language is acquired rather than learned.[3] Chomsky and Lakoff were also united by their opposition to the establishment of formal semantics in the 1970s.[4] The notion that meaning generates grammar is itself old, being fundamental to the Port-Royal Grammar (1660), Saussure's Course in General Linguistics (1916), and Tesnière's dependency grammar (1957), among others. By contrast, generative semantics faced the problem of explaining the emergence of meaning in neurobiological rather than social and rational terms. Lakoff addressed this problem in the 1980s in his version of cognitive linguistics, according to which language emerges from sensory experience: engaging with the physical world provides the person with visual, tactile, and other sensory input, which crystallizes into language in the form of conceptual metaphors that organize rational thinking.[5] This view of the mind has not been fully accepted by neuroscientists.[6]

Notes

^ There is little agreement concerning the question of whose idea generative semantics was. All of the people mentioned here have been credited with its invention (often by each other).

^ Strictly speaking, it was not the fact that active/passive pairs are synonymous that motivated the passive transformation, but the fact that active and passive verb forms have the same selectional requirements. For example, the agent of the verb kick (i.e. the thing that's doing the kicking) must be animate whether it is the subject of the active verb (as in "John kicked the ball") or appears in a by phrase after the passive verb ("The ball was kicked by John").


References

  1. Newmeyer, Frederick J. (1986). Linguistic Theory in America (2nd ed.). Academic Press. See p. 138.
  2. Seuren, Pieter (28 January 2021). "Essentials of Semantic Syntax: an Appetiser". Cadernos de Linguística. 2 (1): 1–20. doi:10.25189/2675-4916.2021.V2.N1.ID290. hdl:21.11116/0000-0007-DAE7-F. Retrieved 27 March 2022.
  3. Harris, Randy Allen (15 October 2021). The Linguistics Wars: Chomsky, Lakoff, and the Battle over Deep Structure. Oxford University Press. ISBN 978-0-19-974033-8.
  4. Partee, Barbara (2011). "Formal Semantics: Origins, Issues, Early Impact". The Baltic International Yearbook of Cognition, Logic and Communication. 6: 1–52. doi:10.4148/biyclc.v6i0.1580.
  5. Lakoff, George (1990). "Invariance hypothesis: is abstract reasoning based on image-schemas?". Cognitive Linguistics. 1 (1): 39–74. doi:10.1515/cogl.1990.1.1.39. S2CID 144380802.
  6. Freeman, Jeremy (2008). "Mind Games". 9 (Jul 03).