Construction grammar (often abbreviated CxG) is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words (aardvark, avocado), morphemes (anti-, -ing), fixed expressions and idioms (by and large, jog X's memory), and abstract grammatical rules such as the passive voice (The cat was hit by a car) or the ditransitive (Mary gave Alex the ball). Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form. [1]
Advocates of construction grammar argue that language and culture are not designed by people, but are 'emergent' or automatically constructed in a process which is comparable to natural selection in species [2] [3] [4] [5] or the formation of natural constructions such as nests made by social insects. [6] Constructions correspond to replicators or memes in memetics and other cultural replicator theories. [7] [8] [5] [9] It is argued that construction grammar is not an original model of cultural evolution, but is in all essential respects the same as memetics. [10] Construction grammar is associated with concepts from cognitive linguistics that aim to show in various ways how human rational and creative behaviour is automatic and not planned. [11] [6]
Construction grammar was first developed in the 1980s by linguists such as Charles Fillmore, Paul Kay, and George Lakoff, in order to analyze idioms and fixed expressions. [12] Lakoff's 1977 paper "Linguistic Gestalts" put forward an early version of CxG, arguing that the meaning of an expression was not simply a function of the meanings of its parts. Instead, he suggested, constructions themselves must have meanings.
Another early study was "There-Constructions," which appeared as Case Study 3 in George Lakoff's Women, Fire, and Dangerous Things. [13] It argued that the meaning of the whole was not a function of the meanings of the parts, that odd grammatical properties of Deictic There-constructions followed from the pragmatic meaning of the construction, and that variations on the central construction could be seen as simple extensions using form-meaning pairs of the central construction.
Fillmore et al.'s (1988) paper on the English let alone construction was a second classic. These two papers propelled cognitive linguists into the study of CxG. Since the late 1990s there has been a shift towards a general preference for the usage-based model. [citation needed] The shift towards the usage-based approach in construction grammar has inspired the development of several corpus-based methodologies of constructional analysis (for example, collostructional analysis).
One of the most distinctive features of CxG is its use of multi-word expressions and phrasal patterns as the building blocks of syntactic analysis. [14] One example is the Correlative Conditional construction, found in the proverbial expression The bigger they come, the harder they fall. [15] [16] [17] Construction grammarians point out that this is not merely a fixed phrase; the Correlative Conditional is a general pattern (The Xer, the Yer) with "slots" that can be filled by almost any comparative phrase (e.g. The more you think about it, the less you understand). Advocates of CxG argue these kinds of idiosyncratic patterns are more common than is often recognized, and that they are best understood as multi-word, partially filled constructions. [1]
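The slot-and-frame character of such patterns can be sketched computationally. The following toy parser is our own illustration, not an analysis from the CxG literature: it treats the Correlative Conditional as fixed material ("The ..., the ...") plus two open slots that must each begin with a comparative.

```python
import re

# Fixed shell of the construction with two open slots X and Y.
SHELL = re.compile(r"^[Tt]he (?P<x>[^,]+), the (?P<y>.+?)\.?$")

def is_comparative(phrase: str) -> bool:
    """Crude check that a slot filler begins with a comparative word."""
    first = phrase.split()[0].lower()
    return first.endswith("er") or first in ("more", "less")

def parse_correlative(sentence: str):
    """Return the X and Y slot fillers, or None if the sentence
    does not instantiate the Correlative Conditional."""
    m = SHELL.match(sentence)
    if m and is_comparative(m.group("x")) and is_comparative(m.group("y")):
        return m.group("x"), m.group("y")
    return None
```

On this sketch, parse_correlative("The bigger they come, the harder they fall.") recovers the two filled slots, while an ordinary sentence like "The cat sat on the mat." is rejected because the fixed material and slot constraints are not satisfied.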
Construction grammar rejects the idea that there is a sharp dichotomy between lexical items, which are arbitrary and specific, and grammatical rules, which are completely general. Instead, CxG posits that there are linguistic patterns at every level of generality and specificity: from individual words, to partially filled constructions (e.g. drive X crazy), to fully abstract rules (e.g. subject–auxiliary inversion). All of these patterns are recognized as constructions. [18]
In contrast to theories that posit an innate universal grammar for all languages, construction grammar holds that speakers learn constructions inductively as they are exposed to them, using general cognitive processes. It is argued that children pay close attention to each utterance they hear, and gradually make generalizations based on the utterances they have heard. Because constructions are learned, they are expected to vary considerably across different languages. [19]
In construction grammar, as in general semiotics, the grammatical construction is a pairing of form and content. The formal aspect of a construction is typically described as a syntactic template, but the form covers more than just syntax, as it also involves phonological aspects, such as prosody and intonation. The content covers semantic as well as pragmatic meaning.
The semantic meaning of a grammatical construction is made up of conceptual structures postulated in cognitive semantics: image-schemas, frames, conceptual metaphors, conceptual metonymies, prototypes of various kinds, mental spaces, and bindings across these (called "blends"). Pragmatics just becomes the cognitive semantics of communication—the modern version of the old Ross-Lakoff performative hypothesis from the 1960s.
The form and content are symbolically linked in the sense advocated by Langacker.
Thus a construction is treated like a sign in which all structural aspects are integrated parts and not distributed over different modules as they are in the componential model. Consequentially, not only constructions that are lexically fixed, like many idioms, but also more abstract ones like argument structure schemata, are pairings of form and conventionalized meaning. For instance, the ditransitive schema [S V IO DO] is said to express semantic content X CAUSES Y TO RECEIVE Z, just like kill means X CAUSES Y TO DIE.
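The pairing of a form template with a schematic, conventionalized meaning can be illustrated with a toy data structure. The class and field names below are our own assumptions for illustration, not notation from any CxG framework.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Construction:
    name: str
    form: tuple    # syntactic template, here as slot labels
    meaning: str   # schematic semantics with open variables

# The ditransitive schema from the text: [S V IO DO] paired with
# the meaning X CAUSES Y TO RECEIVE Z.
DITRANSITIVE = Construction(
    name="ditransitive",
    form=("S", "V", "IO", "DO"),
    meaning="X CAUSES Y TO RECEIVE Z",
)

def gloss(cxn: Construction, **fillers: str) -> str:
    """Instantiate the schematic meaning by filling its variables,
    e.g. for 'Mary gave Alex the ball'."""
    out = cxn.meaning
    for var, value in fillers.items():
        out = out.replace(var, value)
    return out
```

For instance, gloss(DITRANSITIVE, X="Mary", Y="Alex", Z="the ball") yields "Mary CAUSES Alex TO RECEIVE the ball", mirroring the claim that the schema itself, not just the verb, contributes the transfer meaning.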
In construction grammar, a grammatical construction, regardless of its formal or semantic complexity and make up, is a pairing of form and meaning. Thus words and word classes may be regarded as instances of constructions. Indeed, construction grammarians argue that all pairings of form and meaning are constructions, including phrase structures, idioms, words and even morphemes.
Unlike the componential model, construction grammar denies any strict distinction between lexicon and syntax and proposes a syntax–lexicon continuum. [20] The argument goes that words and complex constructions are both pairs of form and meaning and differ only in internal symbolic complexity. Instead of being discrete modules and thus subject to very different processes, they form the extremes of a continuum (from regular to idiosyncratic): syntax > subcategorization frame > idiom > morphology > syntactic category > word/lexicon (these are the traditional terms; construction grammars use a different terminology).
In construction grammar, the grammar of a language is made up of taxonomic networks of families of constructions, which are based on the same principles as those of the conceptual categories known from cognitive linguistics, such as inheritance, prototypicality, extensions, and multiple parenting.
Four different models have been proposed in relation to how information is stored in these taxonomies.
Because construction grammar does not operate with surface derivations from underlying structures, it adheres to functionalist linguist Dwight Bolinger's principle of no synonymy, on which Adele Goldberg elaborates in her book. [21]
This means that construction grammarians argue, for instance, that active and passive versions of the same proposition are not derived from an underlying structure, but are instances of two different constructions. As constructions are pairings of form and meaning, [22] active and passive versions of the same proposition are not synonymous, but display differences in content: in this case the pragmatic content.
As mentioned above, construction grammar is a "family" of theories rather than one unified theory, and a number of formalized construction grammar frameworks have been developed. Some of these are:
Berkeley Construction Grammar (BCG; formerly also called simply Construction Grammar in upper case) focuses on the formal aspects of constructions and makes use of a unification-based framework for the description of syntax, not unlike head-driven phrase structure grammar. Its proponents/developers include Charles Fillmore, Paul Kay, Laura Michaelis, and to a certain extent Ivan Sag. Implicit in BCG works like Fillmore and Kay (1995) [23] and Michaelis and Ruppenhofer (2001) [24] is the notion that phrasal representations—embedding relations—should not be used to represent combinatoric properties of lexemes or lexeme classes. For example, BCG abandons the traditional practice of using non-branching domination (NP over N' over N) to describe undetermined nominals that function as NPs, instead introducing a determination construction that requires ('asks for') a non-maximal nominal sister and a lexical 'maximality' feature for which plural and mass nouns are unmarked. BCG also offers a unification-based representation of 'argument structure' patterns as abstract verbal lexeme entries ('linking constructions'). These linking constructions include transitive, oblique goal and passive constructions. These constructions describe classes of verbs that combine with phrasal constructions like the VP construction but contain no phrasal information in themselves.
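The core operation of such unification-based frameworks can be shown with a deliberately simplified sketch. Real BCG feature structures are typed and recursively embedded; this toy version, our own illustration, handles only flat attribute-value pairs.

```python
def unify(fs1: dict, fs2: dict):
    """Merge two flat feature structures.

    Returns the combined structure, or None if the two structures
    assign incompatible values to the same attribute (unification
    failure)."""
    result = dict(fs1)
    for attr, value in fs2.items():
        if attr in result and result[attr] != value:
            return None  # value clash: the structures cannot combine
        result[attr] = value
    return result
```

So a plural nominal {"NUM": "pl"} unifies with a case requirement {"CASE": "nom"}, yielding a structure carrying both features, whereas {"NUM": "pl"} and {"NUM": "sg"} fail to unify.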
In the mid-2000s, several of the developers of BCG, including Charles Fillmore, Paul Kay, Ivan Sag and Laura Michaelis, collaborated in an effort to improve the formal rigor of BCG and clarify its representational conventions. The result was Sign-Based Construction Grammar (SBCG). SBCG [25] [26] is based on a multiple-inheritance hierarchy of typed feature structures. The most important type of feature structure in SBCG is the sign, with subtypes word, lexeme and phrase. The inclusion of phrase within the canon of signs marks a major departure from traditional syntactic thinking. In SBCG, phrasal signs are licensed by correspondence to the mother of some licit construct of the grammar. A construct is a local tree with signs at its nodes. Combinatorial constructions define classes of constructs. Lexical class constructions describe combinatoric and other properties common to a group of lexemes. Combinatorial constructions include both inflectional and derivational constructions. SBCG is both formal and generative; while cognitive-functional grammarians have often opposed their standards and practices to those of formal, generative grammarians, there is in fact no incompatibility between a formal, generative approach and a rich, broad-coverage, functionally based grammar. It simply happens that many formal, generative theories are descriptively inadequate grammars. SBCG is generative in a way that prevailing syntax-centered theories are not: its mechanisms are intended to represent all of the patterns of a given language, including idiomatic ones; there is no 'core' grammar in SBCG. SBCG is a licensing-based theory, as opposed to one that freely generates syntactic combinations and uses general principles to bar illicit ones: a word, lexeme or phrase is well formed if and only if it is described by a lexeme or construction. Recent SBCG works have expanded on the lexicalist model of idiomatically combining expressions sketched out in Sag 2012. [27]
The type of construction grammar associated with linguists like Goldberg and Lakoff looks mainly at the external relations of constructions and the structure of constructional networks. In terms of form and function, this type of construction grammar puts psychological plausibility as its highest desideratum. It emphasizes experimental results and parallels with general cognitive psychology. It also draws on certain principles of cognitive linguistics. In the Goldbergian strand, constructions interact with each other in a network via four inheritance relations: polysemy links, subpart links, metaphorical extension links, and instance links. [21]
Sometimes, Ronald Langacker's cognitive grammar framework is described as a type of construction grammar. [28] Cognitive grammar deals mainly with the semantic content of constructions, and its central argument is that conceptual semantics is primary to the degree that form mirrors, or is motivated by, content. Langacker argues that even abstract grammatical units like part-of-speech classes are semantically motivated and involve certain conceptualizations.
William A. Croft's radical construction grammar is designed for typological purposes and takes into account cross-linguistic factors. It deals mainly with the internal structure of constructions. Radical construction grammar is totally non-reductionist, and Croft argues that constructions are not derived from their parts, but that the parts are derived from the constructions they appear in. Thus, in radical construction grammar, constructions are likened to Gestalts. Radical construction grammar rejects the idea that syntactic categories, roles, and relations are universal and argues that they are not only language-specific, but also construction-specific. Thus, there are no universals that make reference to formal categories, since formal categories are language- and construction-specific. The only universals are to be found in the patterns concerning the mapping of meaning onto form. Radical construction grammar rejects the notion of syntactic relations altogether and replaces them with semantic relations. Like Goldbergian/Lakoffian construction grammar and cognitive grammar, radical construction grammar is closely related to cognitive linguistics, and like cognitive grammar, radical construction grammar appears to be based on the idea that form is semantically motivated.
Embodied construction grammar (ECG), which is being developed by the Neural Theory of Language (NTL) group at ICSI, UC Berkeley, and the University of Hawaiʻi, particularly including Benjamin Bergen and Nancy Chang, adopts the basic constructionist definition of a grammatical construction, but emphasizes the relation of constructional semantic content to embodiment and sensorimotor experiences. A central claim is that the content of all linguistic signs involves mental simulations and is ultimately dependent on basic image schemas of the kind advocated by Mark Johnson and George Lakoff, and so ECG aligns itself with cognitive linguistics. Like Berkeley Construction Grammar, embodied construction grammar makes use of a unification-based model of representation. A non-technical introduction to the NTL theory behind embodied construction grammar, as well as the theory itself and a variety of applications, can be found in Jerome Feldman's From Molecule to Metaphor: A Neural Theory of Language (MIT Press, 2006).
Fluid construction grammar (FCG) was designed by Luc Steels and his collaborators for doing experiments on the origins and evolution of language. [29] [30] FCG is a fully operational and computationally implemented formalism for construction grammars and proposes a uniform mechanism for parsing and production. Moreover, it has been demonstrated through robotic experiments that FCG grammars can be grounded in embodiment and sensorimotor experiences. [31] FCG integrates many notions from contemporary computational linguistics such as feature structures and unification-based language processing. Constructions are considered bidirectional and hence usable both for parsing and production. Processing is flexible in the sense that it can even cope with partially ungrammatical or incomplete sentences. FCG is called 'fluid' because it acknowledges the premise that language users constantly change and update their grammars. The research on FCG is conducted at Sony CSL Paris and the AI Lab at the Vrije Universiteit Brussel.
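FCG's premise that a single construction inventory serves both parsing (form to meaning) and production (meaning to form) can be caricatured in a few lines. This is an illustration of the bidirectionality idea only, not the FCG formalism, and the tiny inventory is invented for the example.

```python
# One shared inventory of form-meaning pairings, used in both directions.
INVENTORY = [
    {"form": ("ball",), "meaning": "BALL"},
    {"form": ("the", "ball"), "meaning": "DEF(BALL)"},
]

def comprehend(tokens):
    """Parsing direction: map a form onto its meaning, if any."""
    for cxn in INVENTORY:
        if tuple(tokens) == cxn["form"]:
            return cxn["meaning"]
    return None

def produce(meaning):
    """Production direction: map a meaning onto a form, if any."""
    for cxn in INVENTORY:
        if cxn["meaning"] == meaning:
            return list(cxn["form"])
    return None
```

The point of the sketch is that nothing in the inventory itself is direction-specific; the same stored pairings license both comprehension and production, as FCG's uniform processing mechanism intends.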
Most of the above approaches to construction grammar have not been implemented as computational models for large-scale practical use in Natural Language Processing frameworks, but more traditional computational linguists have shown interest in construction grammar as a contrast to the current boom in more opaque deep learning models. This is largely due to the representational convenience of CxG models and their potential to integrate with current tokenizers as a perceptual layer for further processing in neurally inspired models. [32] Approaches to integrating construction grammar with existing Natural Language Processing frameworks have included hand-built feature sets and templates, paired with computational models that identify their prevalence in text collections; suggestions for more emergent models have also been proposed, e.g. at the 2023 Georgetown University Roundtable on Linguistics. [33]
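A hypothetical miniature of the template-based approach: hand-built construction templates are matched against a corpus to estimate each construction's prevalence. The two templates below are crude stand-ins of our own devising, not published feature sets.

```python
import re

# Hand-built templates for two well-known English constructions.
TEMPLATES = {
    "correlative_conditional": re.compile(r"\bthe \w+er\b.*?, the \w+er\b", re.I),
    "way_construction": re.compile(r"\b\w+ (?:his|her|their|its) way\b", re.I),
}

def prevalence(corpus):
    """Count how many sentences in the corpus match each template."""
    counts = {name: 0 for name in TEMPLATES}
    for sentence in corpus:
        for name, pattern in TEMPLATES.items():
            if pattern.search(sentence):
                counts[name] += 1
    return counts
```

Run over a toy corpus such as ["The bigger they come, the harder they fall.", "She elbowed her way into the room.", "The cat sat on the mat."], the counter credits one hit to each template and none to the plain sentence; real constructional analyses use far richer feature sets than surface regular expressions.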
Esa Itkonen, who defends humanistic linguistics and opposes Darwinian linguistics, [34] questions the originality of the work of Adele Goldberg, Michael Tomasello, Gilles Fauconnier, William Croft and George Lakoff. According to Itkonen, construction grammarians have appropriated old ideas in linguistics, adding some false claims. [35] For example, construction type and conceptual blending correspond to analogy and blend, respectively, in the works of William Dwight Whitney, Leonard Bloomfield, Charles Hockett, and others. [36]
At the same time, the claim made by construction grammarians, that their research represents a continuation of Saussurean linguistics, has been considered misleading. [37] German philologist Elisabeth Leiss regards construction grammar as a regression, linking it with the 19th-century social Darwinism of August Schleicher. [38] There is a dispute between the advocates of construction grammar and memetics, an evolutionary approach which adheres to the Darwinian view of language and culture. Advocates of construction grammar argue that memetics applies an intelligent-design perspective to cultural evolution, while construction grammar rejects human free will in language construction; [39] but, according to memetician Susan Blackmore, this makes construction grammar the same as memetics. [10]
Lastly, the most basic syntactic patterns of English, namely the core grammatical relations subject–verb, verb–object and verb–indirect object, are counter-evidence for the very concept of constructions as pairings of linguistic patterns with meanings. [40] Instead of the postulated form-meaning pairing, core grammatical relations possess a wide variability of semantics, exhibiting a neutralization of semantic distinctions. [41] [42] [43] [44] [45] For instance, in a detailed discussion of the dissociation of grammatical case-roles from semantics, Talmy Givón lists the multiple semantic roles of subjects and direct objects in English. [46] As these phenomena are well-established, some linguists propose that core grammatical relations be excluded from CxG as they are not constructions, leaving the theory to be a model merely of idioms or infrequently used, minor patterns. [47] [48]
As the pairing of the syntactic construction and its prototypical meaning are learned in early childhood, [21] children should initially learn the basic constructions with their prototypical semantics, that is, 'agent of action' for the subject in the SV relation, 'affected object of agent's action' for the direct object in VO, and 'recipient in transfer of possession of object' for the indirect object in VI. [49] Anat Ninio examined the speech of a large sample of young English-speaking children and found that they do not in fact learn the syntactic patterns with the prototypical semantics claimed to be associated with them, or with any single semantics. [40] The major reason is that such pairings are not consistently modelled for them in parental speech. Examining the maternal speech addressed to the children, Ninio also found that the pattern of subjects, direct objects and indirect objects in mothers' speech does not provide the required prototypical semantics for the construction to be established. Adele Goldberg and her associates had previously reported similar negative results concerning the pattern of direct objects in parental speech. [1] [50] These findings are a blow to the CxG theory that relies on a learned association of form and prototypical meaning in order to set up the constructions said to form the basic units of syntax.
p. 600 "So what is supposed to be new? Mainly that "argument structure constructions thus have their own meaning, independent of lexical material" ... But this is not new, this is ancient." [Minkä siis pitäisi olla uutta? Lähinnä sen, että "argumenttirakennekonstruktioilla on siis oma, leksikaalisesta aineistosta riippumaton merkityksensä" ... Mutta tämä ei ole uutta, tämä on ikivanhaa.]