Construction grammar

Construction grammar (often abbreviated CxG) is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words (aardvark, avocado), morphemes (anti-, -ing), fixed expressions and idioms (by and large, jog X's memory), and abstract grammatical rules such as the passive voice (The cat was hit by a car) or the ditransitive (Mary gave Alex the ball). Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form. [1]

Advocates of construction grammar argue that language and culture are not designed by people, but are 'emergent' or automatically constructed in a process which is comparable to natural selection in species [2] [3] [4] [5] or the formation of natural constructions such as nests made by social insects. [6] Constructions correspond to replicators or memes in memetics and other cultural replicator theories. [7] [8] [5] [9] It has been argued that construction grammar is not an original model of cultural evolution but is, in its essentials, the same as memetics. [10] Construction grammar is associated with concepts from cognitive linguistics that aim to show in various ways how human rational and creative behaviour is automatic and not planned. [11] [6]

History

Construction grammar was first developed in the 1980s by linguists such as Charles Fillmore, Paul Kay, and George Lakoff, in order to analyze idioms and fixed expressions. [12] Lakoff's 1977 paper "Linguistic Gestalts" put forward an early version of CxG, arguing that the meaning of an expression was not simply a function of the meanings of its parts. Instead, he suggested, constructions themselves must have meanings.

Another early study was "There-Constructions," which appeared as Case Study 3 in George Lakoff's Women, Fire, and Dangerous Things. [13] It argued that the meaning of the whole was not a function of the meanings of the parts, that odd grammatical properties of Deictic There-constructions followed from the pragmatic meaning of the construction, and that variations on the central construction could be seen as simple extensions using form-meaning pairs of the central construction.

Fillmore et al.'s (1988) paper on the English let alone construction was a second classic. These two papers propelled cognitive linguists into the study of CxG. Since the late 1990s there has been a shift towards a general preference for the usage-based model.[ citation needed ] The shift towards the usage-based approach in construction grammar has inspired the development of several corpus-based methodologies of constructional analysis (for example, collostructional analysis).

Concepts

One of the most distinctive features of CxG is its use of multi-word expressions and phrasal patterns as the building blocks of syntactic analysis. [14] One example is the Correlative Conditional construction, found in the proverbial expression The bigger they come, the harder they fall. [15] [16] [17] Construction grammarians point out that this is not merely a fixed phrase; the Correlative Conditional is a general pattern (The Xer, the Yer) with "slots" that can be filled by almost any comparative phrase (e.g. The more you think about it, the less you understand). Advocates of CxG argue these kinds of idiosyncratic patterns are more common than is often recognized, and that they are best understood as multi-word, partially filled constructions. [1]
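
The slot-filling character of such a pattern can be made concrete with a small programmatic sketch. The following Python fragment is purely illustrative and is not taken from any published CxG formalism; the regular expression and the meaning gloss are simplifications invented for exposition.

```python
import re

# A minimal sketch: the Correlative Conditional as a partially filled template
# "The Xer, the Yer", paired with a rough meaning gloss.
CORRELATIVE_CONDITIONAL = re.compile(
    r"^the\s+(?P<x>[^,]+),\s*the\s+(?P<y>[^.!?]+)", re.IGNORECASE
)

def gloss(sentence):
    """Return a crude meaning paraphrase if the sentence instantiates the pattern."""
    m = CORRELATIVE_CONDITIONAL.match(sentence.strip())
    if m is None:
        return None
    return f"to the degree that ({m.group('x')}), correspondingly ({m.group('y')})"

print(gloss("The bigger they come, the harder they fall."))
print(gloss("The more you think about it, the less you understand."))
```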

Construction grammar rejects the idea that there is a sharp dichotomy between lexical items, which are arbitrary and specific, and grammatical rules, which are completely general. Instead, CxG posits that there are linguistic patterns at every level of generality and specificity: from individual words, to partially filled constructions (e.g. drive X crazy), to fully abstract rules (e.g. subject–auxiliary inversion). All of these patterns are recognized as constructions. [18]

In contrast to theories that posit an innate universal grammar for all languages, construction grammar holds that speakers learn constructions inductively as they are exposed to them, using general cognitive processes. It is argued that children pay close attention to each utterance they hear, and gradually make generalizations based on the utterances they have heard. Because constructions are learned, they are expected to vary considerably across different languages. [19]

Grammatical construction

In construction grammar, as in general semiotics, the grammatical construction is a pairing of form and content. The formal aspect of a construction is typically described as a syntactic template, but the form covers more than just syntax, as it also involves phonological aspects, such as prosody and intonation. The content covers semantic as well as pragmatic meaning.

The semantic meaning of a grammatical construction is made up of conceptual structures postulated in cognitive semantics: image-schemas, frames, conceptual metaphors, conceptual metonymies, prototypes of various kinds, mental spaces, and bindings across these (called "blends"). Pragmatics just becomes the cognitive semantics of communication—the modern version of the old Ross-Lakoff performative hypothesis from the 1960s.

The form and content are symbolically linked in the sense advocated by Langacker.

Thus a construction is treated like a sign in which all structural aspects are integrated parts and not distributed over different modules as they are in the componential model. Consequently, not only constructions that are lexically fixed, like many idioms, but also more abstract ones like argument structure schemata, are pairings of form and conventionalized meaning. For instance, the ditransitive schema [S V IO DO] is said to express the semantic content X CAUSES Y TO RECEIVE Z, just like kill means X CAUSES Y TO DIE.
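
The claim that an argument structure schema is itself a pairing of a syntactic template with conventionalized meaning can be pictured as a simple data structure. The sketch below is expository only; the class and field names are assumptions made for illustration and are not part of any CxG formalism.

```python
from dataclasses import dataclass

@dataclass
class Construction:
    """A construction as a conventionalized pairing of form and meaning."""
    name: str
    form: list        # syntactic template, given here as a list of slot labels
    meaning: str      # the semantic content conventionally expressed by the form

# The ditransitive schema [S V IO DO] paired with "X CAUSES Y TO RECEIVE Z".
DITRANSITIVE = Construction(
    name="ditransitive",
    form=["SUBJ", "VERB", "INDIRECT_OBJ", "DIRECT_OBJ"],
    meaning="X CAUSES Y TO RECEIVE Z",
)

# "Mary gave Alex the ball" instantiates the schema with these bindings:
instance = {"X": "Mary", "Y": "Alex", "Z": "the ball"}
print(f"{DITRANSITIVE.name}: {DITRANSITIVE.meaning} with {instance}")
```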

In construction grammar, a grammatical construction, regardless of its formal or semantic complexity and makeup, is a pairing of form and meaning. Thus words and word classes may be regarded as instances of constructions. Indeed, construction grammarians argue that all pairings of form and meaning are constructions, including phrase structures, idioms, words and even morphemes.

Syntax–lexicon continuum

Unlike the componential model, construction grammar denies any strict distinction between lexicon and syntax and instead proposes a syntax–lexicon continuum. [20] The argument goes that words and complex constructions are both pairs of form and meaning and differ only in internal symbolic complexity. Instead of being discrete modules subject to very different processes, they form the extremes of a continuum (from regular to idiosyncratic): syntax > subcategorization frame > idiom > morphology > syntactic category > word/lexicon (these are the traditional terms; construction grammars use a different terminology).

Grammar as an inventory of constructions

In construction grammar, the grammar of a language is made up of taxonomic networks of families of constructions, which are based on the same principles as those of the conceptual categories known from cognitive linguistics, such as inheritance, prototypicality, extensions, and multiple parenting.

Four different models have been proposed for how information is stored in the taxonomies; a schematic sketch contrasting two of them follows the list:

  1. Full-entry model
    In the full-entry model information is stored redundantly at all relevant levels in the taxonomy, which means that it operates, if at all, with minimal generalization.[ example needed ]
  2. Usage-based model
    The usage-based model is based on inductive learning, meaning that linguistic knowledge is acquired in a bottom-up manner through use. It allows for redundancy and generalizations, because the language user generalizes over recurring experiences of use.[ example needed ]
  3. Default inheritance model
    According to the default inheritance model, each network has a default central form-meaning pairing from which all instances inherit their features. It thus operates with a fairly high level of generalization, but does also allow for some redundancy in that it recognizes extensions of different types.[ example needed ]
  4. Complete inheritance model
    In the complete inheritance model, information is stored only once at the most superordinate level of the network. Instances at all other levels inherit features from the superordinate item. The complete inheritance does not allow for redundancy in the networks.[ example needed ]
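
The difference between the storage models can be illustrated with a toy taxonomy. The sketch below is invented for exposition and is not drawn from any published formalism; it contrasts the complete inheritance model, where information is stored once at the top and retrieved by walking up the network, with the full-entry model, where the same information is stored redundantly at every relevant node.

```python
# Toy taxonomy: a "give"-specific construction under the general ditransitive schema.

# Complete inheritance model: information is stored only once, at the most
# superordinate node; lower nodes point to their parent and inherit everything.
complete_inheritance = {
    "ditransitive": {"parent": None, "features": {"meaning": "X CAUSES Y TO RECEIVE Z"}},
    "give-ditransitive": {"parent": "ditransitive", "features": {"verb": "give"}},
}

def lookup(network, node, feature):
    """Walk up the parent chain until the feature is found."""
    while node is not None:
        if feature in network[node]["features"]:
            return network[node]["features"][feature]
        node = network[node]["parent"]
    return None

# Full-entry model: the same information is stored redundantly at every level.
full_entry = {
    "ditransitive": {"meaning": "X CAUSES Y TO RECEIVE Z"},
    "give-ditransitive": {"meaning": "X CAUSES Y TO RECEIVE Z", "verb": "give"},
}

print(lookup(complete_inheritance, "give-ditransitive", "meaning"))  # inherited from the top
print(full_entry["give-ditransitive"]["meaning"])                    # stored locally
```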

Principle of no synonymy

Because construction grammar does not operate with surface derivations from underlying structures, it adheres to functionalist linguist Dwight Bolinger's principle of no synonymy, on which Adele Goldberg elaborates in her book. [21]

This means that construction grammarians argue, for instance, that active and passive versions of the same proposition are not derived from an underlying structure, but are instances of two different constructions. As constructions are pairings of form and meaning, [22] active and passive versions of the same proposition are not synonymous, but display differences in content: in this case the pragmatic content.

Some construction grammars

As mentioned above, construction grammar is a 'family' of theories rather than one unified theory, and a number of formalized construction grammar frameworks have been developed. Some of these are:

Berkeley Construction Grammar

Berkeley Construction Grammar (BCG: formerly also called simply Construction Grammar in upper case) focuses on the formal aspects of constructions and makes use of a unification-based framework for description of syntax, not unlike head-driven phrase structure grammar. Its proponents/developers include Charles Fillmore, Paul Kay, Laura Michaelis, and to a certain extent Ivan Sag. Immanent within BCG works like Fillmore and Kay 1995 [23] and Michaelis and Ruppenhofer 2001 [24] is the notion that phrasal representations—embedding relations—should not be used to represent combinatoric properties of lexemes or lexeme classes. For example, BCG abandons the traditional practice of using non-branching domination (NP over N' over N) to describe undetermined nominals that function as NPs, instead introducing a determination construction that requires ('asks for') a non-maximal nominal sister and a lexical 'maximality' feature for which plural and mass nouns are unmarked. BCG also offers a unification-based representation of 'argument structure' patterns as abstract verbal lexeme entries ('linking constructions'). These linking constructions include transitive, oblique goal and passive constructions. These constructions describe classes of verbs that combine with phrasal constructions like the VP construction but contain no phrasal information in themselves.
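
Unification-based description means that constructions are stated as partial feature structures which can combine only if the features they specify are compatible. The following generic sketch of feature-structure unification is offered only to illustrate that idea; it does not reproduce BCG's actual feature geometry, notation, or analyses, and the feature names are invented.

```python
def unify(fs1, fs2):
    """Minimal feature-structure unification: merge two partial descriptions,
    returning None if they assign conflicting values to the same feature."""
    result = dict(fs1)
    for feature, value in fs2.items():
        if feature not in result:
            result[feature] = value
        elif isinstance(result[feature], dict) and isinstance(value, dict):
            merged = unify(result[feature], value)   # recurse into nested structures
            if merged is None:
                return None
            result[feature] = merged
        elif result[feature] != value:
            return None                              # clash: unification fails
    return result

# Toy illustration in the spirit of a determination construction: a determiner
# 'asks for' a non-maximal nominal sister.
determiner_requirement = {"cat": "noun", "maximal": False}
singular_nominal = {"cat": "noun", "maximal": False, "number": "singular"}
print(unify(determiner_requirement, singular_nominal))   # compatible: merged structure
print(unify({"maximal": True}, {"maximal": False}))      # incompatible: None
```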

Sign Based Construction Grammar

In the mid-2000s, several of the developers of BCG, including Charles Fillmore, Paul Kay, Ivan Sag and Laura Michaelis, collaborated in an effort to improve the formal rigor of BCG and clarify its representational conventions. The result was Sign Based Construction Grammar (SBCG). SBCG [25] [26] is based on a multiple-inheritance hierarchy of typed feature structures. The most important type of feature structure in SBCG is the sign, with subtypes word, lexeme and phrase. The inclusion of phrase within the canon of signs marks a major departure from traditional syntactic thinking. In SBCG, phrasal signs are licensed by correspondence to the mother of some licit construct of the grammar. A construct is a local tree with signs at its nodes. Combinatorial constructions define classes of constructs. Lexical class constructions describe combinatoric and other properties common to a group of lexemes. Combinatorial constructions include both inflectional and derivational constructions. SBCG is both formal and generative; while cognitive-functional grammarians have often opposed their standards and practices to those of formal, generative grammarians, there is in fact no incompatibility between a formal, generative approach and a rich, broad-coverage, functionally based grammar. It simply happens that many formal, generative theories are descriptively inadequate grammars. SBCG is generative in a way that prevailing syntax-centered theories are not: its mechanisms are intended to represent all of the patterns of a given language, including idiomatic ones; there is no 'core' grammar in SBCG. SBCG is a licensing-based theory, as opposed to one that freely generates syntactic combinations and uses general principles to bar illicit ones: a word, lexeme or phrase is well formed if and only if it is described by a lexeme or construction. Recent SBCG works have expanded on the lexicalist model of idiomatically combining expressions sketched out in Sag 2012. [27]
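
The organizing idea of a hierarchy of sign types, with word, lexeme, and phrase as subtypes of sign, can be pictured as a simple class hierarchy. The sketch below is expository only; the attribute names are invented and do not correspond to SBCG's actual feature declarations or analyses.

```python
from dataclasses import dataclass, field

@dataclass
class Sign:
    """Top of the toy type hierarchy: any pairing of form and meaning."""
    form: str
    meaning: str

@dataclass
class Lexeme(Sign):
    """A lexeme is a subtype of sign."""
    arg_structure: list = field(default_factory=list)

@dataclass
class Word(Sign):
    """A word (an inflected form) is a subtype of sign."""
    inflection: str = "base"

@dataclass
class Phrase(Sign):
    """A phrase is also a sign, licensed as the mother of some construct."""
    daughters: list = field(default_factory=list)

give = Lexeme(form="give", meaning="X CAUSES Y TO RECEIVE Z",
              arg_structure=["NP", "NP", "NP"])
vp = Phrase(form="gave Alex the ball", meaning="CAUSE-RECEIVE(?, Alex, the ball)",
            daughters=["gave", "Alex", "the ball"])
print(isinstance(give, Sign), isinstance(vp, Sign))   # both lexemes and phrases are signs
```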

Goldbergian/Lakovian construction grammar

The type of construction grammar associated with linguists like Goldberg and Lakoff looks mainly at the external relations of constructions and the structure of constructional networks. In terms of form and function, this type of construction grammar treats psychological plausibility as its highest desideratum. It emphasizes experimental results and parallels with general cognitive psychology, and it draws on certain principles of cognitive linguistics. In the Goldbergian strand, constructions interact with each other in a network via four inheritance relations: polysemy link, subpart link, metaphorical extension, and instance link. [21]

Cognitive grammar

Sometimes, Ronald Langacker's cognitive grammar framework is described as a type of construction grammar. [28] Cognitive grammar deals mainly with the semantic content of constructions, and its central argument is that conceptual semantics is primary to the degree that form mirrors, or is motivated by, content. Langacker argues that even abstract grammatical units like part-of-speech classes are semantically motivated and involve certain conceptualizations.

Radical construction grammar

William A. Croft's radical construction grammar is designed for typological purposes and takes into account cross-linguistic factors. It deals mainly with the internal structure of constructions. Radical construction grammar is totally non-reductionist, and Croft argues that constructions are not derived from their parts, but that the parts are derived from the constructions they appear in; constructions are thus likened to Gestalts. Radical construction grammar rejects the idea that syntactic categories, roles, and relations are universal and argues that they are not only language-specific, but also construction-specific. Thus, there are no universals that make reference to formal categories, since formal categories are language- and construction-specific. The only universals are to be found in the patterns concerning the mapping of meaning onto form. Radical construction grammar rejects the notion of syntactic relations altogether and replaces them with semantic relations. Like Goldbergian/Lakovian construction grammar and cognitive grammar, radical construction grammar is closely related to cognitive linguistics, and like cognitive grammar, radical construction grammar appears to be based on the idea that form is semantically motivated.

Embodied construction grammar

Embodied construction grammar (ECG), which is being developed by the Neural Theory of Language (NTL) group at ICSI, UC Berkeley, and the University of Hawaiʻi, particularly including Benjamin Bergen and Nancy Chang, adopts the basic constructionist definition of a grammatical construction, but emphasizes the relation of constructional semantic content to embodiment and sensorimotor experiences. A central claim is that the content of all linguistic signs involves mental simulations and is ultimately dependent on basic image schemas of the kind advocated by Mark Johnson and George Lakoff, and so ECG aligns itself with cognitive linguistics. Like construction grammar, embodied construction grammar makes use of a unification-based model of representation. A non-technical introduction to the NTL theory behind embodied construction grammar as well as the theory itself and a variety of applications can be found in Jerome Feldman's From Molecule to Metaphor: A Neural Theory of Language (MIT Press, 2006).

Fluid construction grammar

Fluid construction grammar (FCG) was designed by Luc Steels and his collaborators for doing experiments on the origins and evolution of language. [29] [30] FCG is a fully operational and computationally implemented formalism for construction grammars and proposes a uniform mechanism for parsing and production. Moreover, it has been demonstrated through robotic experiments that FCG grammars can be grounded in embodiment and sensorimotor experiences. [31] FCG integrates many notions from contemporary computational linguistics such as feature structures and unification-based language processing. Constructions are considered bidirectional and hence usable both for parsing and production. Processing is flexible in the sense that it can even cope with partially ungrammatical or incomplete sentences. FCG is called 'fluid' because it acknowledges the premise that language users constantly change and update their grammars. The research on FCG is conducted at Sony CSL Paris and the AI Lab at the Vrije Universiteit Brussel.
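
The bidirectionality of constructions, with the same form-meaning pairing serving both comprehension and production, can be illustrated with a deliberately simplified sketch. This is a generic illustration only; FCG itself is a separate, fully implemented formalism with its own representations and processing machinery, and the template and function names below are inventions.

```python
# A toy bidirectional construction: one form-meaning pairing used in both directions.
FORM_TEMPLATE = ["{giver}", "gave", "{recipient}", "{gift}"]

def produce(bindings):
    """Meaning -> form: fill the form template from semantic bindings."""
    return " ".join(slot.format(**bindings) for slot in FORM_TEMPLATE)

def comprehend(utterance):
    """Form -> meaning: recover semantic bindings from a matching utterance."""
    words = utterance.split()
    if len(words) != 4 or words[1] != "gave":
        return None                      # the construction does not apply
    return {"giver": words[0], "recipient": words[2], "gift": words[3]}

print(produce({"giver": "Mary", "recipient": "Alex", "gift": "flowers"}))
print(comprehend("Mary gave Alex flowers"))
```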


Implemented construction grammar

Most of the above approaches to construction grammar have not been implemented as computational models for large-scale practical use in natural language processing (NLP) frameworks, but construction grammar has attracted interest from more traditional computational linguists as a contrast to the current boom in more opaque deep learning models. This is largely due to the representational convenience of CxG models and their potential to integrate with current tokenizers as a perceptual layer for further processing in neurally inspired models. [32] Approaches to integrating construction grammar with existing NLP frameworks have relied on hand-built feature sets and templates, using computational models to identify the prevalence of constructions in text collections, but more emergent models have also been suggested, e.g. at the 2023 Georgetown University Roundtable on Linguistics. [33]
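
As an illustration of the template-based approach, the sketch below counts occurrences of two partially filled constructions in a toy text collection. The patterns, the corpus, and the data layout are invented for exposition and do not reproduce any of the cited systems.

```python
import re

# Hypothetical hand-built templates for two partially filled constructions.
TEMPLATES = {
    "correlative_conditional": re.compile(r"\bthe\s+\w+er\b[^,.]*,\s*the\s+\w+er\b", re.I),
    "let_alone": re.compile(r"\blet\s+alone\b", re.I),
}

corpus = [
    "The bigger they come, the harder they fall.",
    "He wouldn't give a nickel, let alone a dollar.",
    "She drove home.",
]

# Count how many sentences instantiate each construction.
counts = {name: sum(bool(pattern.search(sentence)) for sentence in corpus)
          for name, pattern in TEMPLATES.items()}
print(counts)   # {'correlative_conditional': 1, 'let_alone': 1}
```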

Criticism

Esa Itkonen, who defends humanistic linguistics and opposes Darwinian linguistics, [34] questions the originality of the work of Adele Goldberg, Michael Tomasello, Gilles Fauconnier, William Croft and George Lakoff. According to Itkonen, construction grammarians have appropriated old ideas in linguistics while adding some false claims. [35] For example, construction type and conceptual blending correspond to analogy and blend, respectively, in the works of William Dwight Whitney, Leonard Bloomfield, Charles Hockett, and others. [36]

At the same time, the claim made by construction grammarians that their research represents a continuation of Saussurean linguistics has been considered misleading. [37] The German philologist Elisabeth Leiss regards construction grammar as a regression, linking it with the 19th-century social Darwinism of August Schleicher. [38] There is a dispute between advocates of construction grammar and those of memetics, an evolutionary approach which adheres to the Darwinian view of language and culture. Advocates of construction grammar argue that memetics applies the perspective of intelligent design to cultural evolution, while construction grammar rejects human free will in language construction; [39] but, according to memetician Susan Blackmore, this makes construction grammar the same as memetics. [10]

Lastly, the most basic syntactic patterns of English, namely the core grammatical relations subject-verb, verb-object and verb-indirect object, have been presented as counter-evidence to the very concept of constructions as pairings of linguistic patterns with meanings. [40] Instead of the postulated form-meaning pairing, core grammatical relations possess a wide variability of semantics, exhibiting a neutralization of semantic distinctions. [41] [42] [43] [44] [45] For instance, in a detailed discussion of the dissociation of grammatical case-roles from semantics, Talmy Givón lists the multiple semantic roles of subjects and direct objects in English. [46] As these phenomena are well established, some linguists propose that core grammatical relations be excluded from CxG, since they are not constructions, leaving the theory to be a model merely of idioms or infrequently used, minor patterns. [47] [48]

As the pairing of the syntactic construction and its prototypical meaning is learned in early childhood, [21] children should initially learn the basic constructions with their prototypical semantics, that is, 'agent of action' for the subject in the SV relation, 'affected object of agent's action' for the direct object in the VO relation, and 'recipient in transfer of possession of object' for the indirect object in the VI relation. [49] Anat Ninio examined the speech of a large sample of young English-speaking children and found that they do not in fact learn the syntactic patterns with the prototypical semantics claimed to be associated with them, or with any single semantics. [40] The major reason is that such pairings are not consistently modelled for them in parental speech. Examining the maternal speech addressed to the children, Ninio also found that the pattern of subjects, direct objects and indirect objects in mothers' speech does not provide the required prototypical semantics for the construction to be established. Adele Goldberg and her associates had previously reported similar negative results concerning the pattern of direct objects in parental speech. [1] [50] These findings are a blow to the CxG theory that relies on a learned association of form and prototypical meaning in order to set up the constructions said to form the basic units of syntax.

References

  1. Goldberg, Adele (2006). Constructions at Work: The Nature of Generalization in Language. New York: Oxford University Press. pp. 5–10. ISBN 0-19-926852-5.
  2. Croft, William (2006). "The relevance of an evolutionary model to historical linguistics". In Nedergaard Thomsen, Ole (ed.). Competing Models of Linguistic Change: Evolution and Beyond. Current Issues in Linguistic Theory. Vol. 279. John Benjamins. pp. 91–132. doi:10.1075/cilt.279.08cro. ISBN   978-90-272-4794-0.
  3. Beckner, Clay; Blythe, Richard; Bybee, Joan; Christiansen, Morten H.; Croft, William; Ellis, Nick C.; Holland, John; Ke, Jinyun; Larsen-Freeman, Diane; Schoenemann, Tom (2009). "Language is a Complex Adaptive System: Position Paper" (PDF). Language Learning. 59 (1): 1–26. doi:10.1111/j.1467-9922.2009.00533.x . Retrieved 2020-03-04.
  4. Cornish, Hannah; Tamariz, Monica; Kirby, Simon (2009). "Complex Adaptive Systems and the Origins of Adaptive Structure: What Experiments Can Tell Us" (PDF). Language Learning. 59 (1): 187–205. doi:10.1111/j.1467-9922.2009.00540.x. S2CID   56199987 . Retrieved 2020-06-30.
  5. MacWhinney, Brian (2015). "Introduction – language emergence". In MacWhinney, Brian; O'Grady, William (eds.). Handbook of Language Emergence. Wiley. pp. 1–31. ISBN 9781118346136.
  6. Dahl, Östen (2004). The Growth and Maintenance of Linguistic Complexity. John Benjamins. ISBN 9781588115546.
  7. Kirby, Simon (2013). "Transitions: the evolution of linguistic replicators". In Binder; Smith (eds.). The Language Phenomenon (PDF). The Frontiers Collection. Springer. pp. 121–138. doi:10.1007/978-3-642-36086-2_6. ISBN   978-3-642-36085-5 . Retrieved 2020-03-04.
  8. Zehentner, Eva (2019). Competition in Language Change: the Rise of the English Dative Alternation. De Gruyter Mouton. ISBN   978-3-11-063385-6.
  9. Peschek, Ilka (2010). "Die Konstruktion als kulturelle Einheit". Zeitschrift für Germanistische Linguistik. 38 (3): 451–457. doi:10.1515/ZGL.2010.031. S2CID   143951283.
  10. Blackmore, Susan (2008). "Memes shape brains shape memes". Behavioral and Brain Sciences. 31 (5): 513. doi:10.1017/S0140525X08005037. Retrieved 2020-12-22.
  11. Lakoff, George; Johnson, Mark (1999). Philosophy in the Flesh : the Embodied Mind and its Challenge to Western Thought. Basic Books. ISBN   0465056733.
  12. Croft, William (2001). Radical Construction Grammar: Syntactic Theory in Typological Perspective. New York: Oxford University Press. p. 15. ISBN   978-0-19-829954-7.
  13. Lakoff, George (1987). Women, Fire, and Dangerous Things: What Categories Reveal about the Mind . Chicago: University of Chicago Press. ISBN   9780226468037.
  14. Boas, Hans, Ivan Sag and Paul Kay (2012). "Introducing Sign Based Construction Grammar" (PDF). p. 19. Retrieved 2019-09-14.
  15. Sag, Ivan (2010). "English Filler-Gap Constructions". Language. 86 (3): 486–545. CiteSeerX   10.1.1.138.2274 . doi:10.1353/lan.2010.0002. S2CID   14934876.
  16. Fillmore, Charles (1986). "Varieties of Conditional Sentences". Eastern States Conference on Linguistics. 3: 163–182.
  17. Michaelis, Laura A. (1994-01-01). "A Case of Constructional Polysemy in Latin". Studies in Language. 18 (1): 45–70. CiteSeerX   10.1.1.353.1125 . doi:10.1075/sl.18.1.04mic. ISSN   0378-4177.
  18. Kay, Paul and Michaelis, Laura A. (2012). Constructional Meaning and Compositionality. In C. Maienborn, K. von Heusinger and P. Portner (eds.), Semantics: An International Handbook of Natural Language Meaning. Vol. 3. Berlin: de Gruyter. 2271-2296
  19. Goldberg, Adele E. (2003). "Constructions: a new theoretical approach to language" (PDF). Trends in Cognitive Sciences. 7 (5): 219–224. doi:10.1016/S1364-6613(03)00080-9. PMID   12757824. S2CID   12393863.
  20. Dufter, Andreas, and Stark, Elisabeth (eds., 2017) Manual of Romance Morphosyntax and Syntax , Walter de Gruyter GmbH & Co KG
  21. Goldberg, Adele (1995). Constructions: A construction grammar approach to argument structure. Chicago/London: University of Chicago Press.
  22. Hoffmann, Thomas; Trousdale, Graeme (2013-04-18). Hoffmann, Thomas; Trousdale, Graeme (eds.). Construction Grammar. Vol. 1. doi:10.1093/oxfordhb/9780195396683.013.0001.
  23. Fillmore, Charles J. and Paul Kay. 1995. A Construction Grammar Coursebook. Unpublished ms, University of California, Berkeley.
  24. Michaelis, L. A., & Ruppenhofer, J. 2001. Beyond alternations: A constructional account of the applicative pattern in German. Stanford: CSLI Publications.
  25. Boas, H.C. and Sag, I.A. eds., 2012. Sign-based construction grammar (pp. xvi+-391). CSLI Publications/Center for the Study of Language and Information.
  26. Michaelis, L.A., 2009. Sign-based construction grammar. The Oxford handbook of linguistic analysis, pp.155-176.
  27. Sag, Ivan A. (2012) Sign-Based Construction Grammar: An informal synopsis, in H. C. Boas and I. A. Sag, (eds), Sign-Based Construction Grammar. Stanford: CSLI Publications). 69-202
  28. Langacker, Ronald (2016-10-05). "Trees, Assemblies, Chains, and Windows". YouTube. FrameNet Brazil. Archived from the original on 2021-12-12. Retrieved 2021-05-07.
  29. Steels, Luc, ed. (2011). Design Patterns in Fluid Construction Grammar. Amsterdam: John Benjamins.
  30. Steels, Luc, ed. (2012). Computational Issues in Fluid Construction Grammar. Heidelberg: Springer.
  31. Steels, Luc; Hild, Manfred, eds. (2012). Language Grounding in Robots. New York: Springer.
  32. Karlgren, Jussi; Kanerva, Pentti (2019). "High-dimensional distributed semantic spaces for utterances". Natural Language Engineering. 25 (4). Retrieved 28 September 2023.
  33. "Workshop on CxG + NLP". Georgetown University 2023 Roundtable on Linguistics. Georgetown University. Retrieved 28 September 2023.
  34. Itkonen, Esa (2011). "On Coseriu's legacy" (PDF). Energeia (III): 1–29. doi:10.55245/energeia.2011.001. S2CID   247142924 . Retrieved 2020-01-14.
  35. Itkonen, Esa (2011). "Konstruktiokielioppi ja analogia". Virittäjä (in Finnish) (4): 81–117. Retrieved 2020-06-29. p. 600 "So what is supposed to be new? Mainly that "argument structure constructions thus have their own meaning, independent of lexical material" ... But this is not new, this is ancient." [Minkä siis pitäisi olla uutta? Lähinnä sen, että "argumenttirakennekonstruktioilla on siis oma, leksikaalisesta aineistosta riippumaton merkityksensä" ... Mutta tämä ei ole uutta, tämä on ikivanhaa.]
  36. Itkonen, Esa (2005). Analogy as Structure and Process. Approaches in linguistics, cognitive psychology and philosophy of science. John Benjamins. ISBN   9789027294012.
  37. Elffers, Els (2012). "Saussurean structuralism and cognitive linguistics". Histoire épistemologique langage. 34 (1): 19–40. doi:10.3406/hel.2012.3235. S2CID   170602847 . Retrieved 2020-06-29.
  38. Leiss, Elisabeth (2009). Sprachphilosophie. De Gruyter. ISBN   9783110217001.
  39. Christiansen, Morten H.; Chater, Nick (2008). "Language as shaped by the brain" (PDF). Behavioral and Brain Sciences. 31 (5): 489–558. doi:10.1017/S0140525X08004998. PMID   18826669 . Retrieved 2020-12-22.
  40. Ninio, Anat (2011). Syntactic development, its input and output. Oxford: Oxford University Press. ISBN 9780199565962.
  41. Andrews, Avery D. (1985). The major functions of the noun phrase. In T. Shopen (ed.), Language typology and syntactic description, Vol. 1: Clause structure (pp. 62–154). Cambridge: Cambridge University Press.
  42. Dik, Simon C. (1997). The theory of functional grammar, Part 1. Berlin: Mouton de Gruyter.
  43. Kibrik, Alexander E. (1997). Beyond subject and object: toward a comprehensive relational typology. Linguistic Typology, 13, 279–346.
  44. Lyons, John (1968). Introduction to theoretical linguistics. Cambridge: Cambridge University Press. (p. 439)
  45. Van Valin, Robert D. Jr. and LaPolla, Randy (1997). Syntax: structure, meaning and function. Cambridge: Cambridge University Press.
  46. Givón, Talmy (1997). Grammatical relations: an introduction. In T. Givón (ed.), Grammatical relations: A functionalist perspective (pp. 1–84). Amsterdam: John Benjamins. (pp. 2–3)
  47. Jackendoff, Ray S. (1997). Twistin' the night away. Language, 73, 534–559.
  48. Ariel, Mira (2008). A review of A. E. Goldberg (2006). Constructions at work: The nature of generalizations in language. Oxford: Oxford University Press. Language, 84, 632–636.
  49. Taylor, John R. (1998). Syntactic constructions as prototype categories. In M. Tomasello, (ed), The new psychology of language: Cognitive and functional approaches to language structure (pp. 177-202). Mahwah, NJ: Lawrence Erlbaum. (p. 187)
  50. Sethuraman, Nitya and Goodman, Judith C. (2004). Children's mastery of the transitive construction. In E. V. Clark (ed), Online proceedings of the 32nd session of the Stanford Child Language Research Forum (pp. 60-67). Stanford, CA: CSLI Publications. http://www csli.stanford.edu/pubs

Further reading