Usage-based linguistics is an approach to linguistics, situated within a broader functional/cognitive framework, that emerged in the late 1980s and assumes a profound relation between linguistic structure and usage. [1] It challenges the dominant focus of 20th-century linguistics (particularly formalism-generativism) on treating language as an isolated system removed from its use in human interaction and human cognition. [1] Instead, usage-based models posit that linguistic information is expressed via context-sensitive mental processing and mental representations, which can account for the complexity of actual language use at all levels (phonetics and phonology, morphology and syntax, pragmatics and semantics). Broadly speaking, a usage-based model of language accounts for language acquisition and processing, synchronic and diachronic patterns, and both low-level and high-level structure in language by looking at actual language use.
The term usage-based was coined by Ronald Langacker in 1987. [2] Usage-based models of language have become a significant new trend in linguistics since the early 2000s. [1] Influential proponents of usage-based linguistics include Michael Tomasello, Joan Bybee and Morten Christiansen.
Together with related approaches, such as construction grammar, emergent grammar, and language as a complex adaptive system, usage-based linguistics belongs to the wider framework of evolutionary linguistics. It studies the lifespan of linguistic units (e.g. words, suffixes), arguing that they can survive language change through frequent usage, or by participating in usage-based generalizations when their syntactic, semantic or pragmatic features overlap with those of other, similar constructions. [3] There is disagreement as to whether the approach is different from memetics or essentially the same. [4]
West Coast cognitive functionalism
West Coast cognitive functionalism (WCCF) played a major role in the creation of the usage-based enterprise. Firstly, a crucial point for WCCF was Eleanor Rosch's paper on semantic categories in human cognition, [5] which studied fuzzy semantic categories with central and peripheral members. George Lakoff (1987) subsequently applied these concepts to linguistic studies. For usage-based models of language, these discoveries legitimized interest in peripheral phenomena and inspired examination of the ontological status of the rules themselves. [6] Secondly, WCCF focused on the effects of social/textual context and cognitive processes on human thought, rather than on established systems and representations, which motivated the study of external sources in usage-based language research. For example, in analyzing the differences between the grammatical notions of subject and topic, Li and Thompson (1976) found that the repetition of certain topics by a speech community resulted in the surfacing and crystallization of formal properties into a syntactic entity, namely the subject. [7] [8] [9] This notion of syntax and morphology being an outcome of pragmatic and cognitive factors [10] was influential in the development of usage-based models. Thirdly, the WCCF methodology of linguistic typology, [11] collecting data from real communicative contexts and analyzing them for typological regularities, is similarly practised in usage-based models. This highlights an important aspect of usage-based research: the integration of synchrony and diachrony.
Langacker’s Cognitive Grammar
The term 'usage-based' was coined by Ronald Langacker in 1987, in the course of his research on Cognitive Grammar. Langacker identified commonly recurring linguistic patterns (such as those associated with wh-fronting, subject-verb agreement, or the use of present participles) and represented these supposedly rule-governed behaviours in a hierarchical structure. The Cognitive Grammar model represented grammar, semantics and lexicon as associated processes laid out on a continuum, providing a theoretical framework that was significant for studying the usage-based conception of language. [12] A usage-based model accounts for these rule-governed language behaviours with a representational scheme that is entirely instance-based, able to recognize and uniquely represent each familiar pattern, which occurs with varying strength across instances. His usage-based model draws on the cognitive psychology of schemata, [13] flexible hierarchical structures that can accommodate the complexity of mental stimuli. Since humans perceive linguistic abstractions as multilayered, ranging from patterns that span whole utterances to patterns in the phonetic material, the usage-based model acknowledges the differing levels of granularity in speakers' knowledge of their language. Langacker's work emphasizes that language contains both abstract structure and instance-based detail, differing in granularity but not in basic principles.
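The idea of schemas abstracted over stored instances can be pictured with a small illustrative data structure. This is a loose sketch, not Langacker's own notation; the bracketed schema labels and example strings are invented for the purpose of illustration:

```python
# A loose sketch (not Langacker's notation): speakers store both concrete
# instances and schemas abstracted over them, at several levels of
# granularity. Labels and strings here are invented.
schema_network = {
    "[VERB PERSON crazy]": {            # most schematic level
        "[drive PERSON crazy]": [       # partially specified schema
            "drive me crazy",           # memorized instances
            "drive him crazy",
        ],
        "[make PERSON crazy]": [
            "make her crazy",
        ],
    },
}

def instances(node):
    # Walk the hierarchy and collect the concrete usage events at the leaves.
    if isinstance(node, list):
        return node
    return [inst for child in node.values() for inst in instances(child)]

print(instances(schema_network["[VERB PERSON crazy]"]))
# ['drive me crazy', 'drive him crazy', 'make her crazy']
```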
Bybee's Dynamic Usage-based framework
Bybee's work [14] [15] [16] [17] greatly inspired the creation of usage-based models of language. Her model predicts and explains synchronic, diachronic and typological patterns within languages, such as which variants will occur in which contexts and what forms they will take, as well as their diachronic consequences. Using the phenomenon of splits (when a word starts to show subtle polysemy, and morphological possibilities for the originally single form ensue), Bybee shows that even irreducibly irregular word-forms turn out to be non-arbitrary when the contexts they occur in are taken into consideration in the very representation of morphology. At the same time, she shows that even seemingly regular allomorphy is context-sensitive. Splits also align with the idea that linguistic forms cannot be studied as isolated entities, but rather in relation to the strength of their attachment to other entities. [18]
Hans-Jörg Schmid's "Entrenchment-and-Conventionalization" Model offers a comprehensive recent synthesis of usage-based thinking. [19] In great detail, and with reference to many sub-disciplines and concepts in linguistics, he shows how usage mediates between entrenchment, the establishment of linguistic habits in individuals via repetition and association, and conventionalization, a continuous feedback cycle that builds shared collective linguistic knowledge. All three components connect linguistic utterance types with their respective situational settings and extralinguistic associations.
Advocates of usage-based linguistics, including Joan Bybee and Martin Haspelmath, argue that the grammatical behaviour of linguistic items depends on their frequency of use. For instance, it is argued that the English verb tell always has two arguments ('tell something to someone'), unlike the verb sell, which in actual language usage more frequently takes only a direct object ('sell something'). Such differences in the recurrence of the indirect object are hypothesized to result from statistical learning over the language usage an individual encounters. Jae Jung Song argues that the frequency explanation is circular (certain patterns are often used by people because they are frequent) and that the explanation of frequency effects must be sought outside frequency itself. [20]
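The kind of distributional statistic at issue can be illustrated with a toy sketch. The mini-corpus and the counts below are invented for illustration; they are not data from the cited literature:

```python
# Illustrative sketch: counting how often verbs occur with a recipient
# argument in a toy corpus, the kind of distributional statistic that
# usage-based accounts assume learners track.
from collections import Counter

# Hypothetical mini-corpus of parsed clauses: (verb, has_indirect_object)
corpus = [
    ("tell", True), ("tell", True), ("tell", True), ("tell", True),
    ("sell", False), ("sell", False), ("sell", True), ("sell", False),
]

with_recipient = Counter()
total = Counter()
for verb, has_recipient in corpus:
    total[verb] += 1
    if has_recipient:
        with_recipient[verb] += 1

for verb in total:
    rate = with_recipient[verb] / total[verb]
    print(f"{verb}: recipient argument in {rate:.0%} of uses")
# tell: recipient argument in 100% of uses
# sell: recipient argument in 25% of uses
```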
Constructions

The usage-based model adopts constructions as the basic unit of form-meaning correspondence. [22] [23] [24] [25] Constructions pair form directly with meaning, without intermediate structures, which makes them well suited to usage-based models. A construction is commonly regarded as a conventionalized string of words. A key feature of a grammar based on constructions is that it can reflect how deeply intertwined lexical items and grammatical structure are.
From a grammarian's perspective, constructions are groupings of words that show idiosyncratic behaviour to some extent: they typically take on an unpredictable meaning or pragmatic effect, or are formally special. From a broader perspective, constructions can also be seen as processing units or chunks, i.e. sequences of words (or morphemes) that have been used together often enough to be accessed as a unit. This implies that common word sequences can be constructions even if they lack idiosyncrasies of meaning or form. [26] [27] [28] Additionally, chunks or conventionalized sequences tend to develop special pragmatic implications that can lead to special meaning over time, and they can also develop idiosyncrasies of form in a variety of ways.
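A minimal sketch of a construction as a form-meaning pairing with a schematic slot might look as follows. The class, its fields and the matching rule are illustrative assumptions, not a standard formalism:

```python
# A minimal sketch of a construction as a conventionalized form-meaning
# pairing with schematic slots; names and fields are invented, not a
# standard construction-grammar formalism.
from dataclasses import dataclass

@dataclass
class Construction:
    form: list[str]   # fixed words plus schematic slots (written in caps)
    meaning: str      # conventionalized semantic/pragmatic value

    def matches(self, tokens: list[str]) -> bool:
        # A slot (uppercase element) matches any single token;
        # fixed words must match exactly.
        return len(tokens) == len(self.form) and all(
            slot.isupper() or slot == tok
            for slot, tok in zip(self.form, tokens)
        )

drive_crazy = Construction(
    form=["drive", "X", "crazy"],
    meaning="cause X to become exasperated",
)
print(drive_crazy.matches(["drive", "me", "crazy"]))  # True
print(drive_crazy.matches(["drive", "to", "work"]))   # False ('crazy' missing)
```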
Exemplar representation

Exemplar models propose that memory for linguistic experience is like memory for other kinds of experience. Every token of linguistic experience impacts cognitive representation, and when stored representations are accessed, they change. Memory can also retain detailed information about processed tokens, including their form and the contexts in which they were used. In this model, general categories and grammatical units emerge from linguistic experiences stored in memory, as exemplars are categorized by their similarity to one another; expressions such as drive someone crazy, drive someone mad and drive someone up the wall are stored as semantically related variants of one another. Contiguous aspects of an experience, such as meaning and acoustic shape, are likewise recorded and linked to each other.
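A toy exemplar memory along these lines might be sketched as follows. The similarity measure, stored items and category labels are invented for illustration; real exemplar models assume rich phonetic and semantic dimensions:

```python
# A toy exemplar memory: every experienced token is stored with its form
# and context, and a new token is categorized by its most similar stored
# exemplar. Similarity here is crude string overlap, purely illustrative.
from difflib import SequenceMatcher

memory = []  # each exemplar: (form, context, category)

def store(form, context, category):
    memory.append((form, context, category))

def categorize(form):
    # Find the stored exemplar whose form is most similar to the new token.
    best = max(memory, key=lambda ex: SequenceMatcher(None, ex[0], form).ratio())
    return best[2]

store("drive me crazy", "complaint about noise", "DRIVE-CRAZY")
store("drive her mad", "complaint about delays", "DRIVE-CRAZY")
store("drive to work", "commuting narrative", "MOTION")

print(categorize("drives us mad"))    # DRIVE-CRAZY
print(categorize("drove to school"))  # MOTION
```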
Constructions as chunks
Memory storage requires links that connect the parts of idiomatic phrases together. In chunking, repeated sequences come to be represented as units which can be accessed directly. [29] [30] By this means, repeated sequences become more fluent. Within a chunk, sequential links are graded in strength based on the frequency of the chunk, or perhaps on the transitions between its elements. A construction is a chunk even though it may contain schematic slots; that is, the elements of a chunk can be interrupted. In addition, the individual elements of a chunk can link to the same elements in other contexts: drive someone crazy forms a chunk whose component items are nonetheless analyzable as words that occur elsewhere in cognitive representation. As chunks are used more frequently, however, their words can lose these associations with exemplars of the same word. This is known as de-categorialization.
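The graded sequential links described here can be approximated with transitional probabilities over a toy corpus. This is a rough sketch under invented data; actual chunking models are considerably richer:

```python
# A rough sketch of graded sequential links: the forward transitional
# probability P(w2 | w1) in a toy corpus, where a high value suggests a
# strong link between adjacent elements of a chunk.
from collections import Counter

tokens = ("you drive me crazy and she drives him crazy but "
          "they drive to work every day").split()

bigrams = Counter(zip(tokens, tokens[1:]))
unigrams = Counter(tokens)

def transition_strength(w1, w2):
    # How strongly w1 predicts w2 in this toy corpus.
    return bigrams[(w1, w2)] / unigrams[w1]

print(transition_strength("me", "crazy"))  # 1.0 -> strong sequential link
print(transition_strength("drive", "to"))  # 0.5 -> weaker link
```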