| Joan Bybee | |
|---|---|
| Born | February 11, 1945 |
| Nationality | American |
| Known for | Usage-based phonology, Grammaticalization, Complex Dynamic Systems Theory |
| Fields | Phonology, Morphology, Linguistic typology, Cognitive linguistics |
| Website | unm |
Joan Lea Bybee (previously Hooper; born February 11, 1945, in New Orleans, Louisiana[1]) is an American linguist and professor emerita at the University of New Mexico. Much of her work concerns grammaticalization, stochastics, modality, morphology, and phonology. Bybee is best known for proposing the theory of usage-based phonology and for her contributions to cognitive and historical linguistics. [2]
Bybee's earliest work in linguistics was framed within a Generative perspective, the dominant theoretical approach to phonology at the time. As her career developed, Bybee's contributions moved progressively from formalist theories towards a functional and cognitive perspective, incorporating insights from morphology, semantics, syntax, child language acquisition and historical linguistics.
In the early and mid-1970s, Bybee proposed that the connection between the abstract phonological representation of a word and the actual forms experienced by language users was more direct than previously postulated. Her theory of Natural Generative Phonology elaborated upon and expanded the work of Theo Vennemann, proposing less abstract mental representations of sound structure while arguing for greater proximity between phonetic and phonological forms.
Although belonging to a formalist tradition, Bybee's early work already contained elements that challenged the competence/performance model that underlay all Generative assumptions. Natural Generative Phonology proposed that the mental representation of language results from speakers' exposure to actual language in use. [3] The proposal that the structure of language derives from actual communication rather than from abstract rules wired in the brain represented a major departure from mainstream linguistics, an idea Bybee pursued in all her subsequent work.
In 1985, Bybee published her influential volume Morphology: A Study of the Relation between Meaning and Form, in which she uncovered semantic regularities across 50 genetically and geographically diverse languages. These meaning similarities manifest themselves in recurring cross-linguistic patterns in morphological systems with respect to tense, aspect and mood. This work runs counter to Chomskyan generative theory, which describes grammar as an independent module of the brain that works in an abstract manner completely detached from semantic considerations.
Alongside linguists Dan Slobin and Carol Moder, Bybee's work helped popularize the concept of mental schemas (or schemata) to explain grammatical structure, especially in terms of connections between morphological forms within a paradigm. Bybee defines schemas as "an emergent generalization over words having similar patterns of semantic and phonological connections". [4] For instance, the English irregular verbs snuck, struck, strung, spun and hung are connected through a schema that builds on similarities between these verbs and across the lexicon: the meaning of past tense, the vowel [ʌ], the final nasal and/or (sequence of) velar consonants, as well as the initial fricative consonant /s/ or /h/.
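The schema over the strung/snuck class can be sketched as a phonological template that forms either fit or fail to fit. The following toy illustration is not Bybee's formalism: it uses a crude orthographic regular expression as a stand-in for the phonological features (initial /s/ or /h/, the vowel [ʌ] written as "u", a nasal and/or velar coda) that the schema generalizes over.

```python
import re

# Illustrative template for the past-tense schema described above:
# optional onset material after an initial s/h, the vowel spelled "u",
# then a nasal and/or velar coda. The specific pattern is an assumption
# made for this sketch, not a published formalization.
SCHEMA = re.compile(r"^[sh].*u(ng|nk|n|ck|g|k)$")

def matches_schema(past_form: str) -> bool:
    """Return True if a past-tense form fits the toy schema template."""
    return bool(SCHEMA.match(past_form))

verbs = ["snuck", "struck", "strung", "spun", "hung", "walked", "sang"]
fitting = [v for v in verbs if matches_schema(v)]
# fitting -> ["snuck", "struck", "strung", "spun", "hung"]
```

The point of the sketch is that the schema is product-oriented: it describes properties shared by the output forms themselves, rather than a rule deriving each past tense from its base.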
Connections between individual forms and schemas exist in a network (see below) whose links can be strengthened, weakened and at times also severed or created. According to Bybee, the force that binds the links in a network is actual language usage.
Informed by studies on child language development, morphological change and psycholinguistic experimentation, Bybee proposed in the late 1980s and early 1990s a model to account for the cognitive representation of morphologically complex words: the Network Model. Words entered in the lexicon have varying degrees of lexical strength, due primarily to their token frequency. Words with high lexical strength are easy to access, serve as the bases of morphological relations and exhibit an autonomy that makes them resistant to change and prone to semantic independence. [5]
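The frequency-driven core of the Network Model can be sketched in a few lines: each token of use strengthens a word's representation, and sufficiently strong words behave autonomously. This is a minimal sketch, assuming plain token counting and an arbitrary threshold; neither the decay-free counting nor the threshold value is part of Bybee's published model.

```python
from collections import Counter

def lexical_strength(corpus_tokens):
    """Token frequency as a crude proxy for lexical strength."""
    return Counter(corpus_tokens)

def autonomous(word, strengths, threshold=3):
    """High-strength words are 'autonomous': easy to access and
    resistant to change. The threshold is an illustrative assumption."""
    return strengths[word] >= threshold

corpus = "went went went went go goes went gone".split()
strengths = lexical_strength(corpus)
# "went" is high-frequency in this toy corpus, hence autonomous;
# "gone" occurs once and falls below the threshold.
```

This mirrors the claim in the paragraph above: a high-frequency irregular like went owes its stability to repeated use, not to a derivational rule.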
Diachronic studies figure prominently in Bybee's body of work. Specifically, her work has explored the ways in which grammar emerges through language use via grammaticalization. Grammaticalization describes the concept that individual words or constructions may come to express abstract grammatical meaning (e.g. future tense) as users increasingly pair frequent words with a given meaning.
Bybee served as president of the Linguistic Society of America in 2004. [6] She was named a Fellow of the Linguistic Society of America in 2006. [7]
A lexicon is the vocabulary of a language or branch of knowledge. In linguistics, a lexicon is a language's inventory of lexemes. The word lexicon derives from the Greek word λεξικόν, neuter of λεξικός meaning 'of or for words'.
Lexicology is the branch of linguistics that analyzes the lexicon of a specific language. A word is the smallest meaningful unit of a language that can stand on its own, and is made up of small components called morphemes and even smaller elements known as phonemes, or distinguishing sounds. Lexicology examines every feature of a word – including formation, spelling, origin, usage, and definition.
In linguistics, morphology is the study of words, including the principles by which they are formed, and how they relate to one another within a language. Most approaches to morphology investigate the structure of words in terms of morphemes, which are the smallest units in a language with some independent meaning. Morphemes include roots that can exist as words by themselves, but also categories such as affixes that can only appear as part of a larger word. For example, in English the root catch and the suffix -ing are both morphemes; catch may appear as its own word, or it may be combined with -ing to form the new word catching. Morphology also analyzes how words behave as parts of speech, and how they may be inflected to express grammatical categories including number, tense, and aspect. Concepts such as productivity are concerned with how speakers create words in specific contexts, which evolves over the history of a language.
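The catch + -ing example above can be illustrated with a deliberately naive suffix-stripping segmenter. This is a toy sketch under stated assumptions: the suffix list is invented for the example, and a longest-match heuristic of this kind will oversegment words like sing, which real morphological analyzers must guard against.

```python
# Illustrative suffix inventory; a real analyzer would need far more
# than this, plus a lexicon to block false splits such as "sing".
SUFFIXES = ["ing", "ed", "s"]

def segment(word):
    """Split a word into (root, suffix) using longest-match suffix stripping."""
    for suffix in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suffix) and len(word) > len(suffix):
            return word[: -len(suffix)], suffix
    return word, ""

# segment("catching") -> ("catch", "ing")
# segment("catch")    -> ("catch", "")
```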
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
In generative linguistics, Distributed Morphology is a theoretical framework introduced in 1993 by Morris Halle and Alec Marantz. The central claim of Distributed Morphology is that there is no divide between the construction of words and sentences. The syntax is the single generative engine that forms sound-meaning correspondences, both complex phrases and complex words. This approach challenges the traditional notion of the Lexicon as the unit where derived words are formed and idiosyncratic word-meaning correspondences are stored. In Distributed Morphology there is no unified Lexicon as in earlier generative treatments of word-formation. Rather, the functions that other theories ascribe to the Lexicon are distributed among other components of the grammar.
Construction grammar is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words, morphemes, fixed expressions and idioms, and abstract grammatical rules such as the passive voice or the ditransitive. Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form.
In linguistics, linguistic competence is the system of unconscious knowledge that one knows when they know a language. It is distinguished from linguistic performance, which includes all other factors that allow one to use one's language in practice.
Cognitive grammar is a cognitive approach to language developed by Ronald Langacker, which hypothesizes that grammar, semantics, and lexicon exist on a continuum instead of as separate processes altogether. This approach to language was one of the first projects of cognitive linguistics. In this system, grammar is not a formal system operating independently of meaning. Rather, grammar is itself meaningful and inextricable from semantics.
In historical linguistics, grammaticalization is a process of language change by which words representing objects and actions become grammatical markers. Thus it creates new function words from content words, rather than deriving them from existing bound, inflectional constructions. For example, the Old English verb willan 'to want', 'to wish' has become the Modern English auxiliary verb will, which expresses intention or simply futurity. Some concepts are frequently grammaticalized across languages, while others, such as evidentiality, are grammaticalized much less often.
Langue and parole is a theoretical linguistic dichotomy distinguished by Ferdinand de Saussure in his Course in General Linguistics.
In linguistics, lexicalization is the process of adding words, set phrases, or word patterns to a language's lexicon.
The linguistics wars were extended disputes among American theoretical linguists that occurred mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates started in 1967 when the linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, self-dubbed the "Four Horsemen of the Apocalypse", proposed an alternative approach to the relation between semantics and syntax, one that treated deep structures as meaning rather than as syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks: generative semantics and interpretive semantics.
In linguistics, well-formedness is the quality of a clause, word, or other linguistic element that conforms to the grammar of the language of which it is a part. Well-formed words or phrases are grammatical, meaning they obey all relevant rules of grammar. In contrast, a form that violates some grammar rule is ill-formed and does not constitute part of the language.
Linguistics is the scientific study of language. The areas of linguistic analysis are syntax, semantics (meaning), morphology, phonetics, phonology, and pragmatics. Subdisciplines such as biolinguistics and psycholinguistics bridge many of these divisions.
Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.
In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar which are especially designed to produce grammatically correct strings of words; or the likes of Functional Discourse Grammar which builds on predicate logic.
The Integrational theory of language is the general theory of language that has been developed within the general linguistic approach of integrational linguistics.
Usage-based linguistics is an approach within a broader functional/cognitive framework that emerged in the late 1980s and assumes a profound relation between linguistic structure and usage. It challenges the dominant focus, in 20th-century linguistics, on considering language as an isolated system removed from its use in human interaction and human cognition. Rather, usage-based models posit that linguistic information is expressed via context-sensitive mental processing and mental representations, which have the cognitive capacity to succinctly account for the complexity of actual language use at all levels. Broadly speaking, a usage-based model of language accounts for language acquisition and processing, synchronic and diachronic patterns, and both low-level and high-level structure in language, by looking at actual language use.