Rochelle Lieber is an American linguist and Professor of Linguistics at the University of New Hampshire, known for her work on morphology, the syntax-morphology interface, and lexical semantics. [1]
After receiving an artium baccalaureus degree in anthropology from Vassar College (1976), Lieber studied linguistics at the Massachusetts Institute of Technology, receiving her Ph.D. in 1980. Her dissertation, On the Organization of the Lexicon, was written under the direction of Morris Halle. [2] In this work she proposed "feature percolation," a mechanism by which the features of lexical items are passed up to the larger structures that contain them, an idea she articulates more fully in Lieber 1992 (77ff). Syntacticians and morphologists have used feature percolation in many different ways since Lieber's original proposal. [3] [4] [5]
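To take a simple illustration (a standard textbook example rather than one of Lieber's own): in the plural noun dogs, the plural feature of the suffix -s percolates up to the word as a whole, so that dogs is itself marked as a plural noun.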
Professor Lieber has taught at the University of New Hampshire since 1981. She received the University of New Hampshire Award for Excellence in Teaching in 1991.
Lieber is the author of Deconstructing Morphology: Word Formation in Syntactic Theory (Chicago: University of Chicago Press, 1992), an influential attempt to reduce morphology to the syntactic principles of government and binding theory. In Deconstructing Morphology, Lieber makes two statements that are often quoted: "no one has yet succeeded in deriving the properties of words and the properties of sentences from the same principles of grammar," and "the conceptually simplest possible theory would then be the one in which all morphology is done as a part of syntax" (Lieber 1992: 21).
Lieber's monograph, Morphology and Lexical Semantics (Cambridge: Cambridge University Press, 2004), is the first attempt to develop a theory of the lexical semantics of derivation and compounding.
In addition to her monographs, she has written numerous articles and book chapters on morphology.
She served as the co-editor of the Wiley-Blackwell Language and Linguistics Compass.
In 2015 she and co-authors Laurie Bauer and Ingo Plag were the recipients of the Linguistic Society of America's Leonard Bloomfield Book Award for their 2013 work, The Oxford Reference Guide to English Morphology. [6]
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
Semantics is the study of reference, meaning, or truth. The term can be used to refer to subfields of several distinct disciplines, including philosophy, linguistics and computer science.
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation, first proposed by Noam Chomsky in 1970 and further developed by Ray Jackendoff, along the lines of the theory of generative grammar put forth in the 1950s by Chomsky. It attempts to capture the structure of phrasal categories with a single uniform template, the X-bar schema, based on the assumption that any phrase in natural language is an XP headed by some syntactic category X. The theory played a significant role in resolving problems with phrase structure rules, most notably the proliferation of rules, which runs counter to the goals of generative grammar.
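In one common simplified formulation, the schema amounts to two rules,

XP → (specifier) X′
X′ → X (complement)

where X ranges over lexical categories such as N, V, A, and P, so that, for example, a verb phrase (VP) is the maximal projection of its head verb (V).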
Lexical functional grammar (LFG) is a constraint-based grammar framework in theoretical linguistics. It posits two separate levels of syntactic structure: a phrase structure grammar representation of word order and constituency (constituent structure, or c-structure), and a representation of grammatical functions such as subject and object (functional structure, or f-structure), similar to dependency grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the transformational grammar that was current at the time. LFG mainly focuses on syntax, including its relation with morphology and semantics; there has been little LFG work on phonology.
In generative linguistics, Distributed Morphology is a theoretical framework introduced in 1993 by Morris Halle and Alec Marantz. The central claim of Distributed Morphology is that there is no divide between the construction of words and sentences. The syntax is the single generative engine that forms sound-meaning correspondences, both complex phrases and complex words. This approach challenges the traditional notion of the Lexicon as the unit where derived words are formed and idiosyncratic word-meaning correspondences are stored. In Distributed Morphology there is no unified Lexicon as in earlier generative treatments of word-formation. Rather, the functions that other theories ascribe to the Lexicon are distributed among other components of the grammar.
Ray Jackendoff is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed to both the existence of an innate universal grammar and to giving an account of language that is consistent with the current understanding of the human mind and cognition.
In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure (the number and type of noun phrases) required syntactically by a particular verb. For example, the verb put requires three arguments: an agent (the one who puts), a theme (the thing put), and a location, as in Kim put the book on the shelf.
Conceptual semantics is a framework for semantic analysis developed mainly by Ray Jackendoff in 1976. Its aim is to provide a characterization of the conceptual elements by which a person understands words and sentences, and thus to provide an explanatory semantic representation. Explanatory in this sense refers to the ability of a given linguistic theory to describe how a component of language is acquired by a child.
In linguistics, valency or valence is the number and type of arguments controlled by a predicate, content verbs being typical predicates. Valency is related, though not identical, to subcategorization and transitivity, which count only object arguments; valency counts all arguments, including the subject. The linguistic meaning of valency derives from the definition of valency in chemistry. The valency metaphor first appeared in linguistics in Charles Sanders Peirce's essay "The Logic of Relatives" in 1897, and it resurfaced in the works of a number of linguists in the late 1940s and 1950s. Lucien Tesnière is credited most with having established the valency concept in linguistics. A major authority on the valency of English verbs is Allerton (1982), who made the important distinction between semantic and syntactic valency.
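For example, sleep is monovalent (Sam slept), devour is divalent (Sam devoured the pizza), and give is trivalent (Sam gave Alex the keys); a subcategorization frame for give, by contrast, would count only the two object arguments.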
The term predicate is used in one of two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other views it as just the main content verb or associated predicative expression of a clause. Thus, by the first definition the predicate of the sentence Frank likes cake is likes cake. By the second definition, the predicate of the same sentence is just the content verb likes, whereby Frank and cake are the arguments of this predicate. Differences between these two definitions can lead to confusion.
John Robert "Haj" Ross is an American poet and linguist. He played a part in the development of generative semantics along with George Lakoff, James D. McCawley, and Paul Postal. He was a professor of linguistics at MIT from 1966 to 1985, has worked in Brazil, Singapore, and British Columbia, and taught at the University of North Texas until spring 2021.
Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".
In linguistics, nominalization or nominalisation is the use of a word that is not a noun as a noun, or as the head of a noun phrase. This change in functional category can occur through morphological transformation, but it need not. Nominalization can refer, for instance, to the process of producing a noun from another part of speech by adding a derivational affix, but it can also refer to the complex noun that is formed as a result.
Heidi Britton Harley is a Professor of Linguistics at the University of Arizona. Her areas of specialization are formal syntactic theory, morphology, and lexical semantics.
Angelika Kratzer is a professor emerita in the Department of Linguistics at the University of Massachusetts Amherst.
In certain theories of linguistics, thematic relations, also known as semantic roles, are the various roles that a noun phrase may play with respect to the action or state described by a governing verb, commonly the sentence's main verb. For example, in the sentence "Susan ate an apple", Susan is the doer of the eating, so she is an agent; an apple is the item that is eaten, so it is a patient.
Nanosyntax is an approach to syntax in which the terminal nodes of syntactic parse trees may be units smaller than a morpheme. Because each terminal is smaller than a morpheme, morphemes and words cannot be listed as single terminals; instead, each is composed of several terminals. As a result, nanosyntax can serve as a solution to phenomena that are inadequately explained by other theories of syntax.
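For instance, on a nanosyntactic analysis a single portmanteau morpheme, such as a suffix expressing both tense and agreement, is taken to spell out several terminal nodes at once rather than to occupy a single terminal.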
The Lexical Integrity Hypothesis (LIH) or Lexical Integrity Principle is a hypothesis in linguistics which states that syntactic transformations do not apply to subparts of words. It functions as a constraint on transformational grammar.
In linguistics, the syntax–semantics interface is the interaction between syntax and semantics. Its study encompasses phenomena that pertain to both syntax and semantics, with the goal of explaining correlations between form and meaning. Specific topics include scope, binding, and lexical semantic properties such as verbal aspect and nominal individuation, semantic macroroles, and unaccusativity.