Ivan Andrew Sag (November 9, 1949 – September 10, 2013) was an American linguist and cognitive scientist. He did research in syntax and semantics as well as in computational linguistics.
Born in Alliance, Ohio, on November 9, 1949, [1] [2] Sag attended the Mercersburg Academy but was expelled shortly before graduation. [3] He received a BA from the University of Rochester, an MA from the University of Pennsylvania—where he studied comparative Indo-European languages, Sanskrit, and sociolinguistics—and a PhD from MIT in 1976, writing his dissertation (advised by Noam Chomsky) on ellipsis. [4]
Sag received a Mellon Fellowship at Stanford University in 1978–79 and remained in California from that point on. He was appointed to a position in Linguistics at Stanford and earned tenure there. He died of cancer in 2013. He was married to sociolinguist Penelope Eckert. [5]
Sag made notable contributions to the fields of syntax, semantics, pragmatics, and language processing. His early work was as a member of the research teams that invented and developed head-driven phrase structure grammar (HPSG) as well as generalized phrase structure grammar, HPSG's immediate intellectual predecessor. Later, he worked on Sign-Based Construction Grammar, which blended HPSG with ideas from Berkeley Construction Grammar.
He was the author or co-author of 10 books and over 100 articles. His research late in life primarily concerned constraint-based, lexicalist models of grammar and their relation to theories of language processing.
Sag was the Sadie Dernham Patek Professor in Humanities, Professor of Linguistics, and Director of the Symbolic Systems Program [6] at Stanford University. A fellow of the American Academy of Arts and Sciences and the Linguistic Society of America, in 2005 he received the LSA's Fromkin Prize for distinguished contributions to the field of linguistics. [7]
A volume of studies in his honor, The Core and the Periphery: Data-Driven Perspectives on Syntax Inspired by Ivan A. Sag, edited by Philip Hofmeister and Elisabeth Norcliffe, was published in 2013.
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
Head-driven phrase structure grammar (HPSG) is a highly lexicalized, constraint-based grammar developed by Carl Pollard and Ivan Sag. It is a type of phrase structure grammar, as opposed to a dependency grammar, and it is the immediate successor to generalized phrase structure grammar. HPSG draws from other fields such as computer science and uses Ferdinand de Saussure's notion of the sign. It uses a uniform formalism and is organized in a modular way, which makes it attractive for natural language processing.
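As a rough illustration of the constraint-based mechanics behind HPSG, the sketch below implements unification over plain nested dictionaries. This is a simplification: real HPSG uses typed feature structures with structure sharing, and all names here are invented for the example.

```python
# A minimal sketch of feature-structure unification, the core operation
# behind HPSG-style constraint-based grammars. Real HPSG uses *typed*
# feature structures with structure sharing; this toy version unifies
# plain nested dicts and is purely illustrative.

def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on clash."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for feat, val in fs2.items():
            if feat in result:
                sub = unify(result[feat], val)
                if sub is None:          # feature values clash
                    return None
                result[feat] = sub
            else:
                result[feat] = val
        return result
    return fs1 if fs1 == fs2 else None   # atomic values must match exactly

# A verb's demand for a third-person-singular subject unifies with a
# singular noun phrase but clashes with a plural one:
verb_subj = {"head": {"agr": {"per": "3rd", "num": "sg"}}}
dog       = {"head": {"agr": {"per": "3rd", "num": "sg"}, "noun": True}}
dogs      = {"head": {"agr": {"per": "3rd", "num": "pl"}, "noun": True}}

print(unify(verb_subj, dog))   # merged structure
print(unify(verb_subj, dogs))  # None: number clash
```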
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
Generalized phrase structure grammar (GPSG) is a framework for describing the syntax and semantics of natural languages. It is a type of constraint-based phrase structure grammar. Constraint-based grammars are built around defining certain syntactic processes as ungrammatical for a given language and assuming that everything not thus ruled out is grammatical within that language. Phrase structure grammars base their framework on constituency relationships, treating the words in a sentence as hierarchically ranked, with some words dominating others. For example, in the sentence "The dog runs", "runs" is seen as dominating "dog" since it is the main focus of the sentence. This view stands in contrast to dependency grammars, which base their assumed structure on the relationship between a single word in a sentence and its dependents.
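The "rule things out" character of constraint-based grammars can be illustrated with a toy filter: candidate word orders are enumerated, constraints mark some as ungrammatical, and whatever survives counts as grammatical. The constraints below are invented for a minimal English fragment.

```python
# A toy illustration of the constraint-based view: instead of listing the
# grammatical sentences directly, we state constraints that rule orders
# out, and whatever is not excluded counts as grammatical. The constraints
# are invented for this tiny fragment.
from itertools import permutations

WORDS = ["the", "dog", "runs"]

def violates_constraints(order):
    # Constraint 1: the determiner "the" must immediately precede its noun.
    if order.index("the") + 1 != order.index("dog"):
        return True
    # Constraint 2: the subject noun phrase must precede the verb.
    if order.index("dog") > order.index("runs"):
        return True
    return False

grammatical = [" ".join(p) for p in permutations(WORDS)
               if not violates_constraints(p)]
print(grammatical)  # ['the dog runs']
```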
A symbolic linguistic representation is a representation of an utterance that uses symbols to represent linguistic information about the utterance, such as information about phonetics, phonology, morphology, syntax, or semantics. Symbolic linguistic representations are different from non-symbolic representations, such as recordings, because they use symbols to represent linguistic information rather than measurements.
Theta roles are the names of the participant roles associated with a predicate; the predicate may be a verb, an adjective, a preposition, or a noun. The participant is usually said to be an argument of the predicate. For example, an entity that is in motion or in a steady state as the speaker perceives it, or that is the topic of discussion, is called a theme. In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.
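The argument-counting function of a theta grid can be sketched as a lexicon lookup; the role labels and entries below are illustrative rather than drawn from any published lexicon.

```python
# A sketch of theta grids as a lexicon mapping verbs to the theta roles
# they assign. The role labels and entries are invented for illustration.

THETA_GRIDS = {
    "put":  ["agent", "theme", "location"],  # "Kim put the book on the shelf"
    "run":  ["agent"],
    "give": ["agent", "theme", "recipient"],
}

def check_arguments(verb, args):
    """Pair supplied arguments with the verb's theta roles, or report a mismatch."""
    roles = THETA_GRIDS[verb]
    if len(args) != len(roles):
        return f"{verb!r} needs {len(roles)} argument(s), got {len(args)}"
    return dict(zip(roles, args))

print(check_arguments("put", ["Kim", "the book", "on the shelf"]))
print(check_arguments("put", ["Kim", "the book"]))  # one argument short
```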
The term phrase structure grammar was originally introduced by Noam Chomsky as the term for grammars of the kind studied previously by Emil Post and Axel Thue. Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining character of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.
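The contrast between the two relations can be illustrated by encoding the same sentence both ways; the category labels below are conventional, but the data structures are only a sketch.

```python
# The same sentence under the two relations. Constituency groups words
# into nested phrases; dependency links each word directly to its head.

constituency = ("S",
                ("NP", ("Det", "the"), ("N", "dog")),
                ("VP", ("V", "runs")))

dependency = {          # head -> dependents
    "runs": ["dog"],    # the verb is the root; the subject noun depends on it
    "dog":  ["the"],    # the determiner depends on the noun
}

print(constituency)
print(dependency)
```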
Construction grammar is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words, morphemes, fixed expressions and idioms, and abstract grammatical rules such as the passive voice or the ditransitive. Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form.
In linguistics, valency or valence is the number and type of arguments and complements controlled by a predicate, content verbs being typical predicates. Valency is related, though not identical, to subcategorization and transitivity, which count only object arguments – valency counts all arguments, including the subject. The linguistic meaning of valency derives from the definition of valency in chemistry. As in chemistry, valency involves the binding of specific elements: in the grammatical theory of valency, verbs organize sentences by binding specific elements such as complements and actants. Although the term originates from valence in chemistry, linguistic valency has a close analogy in mathematics under the term arity.
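The arity analogy can be made concrete in code: a function's arity, like a verb's valency, is the number of arguments it binds. The verb–function pairings below are illustrative.

```python
# The arity analogy made concrete: just as a verb's valency is the number
# of arguments it binds (including the subject), a function's arity is the
# number of parameters it takes. The pairings here are invented examples.
import inspect

def sleep(sleeper):                 # valency 1, like intransitive "sleep"
    ...

def devour(eater, meal):            # valency 2, like transitive "devour"
    ...

def put(agent, theme, location):    # valency 3, like the verb "put"
    ...

for fn in (sleep, devour, put):
    arity = len(inspect.signature(fn).parameters)
    print(f"{fn.__name__}: arity {arity}")
```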
Arnold Melchior Zwicky is an adjunct professor of linguistics at Stanford University and Distinguished University Professor Emeritus of linguistics at the Ohio State University. The Linguistic Society of America’s Arnold Zwicky Award, given for the first time in 2021, is intended to recognize the contributions of LGBTQ+ scholars in linguistics and is named for Zwicky, the first LGBTQ+ President of the LSA.
Glue semantics, or simply Glue, is a linguistic theory of semantic composition and the syntax–semantics interface which assumes that meaning composition is constrained by a set of instructions stated within a formal logic. These instructions, called meaning constructors, state how the meanings of the parts of a sentence can be combined to provide the meaning of the sentence.
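A heavily simplified sketch of the idea follows: word meanings and a quantifier's meaning constructor are modeled as functions, and composition as function application. Real Glue pairs each meaning with a linear-logic formula that governs how it may combine; that side is omitted here, so this shows only the flavor of meaning constructors, not the theory itself.

```python
# A much-simplified sketch of meaning composition via function application.
# The constructors and formula strings below are invented for illustration.

dog    = lambda x: f"dog({x})"
sleeps = lambda x: f"sleep({x})"

# A hypothetical constructor for "every": it takes the restriction and the
# scope and builds a quantified formula.
every = lambda restr: lambda scope: (
    f"forall x. {restr('x')} -> {scope('x')}"
)

# Composing "every dog sleeps" by applying constructors to one another:
print(every(dog)(sleeps))  # forall x. dog(x) -> sleep(x)
```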
Syntactic movement is the means by which some theories of syntax address discontinuities. Movement was first postulated by structuralist linguists who expressed it in terms of discontinuous constituents or displacement. Some constituents appear to have been displaced from the position in which they receive important features of interpretation. The concept of movement is controversial and is associated with so-called transformational or derivational theories of syntax. Representational theories, in contrast, reject the notion of movement and often instead address discontinuities with other mechanisms including graph reentrancies, feature passing, and type shifters.
Situation semantics, within situation theory, attempts to provide a solid theoretical foundation for reasoning about common-sense and real-world situations, typically in the context of theoretical linguistics, theoretical philosophy, or applied natural language processing.
In linguistics, subcategorization denotes the ability or need of lexical items to require or allow the presence of particular types of syntactic arguments with which they co-occur. For example, the word "walk" as in "X walks home" requires the noun phrase X to be animate.
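Subcategorization frames can be sketched as lexical entries that state the arguments a word demands; the frames and animacy markings below are invented for illustration.

```python
# A sketch of subcategorization frames as lexical entries stating what
# arguments a word requires. The entries and features are illustrative.

LEXICON = {
    "walk":   {"subject": {"cat": "NP", "animate": True}},
    "elapse": {"subject": {"cat": "NP", "animate": False}},
}

NOUN_PHRASES = {
    "the dog":  {"cat": "NP", "animate": True},
    "the hour": {"cat": "NP", "animate": False},
}

def satisfies(verb, subject):
    """Check whether a subject noun phrase meets the verb's frame."""
    frame = LEXICON[verb]["subject"]
    np = NOUN_PHRASES[subject]
    return all(np.get(k) == v for k, v in frame.items())

print(satisfies("walk", "the dog"))   # True
print(satisfies("walk", "the hour"))  # False: "walk" wants an animate subject
```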
Thomas A. Wasow is Professor of Linguistics, emeritus, and the Clarence Irving Lewis Professor of Philosophy, emeritus, at Stanford University.
Deep Linguistic Processing with HPSG - INitiative (DELPH-IN) is a collaboration in which computational linguists worldwide develop natural language processing tools for deep linguistic processing of human language. The goal of DELPH-IN is to combine linguistic and statistical processing methods in order to computationally understand the meaning of texts and utterances.
Eloise Jelinek was an American linguist specializing in the study of syntax. Her 1981 doctoral dissertation at the University of Arizona was titled "On Defining Categories: AUX and PREDICATE in Colloquial Egyptian Arabic". She was a member of the faculty of the University of Arizona from 1981 to 1992.
Minimal recursion semantics (MRS) is a framework for computational semantics. It can be implemented in typed feature structure formalisms such as head-driven phrase structure grammar and lexical functional grammar. It is suitable for computational language parsing and natural language generation. MRS enables a simple formulation of the grammatical constraints on lexical and phrasal semantics, including the principles of semantic composition. This technique is used in machine translation.
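The flat, handle-linked shape of an MRS structure can be sketched as a small data structure; the handle names and predicate labels below are illustrative and omit most of the real formalism.

```python
# A simplified sketch of the shape of an MRS structure: a flat bag of
# elementary predications (EPs) linked by handles, rather than one nested
# formula. Handle names and predicate labels are illustrative, and the
# handle constraints are merely recorded, not resolved.

from dataclasses import dataclass, field

@dataclass
class EP:
    handle: str          # label of this predication
    predicate: str
    args: dict

@dataclass
class MRS:
    top: str
    eps: list = field(default_factory=list)
    hcons: list = field(default_factory=list)  # handle constraints

# "Every dog sleeps", with the quantifier's scope left underspecified:
m = MRS(
    top="h0",
    eps=[
        EP("h1", "_every_q", {"ARG0": "x", "RSTR": "h2", "BODY": "h3"}),
        EP("h4", "_dog_n",   {"ARG0": "x"}),
        EP("h5", "_sleep_v", {"ARG1": "x"}),
    ],
    hcons=[("h2", "qeq", "h4"), ("h0", "qeq", "h5")],
)
print(m)
```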
Georgia M. Green is an American linguist and academic. She is an emeritus professor at the University of Illinois at Urbana-Champaign. Her research has focused on pragmatics, speaker intention, word order, and meaning. She has been an advisory editor for several linguistics journals and publishers, and she serves on the usage committee for the American Heritage Dictionary.
Emily Menon Bender is an American linguist who is a professor at the University of Washington. She specializes in computational linguistics and natural language processing. She is also the director of the University of Washington's Computational Linguistics Laboratory. She has published several papers on the risks of large language models and on ethics in natural language processing.