David Roach Dowty (born 1945[1]) is a linguist known primarily for his work in semantic and syntactic theory, especially Montague grammar and categorial grammar. Dowty is a professor emeritus of linguistics at the Ohio State University, and his research interests lie mainly in semantic and syntactic theory, lexical semantics and thematic roles, categorial grammar, and the semantics of tense and aspect.
David Dowty received his PhD from the University of Texas at Austin, with a thesis supervised by Robert Wall and Emmon Bach on the temporal semantics of verbs. [2]
Dowty was editor-in-chief of the journal Linguistics and Philosophy from 1988 to 1992, and an associate editor of Language. For several years he was chairman of the Department of Linguistics at the Ohio State University. A one-day symposium was held at the University of Groningen in honour of his sixtieth birthday, and its proceedings were subsequently published as Theory and Evidence in Semantics. [2]
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
Richard Merritt Montague was an American mathematician and philosopher who made contributions to mathematical logic and the philosophy of language. He is known for proposing Montague grammar to formalize the semantics of natural language. As a student of Alfred Tarski, he also contributed early developments to axiomatic set theory (ZFC). For the latter half of his life, he was a professor at the University of California, Los Angeles until his early death, believed to be a homicide, at age 40.
Montague grammar is an approach to natural language semantics, named after the American logician Richard Montague. It is based on mathematical logic, especially higher-order predicate logic and the lambda calculus, and it employs intensional logic interpreted via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.
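A minimal illustration of the compositional style Montague introduced (a simplified, textbook-style sketch; the particular translations are not drawn from this article's sources): the determiner every can be translated as a function over two predicates, so that a quantified sentence is built up by successive function application.

every ⇝ λP.λQ.∀x(P(x) → Q(x))
every man ⇝ λQ.∀x(man(x) → Q(x))
Every man walks. ⇝ ∀x(man(x) → walk(x))

Each syntactic combination is paired with the application of one lambda term to another, which is the sense in which the syntax and the semantics are made to work in tandem.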
Higher order grammar (HOG) is a grammar theory based on higher-order logic. It can be viewed as simultaneously generative-enumerative and model-theoretic.
Categorial grammar is a family of formalisms in natural language syntax that share the central assumption that syntactic constituents combine as functions and arguments. Categorial grammar posits a close relationship between syntax and semantic composition, since it typically treats syntactic categories as corresponding to semantic types. Categorial grammars were developed in the 1930s by Kazimierz Ajdukiewicz and in the 1950s by Yehoshua Bar-Hillel and Joachim Lambek. The approach saw a surge of interest in the 1970s following the work of Richard Montague, whose Montague grammar assumed a similar view of syntax, and it continues to be a major paradigm, particularly within formal semantics.
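As a schematic example (in one common notation; the particular category assignments are illustrative and not drawn from this article's sources), an intransitive verb can be assigned the category S\NP, a function seeking a noun phrase to its left and returning a sentence, and the semantic types mirror the categories:

John ⊢ NP    walks ⊢ S\NP
John walks ⊢ S    (by backward function application: NP, S\NP ⇒ S)
NP corresponds to type e, S to type t, and S\NP to type e → t

The syntactic step of combining NP with S\NP thus corresponds directly to applying the function denoted by the verb to the individual denoted by the subject.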
Theta roles are the names of the participant roles associated with a predicate; the predicate may be a verb, an adjective, a preposition, or a noun. If an object is in motion or in a steady state as the speaker perceives it, or if it is the topic of discussion, it is called a theme. The participant is usually said to be an argument of the predicate. In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure, that is, the number and type of noun phrases required syntactically by a particular verb. For example, the verb put requires three arguments.
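As an illustration (the particular role labels are one common textbook choice, not drawn from this article's sources), the argument structure of put can be written as a theta grid and matched against a sentence:

put: ⟨Agent, Theme, Location⟩
Kim put the book on the shelf.    (Kim = Agent; the book = Theme; on the shelf = Location)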
According to some linguistic theories, a stative verb is a verb that describes a state of being, in contrast to a dynamic verb, which describes an action. The difference can be categorized by saying that stative verbs describe situations that are static, or unchanging throughout their entire duration, whereas dynamic verbs describe processes that entail change over time. Many languages distinguish between these two types in terms of how they can be used grammatically.
In linguistics, a grammatical agent is the thematic relation of the cause or initiator to an event. The agent is a semantic concept distinct from the subject of a sentence as well as from the topic. While the subject is determined syntactically, primarily through word order, the agent is determined through its relationship to the action expressed by the verb. For example, in the sentence "The little girl was bitten by the dog", girl is the subject, but dog is the agent.
Charles J. Fillmore was an American linguist and Professor of Linguistics at the University of California, Berkeley. He received his Ph.D. in Linguistics from the University of Michigan in 1961. Fillmore spent ten years at Ohio State University and a year as a Fellow at the Center for Advanced Study in the Behavioral Sciences at Stanford University before joining Berkeley's Department of Linguistics in 1971. Fillmore was extremely influential in the areas of syntax and lexical semantics.
Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".
Semantic bootstrapping is a linguistic theory of child language acquisition which proposes that children can acquire the syntax of a language by first learning and recognizing semantic elements and building upon, or bootstrapping from, that knowledge. This theory proposes that children, when acquiring words, will recognize that words label conceptual categories, such as objects or actions. Children will then use these semantic categories as a cue to the syntactic categories, such as nouns and verbs. Having identified particular words as belonging to a syntactic category, they will then look for other correlated properties of those categories, which will allow them to identify how nouns and verbs are expressed in their language. Additionally, children will use perceived conceptual relations, such as Agent of an event, to identify grammatical relations, such as Subject of a sentence. This knowledge, in turn, allows the learner to look for other correlated properties of those grammatical relations.
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Lucien Tesnière (1959).
In certain theories of linguistics, thematic relations, also known as semantic roles, are the various roles that a noun phrase may play with respect to the action or state described by a governing verb, commonly the sentence's main verb. For example, in the sentence "Susan ate an apple", Susan is the doer of the eating, so she is an agent; an apple is the item that is eaten, so it is a patient.
The linguistics wars were extended disputes among American theoretical linguists that occurred mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates started in 1967 when the linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, the self-dubbed "Four Horsemen of the Apocalypse", proposed an alternative approach in which the relation between semantics and syntax is viewed differently, treating deep structures as meanings rather than as syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks, generative semantics and interpretive semantics.
Glue semantics, or simply Glue, is a linguistic theory of semantic composition and the syntax–semantics interface which assumes that meaning composition is constrained by a set of instructions stated within a formal logic. These instructions, called meaning constructors, state how the meanings of the parts of a sentence can be combined to provide the meaning of the sentence.
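A schematic example of a meaning constructor, in the linear-logic style commonly used in the Glue literature (the particular names and resource labels here are illustrative assumptions, not drawn from this article's sources): the constructor for an intransitive verb consumes the resource associated with its subject and produces the resource associated with the sentence.

Kim : g
λx.leave(x) : g ⊸ f
leave(Kim) : f    (by the linear-logic analogue of function application)

Here g and f stand for the resources associated with the subject and the sentence respectively, and ⊸ is linear implication.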
In semantics, a donkey sentence is a sentence containing a pronoun which is semantically bound but syntactically free. Such sentences are a classic puzzle in formal semantics and the philosophy of language because they are fully grammatical and yet defy straightforward attempts to generate their formal language equivalents. In order to explain how speakers are able to understand them, semanticists have proposed a variety of formalisms, including systems of dynamic semantics such as Discourse representation theory. Their name comes from the example sentence "Every farmer who owns a donkey beats it", in which "it" acts as a donkey pronoun because it is semantically but not syntactically bound by the indefinite noun phrase "a donkey". The phenomenon is known as donkey anaphora.
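The difficulty can be seen in a standard first-order sketch (a textbook-style illustration, not drawn from this article's sources). Translating "a donkey" as an existential quantifier leaves the pronoun's variable unbound, while the paraphrase that captures the intended truth conditions has to render the indefinite as a universal:

∀x((farmer(x) ∧ ∃y(donkey(y) ∧ own(x, y))) → beat(x, y))    (the final occurrence of y is unbound)
∀x∀y((farmer(x) ∧ donkey(y) ∧ own(x, y)) → beat(x, y))    (correct truth conditions, but the indefinite is no longer translated as an existential)

Dynamic approaches such as Discourse representation theory were developed in part to resolve this mismatch.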
Formal semantics is the study of grammatical meaning in natural languages using formal concepts from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.
John A. Nerbonne is an American computational linguist. He was a professor of humanities computing at the University of Groningen until January 2017, when he gave his valedictory address at the celebration of the 30th anniversary of his department there.
In linguistics, the syntax–semantics interface is the interaction between syntax and semantics. Its study encompasses phenomena that pertain to both syntax and semantics, with the goal of explaining correlations between form and meaning. Specific topics include scope, binding, and lexical semantic properties such as verbal aspect and nominal individuation, semantic macroroles, and unaccusativity.