In linguistics, relational grammar (RG) is a syntactic theory which argues that primitive grammatical relations provide the ideal means to state syntactic rules in universal terms. Relational grammar began as an alternative to transformational grammar.
In relational grammar, constituents that serve as the arguments to predicates are numbered in what is called the grammatical relations (GR) hierarchy. This numbering system corresponds loosely to the notions of subject, direct object and indirect object. The numbering scheme is subject → (1), direct object → (2) and indirect object → (3). Other constituents (such as oblique, genitive, and object of comparative) are called nonterms (N). The predicate is marked (P).
According to Geoffrey K. Pullum (1977),[1] the GR hierarchy corresponds directly to the accessibility hierarchy:
| 1 | 2 | 3 | N | N | N |
|---|---|---|---|---|---|
| subject | direct object | indirect object | other oblique cases | genitive | object of comparative |
A schematic representation of a clause in this formalism might look like:
| 1 | P | 3 | 2 |
|---|---|---|---|
| John | gave | Mary | a kiss |
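This schema can be encoded, purely for illustration, as a small data structure. The Python below is a hypothetical sketch, not an official RG notation: each constituent of the clause is paired with its relational sign (RG calls such a layer of a clause a stratum; see below).

```python
# Hypothetical encoding of a single RG stratum: each constituent is
# paired with its relational sign ("1", "2", "3", "N", or "P").
stratum = [
    ("1", "John"),    # subject
    ("P", "gave"),    # predicate
    ("3", "Mary"),    # indirect object
    ("2", "a kiss"),  # direct object
]
```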
One of the components of RG theory is a set of linguistic universals stated in terms of the numbered roles presented above. One such universal is the stratal uniqueness law, which states that a stratum may contain at most one 1, one 2, and one 3.
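Under the same hypothetical encoding used above, the law can be sketched as a well-formedness check (a minimal illustration, not RG's formal statement):

```python
from collections import Counter

def satisfies_stratal_uniqueness(stratum):
    """Return True if no term relation ("1", "2", or "3") occurs
    more than once in the stratum (the stratal uniqueness law)."""
    counts = Counter(label for label, _ in stratum)
    return all(counts[term] <= 1 for term in ("1", "2", "3"))

# The stratum for "John gave Mary a kiss" above satisfies the law.
assert satisfies_stratal_uniqueness(stratum)
```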
Pullum (1977)[1] lists three further universals. However, he formulated these universals before the discovery of languages with object-initial word order; after such languages were discovered, he retracted those claims.[2]
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
In linguistics, an object is any of several types of arguments. In subject-prominent, nominative-accusative languages such as English, a transitive verb typically distinguishes between its subject and any of its objects, which can include direct objects, indirect objects, and arguments of adpositions; the latter are more accurately termed oblique arguments, a category that also covers arguments not filling core grammatical roles, such as those governed by case morphology or relational nouns. In ergative-absolutive languages, for example most Australian Aboriginal languages, the term "subject" is ambiguous, so the term "agent" is often used instead to contrast with "object", and basic word order is often described as agent-object-verb (AOV) rather than subject-object-verb (SOV). Topic-prominent languages, such as Mandarin, organize their grammars less around the subject-object or agent-object dichotomies and more around the pragmatic dichotomy of topic and comment.
In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation. It was first proposed by Noam Chomsky in 1970, reformulating ideas of Zellig Harris (1951), and was further developed by Ray Jackendoff, along the lines of the theory of generative grammar put forth by Chomsky in the 1950s. It attempts to capture the structure of phrasal categories with a single uniform structure, the X-bar schema, on the assumption that any phrase in natural language is an XP headed by a given syntactic category X. X-bar theory played a significant role in resolving problems with phrase structure rules, notably the proliferation of grammatical rules, which ran counter to the thesis of generative grammar.
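As an illustration only, the X-bar schema (XP → specifier X′; X′ → X complement) can be sketched as a recursive data type; the class and field names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class XP:
    """Hypothetical sketch of the X-bar schema: every phrase XP is
    headed by a category X, with optional specifier and complement."""
    category: str                    # the head category X: "N", "V", "P", ...
    head: str                        # the lexical head itself
    specifier: Optional["XP"] = None
    complement: Optional["XP"] = None

# "a kiss": an NP headed by "kiss", with the determiner in the
# specifier slot (a simplification of the classical analysis).
np = XP(category="N", head="kiss",
        specifier=XP(category="D", head="a"))
```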
Lexical functional grammar (LFG) is a constraint-based grammar framework in theoretical linguistics. It posits two separate levels of syntactic structure: a phrase structure grammar representation of word order and constituency, and a representation of grammatical functions such as subject and object, similar to dependency grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the transformational grammar that was then current. It mainly focuses on syntax, including its relation with morphology and semantics. There has been little LFG work on phonology.
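The two levels can be sketched, under assumed simplifications, for the clause "John likes cake": a constituent-structure tree and a functional structure of grammatical functions. Both encodings below are hypothetical illustrations, not LFG's formal notation.

```python
# c-structure: word order and constituency, as a nested tuple tree.
c_structure = ("S",
               ("NP", "John"),
               ("VP", ("V", "likes"), ("NP", "cake")))

# f-structure: grammatical functions as an attribute-value matrix,
# here rendered as a nested dict.
f_structure = {
    "PRED":  "like <SUBJ, OBJ>",
    "SUBJ":  {"PRED": "John"},
    "OBJ":   {"PRED": "cake"},
    "TENSE": "present",
}
```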
Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving ultimately from glossematics. Generative grammar considers grammar to be a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. These explicit rules may apply repeatedly to generate an indefinite number of sentences of unbounded length. One difference from structural and functional models is that in generative grammar the object is base-generated within the verb phrase. This purportedly cognitive structure is thought of as part of a universal grammar, a syntactic structure hypothesised to have arisen through a genetic mutation in humans.
In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.
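A theta grid can be sketched as the list of roles a verb assigns; the role labels below follow common usage, but the encoding and the example sentences are hypothetical illustrations.

```python
# Hypothetical theta grids: each verb mapped to the ordered list of
# theta roles it assigns to its arguments.
THETA_GRIDS = {
    "put":   ["agent", "theme", "location"],   # "Kim put the book on the shelf"
    "give":  ["agent", "theme", "recipient"],  # "John gave Mary a kiss"
    "sleep": ["agent"],                        # "Frank sleeps"
}

def required_arguments(verb: str) -> int:
    """Number of noun-phrase arguments the verb requires."""
    return len(THETA_GRIDS[verb])

assert required_arguments("put") == 3
```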
Geoffrey Keith Pullum is a British and American linguist specialising in the study of English. He is Professor Emeritus of General Linguistics at the University of Edinburgh.
Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that while it can identify phrases it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech or Warlpiri.
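For illustration, a dependency structure for "John gave Mary a kiss" can be sketched as a map from each word to its head, with the finite verb as the root; the encoding is hypothetical.

```python
# Hypothetical dependency encoding: each word points to its head;
# the finite verb "gave" is the root (head None), the structural
# center of the clause.
dependencies = {
    "gave": None,
    "John": "gave",   # subject depends on the verb
    "Mary": "gave",   # indirect object depends on the verb
    "kiss": "gave",   # direct object depends on the verb
    "a":    "kiss",   # determiner depends on its noun
}

def root(deps):
    """Return the structural center: the word with no head."""
    return next(word for word, head in deps.items() if head is None)

assert root(dependencies) == "gave"
```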
In linguistics, valency or valence is the number and type of arguments controlled by a predicate, content verbs being typical predicates. Valency is related, though not identical, to subcategorization and transitivity, which count only object arguments – valency counts all arguments, including the subject. The linguistic meaning of valency derives from the definition of valency in chemistry. The valency metaphor appeared first in linguistics in Charles Sanders Peirce's essay "The Logic of Relatives" in 1897, and it then surfaced in the works of a number of linguists in the late 1940s and 1950s. Lucien Tesnière is credited most with having established the valency concept in linguistics. A major authority on the valency of English verbs is Allerton (1982), who made the important distinction between semantic and syntactic valency.
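The distinction between valency and subcategorization can be sketched with a toy lexicon (the entries are hypothetical): valency counts all arguments including the subject, while subcategorization counts only the object arguments.

```python
# Hypothetical lexicon: each verb's arguments, subject listed first.
ARGUMENTS = {
    "sleep": ["subject"],                                      # valency 1
    "like":  ["subject", "direct object"],                     # valency 2
    "give":  ["subject", "direct object", "indirect object"],  # valency 3
}

def valency(verb):
    """All arguments, including the subject."""
    return len(ARGUMENTS[verb])

def subcategorized_arguments(verb):
    """Object arguments only, excluding the subject."""
    return len(ARGUMENTS[verb]) - 1

assert valency("give") == 3 and subcategorized_arguments("give") == 2
```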
In grammar, a complement is a word, phrase, or clause that is necessary to complete the meaning of a given expression. Complements are often also arguments.
Syntactic Structures is an influential work in linguistics by American linguist Noam Chomsky, originally published in 1957. It is an elaboration of his teacher Zellig Harris's model of transformational generative grammar. A short monograph of about a hundred pages, Chomsky's presentation is recognized as one of the most significant studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning. Thus, Chomsky argued for the independence of syntax from semantics.
The term predicate is used in one of two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other views it as just the main content verb or associated predicative expression of a clause. Thus, by the first definition the predicate of the sentence Frank likes cake is likes cake. By the second definition, the predicate of the same sentence is just the content verb likes, whereby Frank and cake are the arguments of this predicate. Differences between these two definitions can lead to confusion.
In linguistics, raising constructions involve the movement of an argument from an embedded or subordinate clause to a matrix or main clause; in other words, a raising predicate/verb appears with a syntactic argument that is not its semantic argument, but is rather the semantic argument of an embedded predicate. For example, in they seem to be trying, the predicand of trying is the subject of seem. Although English has raising constructions, not all languages do.
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Lucien Tesnière (1959).
In linguistics, a small clause consists of a subject and its predicate, but lacks an overt expression of tense. Small clauses have the semantic subject-predicate characteristics of a clause, and have some, but not all, the properties of a constituent. Structural analyses of small clauses vary according to whether a flat or layered analysis is pursued. The small clause is related to the phenomena of raising-to-object, exceptional case-marking, accusativus cum infinitivo, and object control.
The Cambridge Grammar of the English Language (CamGEL) is a descriptive grammar of the English language. Its primary authors are Rodney Huddleston and Geoffrey K. Pullum. Huddleston was the only author to work on every chapter. It was published by Cambridge University Press in 2002 and has been cited more than 8,000 times.
Syntactic movement is the means by which some theories of syntax address discontinuities. Movement was first postulated by structuralist linguists who expressed it in terms of discontinuous constituents or displacement. Some constituents appear to have been displaced from the position in which they receive important features of interpretation. The concept of movement is controversial and is associated with so-called transformational or derivational theories of syntax. Representational theories, in contrast, reject the notion of movement and often instead address discontinuities with other mechanisms including graph reentrancies, feature passing, and type shifters.
In linguistics, arc pair grammar (APG) is a theory of syntax that aims to formalize and expand upon relational grammar. It primarily builds upon the relational grammar concept of an arc, but also makes use of more formally stated ideas from model theory and graph theory. It was developed in the late 1970s by David E. Johnson and Paul Postal, and formalized in 1980 in the eponymous book Arc Pair Grammar.
Model-theoretic grammars, also known as constraint-based grammars, contrast with generative grammars in the way they define sets of sentences: they state constraints on syntactic structure rather than providing operations for generating syntactic objects. A generative grammar provides a set of operations such as rewriting, insertion, deletion, movement, or combination, and is interpreted as a definition of the set of all and only the objects that these operations are capable of producing through iterative application. A model-theoretic grammar simply states a set of conditions that an object must meet, and can be regarded as defining the set of all and only the structures of a certain sort that satisfy all of the constraints. The approach applies the mathematical techniques of model theory to the task of syntactic description: a grammar is a theory in the logician's sense and the well-formed structures are the models that satisfy the theory.
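The contrast can be sketched with a toy grammar (entirely hypothetical): a generative grammar enumerates sentences by iteratively applying rewrite operations, while a model-theoretic grammar checks whether a given structure satisfies every stated constraint.

```python
# Generative style: rewrite rules applied iteratively generate
# exactly the strings they can derive.
RULES = {"S": [["NP", "VP"]], "NP": [["John"], ["Mary"]], "VP": [["sleeps"]]}

def generate(symbol="S"):
    """Enumerate all word sequences derivable from `symbol`."""
    if symbol not in RULES:            # terminal symbol
        yield [symbol]
        return
    for expansion in RULES[symbol]:
        yield from _combine([list(generate(s)) for s in expansion])

def _combine(parts):
    """Cartesian concatenation of alternative sub-derivations."""
    if not parts:
        yield []
        return
    for head in parts[0]:
        for tail in _combine(parts[1:]):
            yield head + tail

# Model-theoretic style: the grammar is a set of constraints; the
# well-formed structures are exactly those satisfying all of them.
CONSTRAINTS = [
    lambda s: len(s) == 2,               # a subject and a predicate
    lambda s: s[0] in {"John", "Mary"},  # admissible subjects
    lambda s: s[1] == "sleeps",          # admissible predicate
]

def well_formed(sentence):
    return all(constraint(sentence) for constraint in CONSTRAINTS)

assert sorted(generate()) == [["John", "sleeps"], ["Mary", "sleeps"]]
assert well_formed(["Mary", "sleeps"])
```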
In semantics, a predicand is an argument in an utterance, specifically that of which something is predicated. By extension, in syntax, it is the constituent in a clause typically functioning as the subject.