A sentence diagram is a pictorial representation of the grammatical structure of a sentence. The term is used mainly in the teaching of written language, where sentences are diagrammed. The model shows the relations between words and the nature of sentence structure, and can be used as a tool to help recognize which potential sentences are actual sentences.
The Reed–Kellogg system was developed by Alonzo Reed and Brainerd Kellogg for teaching grammar to students through visualization.[1] It lost some support in the US in the 1970s but has spread to Europe.[2] It is considered "traditional" in comparison with the parse trees of academic linguists.[3]
Simple sentences in the Reed–Kellogg system are diagrammed according to the following conventions:
The diagram of a simple sentence begins with a horizontal line called the baseline. The subject is written on the left, the predicate on the right, separated by a vertical bar that extends through the baseline. The predicate must contain a verb, and the verb either requires other sentence elements to complete the predicate, permits them to do so, or precludes them from doing so. The verb and its object, when present, are separated by a line that ends at the baseline. If the object is a direct object, the line is vertical. If the object is a predicate noun or adjective, the line looks like a backslash, \, sloping toward the subject.
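The layout can be approximated in plain text. The following minimal Python sketch (the baseline helper is purely illustrative, not part of any diagramming tool) prints the base of a simple transitive sentence; note that plain text cannot show which divider crosses the baseline and which stops at it.

```python
def baseline(subject: str, verb: str, obj: str = "") -> str:
    """Render the baseline of a Reed-Kellogg diagram as plain text.

    In a drawn diagram the first divider (subject | predicate) crosses
    the baseline and the second (verb | direct object) stops at it;
    plain text renders both as '|'. A predicate noun or adjective
    would instead be set off by a backslash sloping toward the subject.
    """
    parts = [subject, verb] + ([obj] if obj else [])
    return "  |  ".join(parts)

print(baseline("dogs", "chase", "cats"))  # dogs  |  chase  |  cats
```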
Modifiers of the subject, predicate, or object are placed below the baseline:
Modifiers, such as adjectives (including articles) and adverbs, are placed on slanted lines below the word they modify. Prepositional phrases are also placed beneath the word they modify; the preposition goes on a slanted line and the slanted line leads to a horizontal line on which the object of the preposition is placed.
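As a rough extension of the same plain-text idea (again an illustrative sketch, not a standard rendering), the snippet below offsets each modifier under the baseline word it modifies, with a backslash standing in for the slanted line:

```python
def with_modifiers(base_words, modifiers):
    """Print a baseline, then each modifier below the word it modifies.

    base_words: words on the baseline, in order.
    modifiers:  dict mapping a baseline word to its modifiers
                (articles, adjectives, adverbs).
    """
    print("  |  ".join(base_words))
    col = 0
    for word in base_words:
        for m in modifiers.get(word, []):
            print(" " * col + "\\ " + m)  # '\' suggests the slanted line
        col += len(word) + 5              # advance past "word  |  "

with_modifiers(["dog", "barked"],
               {"dog": ["the", "big"], "barked": ["loudly"]})
# dog  |  barked
# \ the
# \ big
#         \ loudly
```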
These basic diagramming conventions are augmented for other types of sentence structures, e.g. for coordination and subordinate clauses.
Reed–Kellogg diagrams reflect, to some degree, the concepts underlying modern parse trees: the constituency relation of phrase structure grammars and the dependency relation of dependency grammars. The two relations are contrasted below, where D means Determiner, N means Noun, NP means Noun Phrase, S means Sentence, V means Verb, VP means Verb Phrase, and IP means Inflectional Phrase.
Constituency is a one-to-one-or-more relation: every word in the sentence corresponds to one or more nodes in the tree diagram. Dependency, in contrast, is a one-to-one relation: every word in the sentence corresponds to exactly one node in the tree diagram. Both kinds of parse tree employ the convention whereby category acronyms (e.g. N, NP, V, VP) label the nodes in the tree. Because constituency is one-to-one-or-more, it can multiply the amount of structure assigned to a sentence far beyond the number of words, producing very "tall" trees, such as those associated with X-bar theory. Both constituency-based and dependency-based theories of grammar have established traditions.[4][5]
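The difference can be made concrete with a small sketch. Encoding the sentence "the dog barked" in plain Python (tuples for a constituency tree, a word-to-head dict for a dependency structure; the encoding itself is an illustrative assumption), the constituency tree contains more nodes than words, while the dependency structure contains exactly one node per word:

```python
# Constituency: one-to-one-or-more -- phrasal nodes (S, NP, VP) sit
# above the words.
constituency = ("S",
                ("NP", ("D", "the"), ("N", "dog")),
                ("VP", ("V", "barked")))

# Dependency: one-to-one -- each word points to its head; the finite
# verb is the root (head None).
dependency = {"the": "dog", "dog": "barked", "barked": None}

def count_nodes(tree):
    """Count every node, phrasal labels and words alike."""
    if isinstance(tree, str):
        return 1
    return 1 + sum(count_nodes(child) for child in tree[1:])

print(count_nodes(constituency))  # 9 nodes for a 3-word sentence
print(len(dependency))            # 3 nodes, one per word
```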
Reed–Kellogg diagrams employ both of these modern tree-generating relations. The constituency relation is present insofar as subject, verb, object, and/or predicate are placed equi-level on the horizontal baseline of the sentence and divided by a vertical or slanted line. The vertical dividing line that crosses the baseline corresponds to the binary division in the constituency-based tree (S → NP + VP), and the second vertical dividing line that does not cross the baseline (between verb and object) corresponds to the binary division of VP into verb and direct object (VP → V + NP). Thus the vertical and slanting lines that cross or rest on the baseline correspond to the constituency relation. The dependency relation, in contrast, is present insofar as modifiers dangle off of, or appear below, the words that they modify.
A sentence may also be broken down by functional parts: subject, object, adverbial, and verb (predicator).[6] The subject is the performer of the action, the verb represents the action, the object is the recipient of the action, and the adverbial qualifies the action. Each of these parts can be a phrase rather than an individual word, as in the sketch below.
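For instance, the functional breakdown of "Yesterday the committee approved the plan" might be recorded as follows (an illustrative analysis, not drawn from a particular textbook); note that the subject and object here are multi-word phrases:

```python
# Functional parts of "Yesterday the committee approved the plan".
functions = {
    "adverbial": "Yesterday",
    "subject":   "the committee",
    "verb":      "approved",
    "object":    "the plan",
}
```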
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
In grammar, a phrase (called an expression in some contexts) is a group of words, or a single word, acting as a grammatical unit. For instance, the English expression "the very happy squirrel" is a noun phrase which contains the adjective phrase "very happy". Phrases can consist of a single word or a complete sentence. In theoretical linguistics, phrases are often analyzed as units of syntactic structure such as constituents. There is a difference between the common use of the term phrase and its technical use in linguistics. In common usage, a phrase is usually a group of words with some special idiomatic meaning or other significance, such as "all rights reserved", "economical with the truth", "kick the bucket", and the like. It may be a euphemism, a saying or proverb, a fixed expression, a figure of speech, etc. In linguistics, these are known as phrasemes.
Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.
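A toy grammar makes the rewrite mechanism concrete. The sketch below uses a hypothetical six-rule fragment (not Chomsky's original rules) and rewrites the start symbol S top-down until only words remain:

```python
import random

# Each category rewrites to one of the listed right-hand sides.
rules = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
    "D":  [["the"]],
    "N":  [["dog"], ["bone"]],
    "V":  [["ate"]],
}

def expand(symbol):
    """Apply rewrite rules until only terminal words remain."""
    if symbol not in rules:  # terminal word: stop rewriting
        return [symbol]
    words = []
    for part in random.choice(rules[symbol]):
        words.extend(expand(part))
    return words

print(" ".join(expand("S")))  # e.g. "the dog ate the bone"
```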
A parse tree (also parsing tree, derivation tree, or concrete syntax tree) is an ordered, rooted tree that represents the syntactic structure of a string according to some context-free grammar. The term parse tree itself is used primarily in computational linguistics; in theoretical syntax, the term syntax tree is more common.
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
In linguistics, a verb phrase (VP) is a syntactic unit composed of a verb and its arguments except the subject of an independent clause or coordinate clause. Thus, in the sentence A fat man quickly put the money into the box, the words quickly put the money into the box constitute a verb phrase; it consists of the verb put and its arguments, but not the subject a fat man. A verb phrase is similar to what is considered a predicate in traditional grammars.
Lexical functional grammar (LFG) is a constraint-based grammar framework in theoretical linguistics. It posits two separate levels of syntactic structure, a phrase structure grammar representation of word order and constituency, and a representation of grammatical functions such as subject and object, similar to dependency grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the theory of transformational grammar which was current in the late 1970s. It mainly focuses on syntax, including its relation with morphology and semantics. There has been little LFG work on phonology.
In linguistics, branching refers to the shape of the parse trees that represent the structure of sentences. Assuming that the language is being written or transcribed from left to right, parse trees that grow down and to the right are right-branching, and parse trees that grow down and to the left are left-branching. The direction of branching reflects the position of heads in phrases: right-branching structures are head-initial, whereas left-branching structures are head-final. English has both right-branching (head-initial) and left-branching (head-final) structures, although it is more right-branching than left-branching. Some languages, such as Japanese and Turkish, are almost fully left-branching (head-final), while others are mostly right-branching (head-initial).
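Branching direction can be pictured as nesting direction. In the sketch below (the tuple encoding is an illustrative assumption), a right-branching English verb chain nests in the second position of each pair, while a left-branching possessive chain nests in the first:

```python
# Right-branching (head-initial): each head precedes its complement,
# so nesting grows down and to the right, as in "must have been eating".
right_branching = ("must", ("have", ("been", "eating")))

# Left-branching (head-final): dependents precede their head, so nesting
# grows down and to the left, as in "my sister's friend's dog".
left_branching = ((("my", "sister's"), "friend's"), "dog")
```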
Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that while it can identify phrases it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech or Warlpiri.
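A dependency structure can be sketched as a word-to-head mapping, and a phrase then recovered as a word together with all of its direct and indirect dependents. The encoding below is an illustrative assumption (real treebanks index word positions rather than strings, which avoids collisions between repeated words):

```python
# Head links for "put the money into a box"; the finite verb is the root.
heads = {"put": None, "the": "money", "money": "put",
         "into": "put", "a": "box", "box": "into"}

def phrase(word):
    """Return the word plus all of its direct and indirect dependents."""
    members = {word}
    changed = True
    while changed:
        changed = False
        for w, h in heads.items():
            if h in members and w not in members:
                members.add(w)
                changed = True
    return members

print(sorted(phrase("into")))   # ['a', 'box', 'into'] -- the prepositional phrase
print(sorted(phrase("money")))  # ['money', 'the']     -- the object noun phrase
```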
The term phrase structure grammar was originally introduced by Noam Chomsky as the term for grammar studied previously by Emil Post and Axel Thue. Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.
In generative grammar, non-configurational languages are languages characterized by a flat phrase structure, which allows syntactically discontinuous expressions, and a relatively free word order.
The term predicate is used in two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other defines it as only the main content verb or associated predicative expression of a clause. Thus, by the first definition, the predicate of the sentence Frank likes cake is likes cake, while by the second definition, it is only the content verb likes, and Frank and cake are the arguments of this predicate. The conflict between these two definitions can lead to confusion.
In theoretical linguistics, a distinction is made between endocentric and exocentric constructions. A grammatical construction is said to be endocentric if it fulfils the same linguistic function as one of its parts, and exocentric if it does not. The distinction reaches back at least to Bloomfield's work of the 1930s; Bloomfield based it on terms from Pāṇini and Patañjali in Sanskrit grammar. Such a distinction is possible only in phrase structure grammars, since in dependency grammars all constructions are necessarily endocentric.
In linguistics, control is a construction in which the understood subject of a given predicate is determined by some expression in context. Stereotypical instances of control involve verbs. A superordinate verb "controls" the arguments of a subordinate, nonfinite verb. Control was intensively studied in the government and binding framework in the 1980s, and much of the terminology from that era is still used today. In the days of Transformational Grammar, control phenomena were discussed in terms of Equi-NP deletion. Control is often analyzed in terms of a null pronoun called PRO. Control is also related to raising, although there are important differences between control and raising.
In linguistics, a small clause consists of a subject and its predicate, but lacks an overt expression of tense. Small clauses have the semantic subject-predicate characteristics of a clause, and have some, but not all, properties of a constituent. Structural analyses of small clauses vary according to whether a flat or layered analysis is pursued. The small clause is related to the phenomena of raising-to-object, exceptional case-marking, accusativus cum infinitivo, and object control.
In linguistics, immediate constituent analysis or IC analysis is a method of sentence analysis that was proposed by Wilhelm Wundt and named by Leonard Bloomfield. The process reached a full-blown strategy for analyzing sentence structure in the distributionalist works of Zellig Harris and Charles F. Hockett, and in glossematics by Knud Togeby. The practice is now widespread. Most tree structures employed to represent the syntactic structure of sentences are products of some form of IC-analysis. The process and result of IC-analysis can, however, vary greatly based upon whether one chooses the constituency relation of phrase structure grammars or the dependency relation of dependency grammars as the underlying principle that organizes constituents into hierarchical structures.
In linguistics, a catena is a unit of syntax and morphology, closely associated with dependency grammars. It is a more flexible and inclusive unit than the constituent and its proponents therefore consider it to be better suited than the constituent to serve as the fundamental unit of syntactic and morphosyntactic analysis.
This article describes the syntax of clauses in the English language, chiefly in Modern English. A clause is often said to be the smallest grammatical unit that can express a complete proposition. But this semantic idea of a clause leaves out much of English clause syntax. For example, clauses can be questions, but questions are not propositions. A syntactic description of an English clause is that it consists of a subject and a verb. But this too fails, as a clause need not have a subject, as with the imperative, and, in many theories, an English clause may be verbless. The idea of what qualifies as a clause varies between theories and has changed over time.
Subject–verb inversion in English is a type of inversion marked by a predicate verb that precedes a corresponding subject, e.g., "Beside the bed stood a lamp". Subject–verb inversion is distinct from subject–auxiliary inversion because the verb involved is not an auxiliary verb.
In semantics, a predicand is an argument in an utterance, specifically that of which something is predicated. By extension, in syntax, it is the constituent in a clause typically functioning as the subject.