Verb phrase

In linguistics, a verb phrase (VP) is a syntactic unit composed of a verb and its arguments except the subject of an independent clause or coordinate clause. Thus, in the sentence A fat man quickly put the money into the box, the words quickly put the money into the box constitute a verb phrase; it consists of the verb put and its arguments, but not the subject a fat man. A verb phrase is similar to what is considered a predicate in traditional grammars.

Verb phrases are generally divided into two types: finite, in which the head of the phrase is a finite verb, and nonfinite, in which the head is a nonfinite verb such as an infinitive, participle, or gerund. Phrase structure grammars acknowledge both types, but dependency grammars treat the subject as just another verbal dependent and do not recognize the finite verb phrase constituent. How a given verb phrase is analyzed therefore depends on which theory of syntax is assumed.

In phrase structure grammars

In phrase structure grammars such as generative grammar, the verb phrase is one headed by a verb. It may be composed of only a single verb, but typically it consists of combinations of main and auxiliary verbs, plus optional specifiers, complements (not including subject complements), and adjuncts. For example:

Yankee batters hit the ball well enough to win their first World Series since 2000.
Mary saw the man through the window.
David gave Mary a book.

The first example contains the long verb phrase hit the ball well enough to win their first World Series since 2000; the second is a verb phrase composed of the main verb saw, the complement phrase the man (a noun phrase), and the adjunct phrase through the window (an adverbial phrase and prepositional phrase). The third example presents three elements, the main verb gave, the noun Mary, and the noun phrase a book, which together make up the verb phrase. Note that the verb phrase described here corresponds to the predicate of traditional grammar.
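
The structure this analysis assigns can be made concrete with a small, informal sketch. The following Python snippet is only an illustration under simplified assumptions (flat NP and PP labels, no functional projections); it encodes the second example as nested (label, children...) tuples and shows that the VP node groups the verb with its complement and adjunct while excluding the subject:

    # Simplified constituency analysis of "Mary saw the man through the window".
    # Each node is (label, child, child, ...); leaves are plain word strings.
    sentence = (
        "S",
        ("NP", "Mary"),                                   # subject, outside the VP
        ("VP",
            ("V", "saw"),                                 # head verb
            ("NP", "the", "man"),                         # complement
            ("PP", "through", ("NP", "the", "window"))),  # adjunct
    )

    def words(node):
        """Collect the words dominated by a node."""
        if isinstance(node, str):
            return [node]
        return [w for child in node[1:] for w in words(child)]

    vp = sentence[2]
    print(words(vp))  # ['saw', 'the', 'man', 'through', 'the', 'window']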

Current views vary on whether all languages have a verb phrase; some schools of generative grammar (such as principles and parameters) hold that all languages have a verb phrase, while others (such as lexical functional grammar) take the view that at least some languages lack a verb phrase constituent, including languages with a very free word order (the so-called non-configurational languages, such as Japanese, Hungarian, or Australian Aboriginal languages) and some languages with a default VSO order (several Celtic and Oceanic languages).

Phrase structure grammars view both finite and nonfinite verb phrases as constituent phrases and, consequently, do not draw any key distinction between them. Dependency grammars (described below) are much different in this regard.

In dependency grammars

While phrase structure grammars (constituency grammars) acknowledge both finite and non-finite VPs as constituents (complete subtrees), dependency grammars reject the former. That is, dependency grammars acknowledge only non-finite VPs as constituents; finite VPs do not qualify as constituents in dependency grammars. For example:

John [has finished the work]. – Finite VP in brackets
John has [finished the work]. – Non-finite VP in brackets

Since has finished the work contains the finite verb has, it is a finite VP, and since finished the work contains the non-finite verb finished but lacks a finite verb, it is a non-finite VP. Similar examples:

They [do not want to try that]. – Finite VP in brackets
They do not [want to try that]. – One non-finite VP in brackets
They do not want [to try that]. – Another non-finite VP in brackets

These examples illustrate that many clauses can contain more than one non-finite VP, but they generally contain only one finite VP. Starting with Lucien Tesnière (1959),[1] dependency grammars challenge the validity of the initial binary division of the clause into subject (NP) and predicate (VP), which means they reject the notion that the second half of this binary division, i.e. the finite VP, is a constituent. They do, however, readily acknowledge the existence of non-finite VPs as constituents. The two competing views of verb phrases are visible in the following trees:

[Figure: constituency tree (left) and dependency tree (right) of the sentence John has finished the work]

The constituency tree on the left shows the finite VP has finished the work as a constituent, since it corresponds to a complete subtree. The dependency tree on the right, in contrast, does not acknowledge a finite VP constituent, since there is no complete subtree there that corresponds to has finished the work. Note that the analyses agree concerning the non-finite VP finished the work; both see it as a constituent (complete subtree).
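
The difference can also be stated procedurally. The sketch below is only an illustration, assuming a conventional dependency analysis in which has governs John and finished, finished governs work, and work governs the; a constituent is then the complete subtree rooted at some word. Under that assumption, finished the work comes out as a constituent, but has finished the work does not:

    # Assumed dependency analysis of "John has finished the work":
    #   has -> John, finished;  finished -> work;  work -> the
    dependents = {
        "has": ["John", "finished"],
        "finished": ["work"],
        "work": ["the"],
        "John": [],
        "the": [],
    }

    def subtree(word):
        """All words in the complete subtree rooted at word."""
        result = {word}
        for d in dependents[word]:
            result |= subtree(d)
        return result

    print(sorted(subtree("finished")))  # ['finished', 'the', 'work'] -> the non-finite VP is a complete subtree
    print(sorted(subtree("has")))       # ['John', 'finished', 'has', 'the', 'work'] -> includes the subject,
                                        # so no subtree corresponds to just "has finished the work"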

Dependency grammars point to the results of many standard constituency tests to back up their stance.[2] For instance, topicalization, pseudoclefting, and answer ellipsis suggest that the non-finite VP exists as a constituent, whereas the finite VP does not:

*...and has finished the work, John. – Topicalization
*What John has done is has finished the work. – Pseudoclefting
What has John done? – *Has finished the work. – Answer ellipsis

The asterisk (*) indicates that the sentence is ungrammatical. These results should be compared with those for the non-finite VP:

...and finished the work, John (certainly) has. – Topicalization
What John has done is finished the work. – Pseudoclefting
What has John done? – Finished the work. – Answer ellipsis

In the first set of examples, the tests target the finite VP has finished the work; in the second set, they target the non-finite VP finished the work. Attempts to isolate the finite VP fail, whereas the same attempts with the non-finite VP succeed.[3]

Narrowly defined

Verb phrases are sometimes defined more narrowly, so that they count only those elements considered strictly verbal. The definition is then limited to main and auxiliary verbs, plus infinitive or participle constructions.[4] For example, in the following sentences only the bracketed words form the verb phrase:

John [has given] Mary a book.
The picnickers [were being eaten] alive by mosquitos.
She [kept screaming] like a football maniac.
Thou [shalt] not [kill].

This narrower definition is often applied in functionalist frameworks and in traditional European reference grammars. It is incompatible with the phrase structure model, because the bracketed strings are not constituents under that analysis. It is, however, compatible with dependency grammars and other grammars that view the verb catena (verb chain), rather than the constituent, as the fundamental unit of syntactic structure.
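
The catena-based view lends itself to a similar informal check. The sketch below is only illustrative and assumes the dependency analysis has -> John, given; given -> Mary, book; book -> a for the first example above; it treats a set of words as a catena if they form a connected piece of the dependency tree, which the verb combination has given does, even though it is not a constituent:

    # Assumed dependency analysis of "John has given Mary a book":
    # each word is mapped to its head (the root "has" has no entry).
    head = {"John": "has", "given": "has", "Mary": "given", "book": "given", "a": "book"}

    def is_catena(words):
        """True if the words form a connected chain in the dependency tree,
        i.e. exactly one word in the set has its head outside the set."""
        outside = [w for w in words if head.get(w) not in words]
        return len(outside) == 1

    print(is_catena({"has", "given"}))  # True  -> a catena, though not a constituent
    print(is_catena({"has", "Mary"}))   # False -> not connected, so not a catena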

Notes

  1. Concerning Tesnière's rejection of a finite VP constituent, see Tesnière (1959:103–105).
  2. For a discussion of the evidence for and against a finite VP constituent, see Matthews (2007:17ff.), Miller (2011:54ff.), and Osborne et al. (2011:323f.).
  3. Attempts to motivate the existence of a finite VP constituent tend to confuse the distinction between finite and non-finite VPs. They mistakenly take evidence for a non-finite VP constituent as support for the existence of a finite VP constituent. See for instance Akmajian and Heny (1980:29f., 257ff.), Finch (2000:112), van Valin (2001:111ff.), Kroeger (2004:32ff.), and Sobin (2011:30ff.).
  4. Klammer and Schulz (1996:157ff.), for instance, pursue this narrow understanding of verb phrases.
