Adverbial complement

An adverbial complement is an adverbial that is required to complete the meaning of a verb: removing it yields either an ungrammatical sentence or an intrinsically different meaning of the verb. Adverbial complements stand in contrast to adverbial adjuncts, which can be removed from a sentence without altering its structure or meaning.[1]

In grammar, an adverbial is a word or a group of words that modifies or more closely defines the verb or the sentence as a whole.

In linguistics, an adjunct is an optional, or structurally dispensable, part of a sentence, clause, or phrase that, if removed or discarded, will not otherwise affect the remainder of the sentence. Example: In the sentence John helped Bill in Central Park, the phrase in Central Park is an adjunct.

Adverbial complements often accompany verbs of caused motion such as put or place:

She put the book on the shelf. (Removing on the shelf yields the ungrammatical *She put the book.)

However, they can occur with other types of verbs as well:

The children behaved badly. (Removing badly leaves a grammatical sentence, The children behaved, but shifts the verb to the distinct sense "behaved well".)
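The removal diagnostic can be sketched as a toy classifier. The function name and the grammaticality judgments fed into it are assumptions made for illustration; no parser or real grammaticality test is involved:

```python
# Toy sketch of the removal test for adverbials (illustrative only):
# an adverbial counts as a complement if dropping it ruins the sentence
# or changes the verb's sense, and as an adjunct if the sentence survives.
def classify_adverbial(ok_with, ok_without, same_verb_sense=True):
    """Classify an adverbial from two grammaticality judgments."""
    if ok_with and (not ok_without or not same_verb_sense):
        return "complement"
    if ok_with and ok_without:
        return "adjunct"
    return "unclassified"

# "She put the book on the shelf." vs. "*She put the book."
print(classify_adverbial(ok_with=True, ok_without=False))  # complement
# "John helped Bill in Central Park." vs. "John helped Bill."
print(classify_adverbial(ok_with=True, ok_without=True))   # adjunct
```

The third parameter captures cases like behave, where the reduced sentence stays grammatical but the verb's sense shifts, which the definition above also counts as complement behavior.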

Theoretical approaches

Head-driven phrase structure grammar describes adverbial complements as part of the verb's subcategorization frame, which is why they are obligatory arguments. In this theory, adverbial complements are stored in the lexicon as part of the grammatical competence relating to the verb.

Head-driven phrase structure grammar (HPSG) is a highly lexicalized, constraint-based grammar developed by Carl Pollard and Ivan Sag. It is a type of phrase structure grammar, as opposed to a dependency grammar, and it is the immediate successor to generalized phrase structure grammar. HPSG draws from other fields such as computer science and uses Ferdinand de Saussure's notion of the sign. It uses a uniform formalism and is organized in a modular way which makes it attractive for natural language processing.
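As an informal illustration of the subcategorization idea (the category labels and dictionary layout below are assumptions for the sketch, not HPSG's actual typed feature structures), the complement can be stored in the verb's lexical entry like any other argument:

```python
# Sketch: each lexical entry lists the argument categories it must combine
# with; the locative PP of "put" sits in the frame alongside the NPs.
LEXICON = {
    "put": ["NP", "NP", "PP[loc]"],  # subject, object, obligatory locative
    "see": ["NP", "NP"],             # subject, object; any PP is an adjunct
}

def saturates(verb, arg_categories):
    """True if the supplied categories exactly saturate the verb's frame."""
    return LEXICON.get(verb) == list(arg_categories)

# "She put the book on the shelf." -> frame saturated
print(saturates("put", ["NP", "NP", "PP[loc]"]))  # True
# "*She put the book." -> the adverbial complement is missing
print(saturates("put", ["NP", "NP"]))             # False
```

Because the frame is verb-specific lexical knowledge, dropping the locative PP fails for put but not for see, which matches the complement/adjunct contrast described above.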

An alternative description, along the lines of construction grammar, is that adverbial complements are parts of certain argument structure constructions – in this case, the caused motion construction – which are specifically compatible with the semantics of the verb. Under this view, adverbial complements are stored in the grammar as part of the caused motion construction, which is a sign in its own right.

In linguistics, construction grammar groups a number of models of grammar that all subscribe to the idea that knowledge of a language is based on a collection of "form and function pairings". The "function" side covers what is commonly understood as meaning, content, or intent; it usually extends over both conventional fields of semantics and pragmatics.

Another construction-based theory combines the two, arguing that certain senses of verbs co-occur so frequently with certain argument structure constructions that the argument structures are also stored as part of the grammatical competence relating to the verb. These small argument structure constructions are called mini-constructions. In the case of put, then, adverbial complements are both part of the argument structure construction and stored as information about the verb itself.
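A mini-construction of this kind can be sketched as a stored pairing of a verb sense with an argument structure construction. The sense labels and slot names below are invented for the illustration:

```python
# Sketch: a mini-construction pairs a verb sense with the argument
# structure construction it habitually occurs in, so the caused-motion
# frame is recorded both as a construction and as verb-level knowledge.
CAUSED_MOTION = ("Subj", "V", "Obj", "Path")

MINI_CONSTRUCTIONS = {
    ("put", "cause-to-be-somewhere"): CAUSED_MOTION,
    ("sneeze", "expel-air"): ("Subj", "V"),
}

def licensed(verb, sense, slots):
    """True if the clause's slot sequence matches a mini-construction
    stored for this verb sense."""
    return MINI_CONSTRUCTIONS.get((verb, sense)) == tuple(slots)

# "She put the vase on the table." instantiates the caused-motion frame
print(licensed("put", "cause-to-be-somewhere", ["Subj", "V", "Obj", "Path"]))  # True
# Without the Path slot the stored pairing is not matched
print(licensed("put", "cause-to-be-somewhere", ["Subj", "V", "Obj"]))          # False
```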

In linguistics, a grammatical construction is any syntactic string of words ranging from sentences over phrasal structures to certain complex lexemes, such as phrasal verbs.

Related Research Articles

Phrase

In everyday speech, a phrase may be any group of words, often carrying a special idiomatic meaning; in this sense it is synonymous with expression. In linguistic analysis, a phrase is a group of words that functions as a constituent in the syntax of a sentence, a single unit within a grammatical hierarchy. A phrase typically appears within a clause, but it is also possible for a phrase to be a clause or to contain a clause within it. There are also named phrase types, such as the noun phrase and the prepositional phrase.

A noun phrase or nominal phrase is a phrase that has a noun as its head or shows the same grammatical function as such a phrase. Noun phrases are very common cross-linguistically, and they may be the most frequently occurring phrase type.

Clause

In language, a clause is the smallest grammatical unit that can express a complete proposition. A typical clause consists of a subject and a predicate, the latter typically a verb phrase: a verb together with any objects and other modifiers. However, the subject is sometimes left unexpressed; this is often the case in null-subject languages when the subject is retrievable from context, but it also occurs in other languages, including English.

Preposition and postposition

Prepositions and postpositions, together called adpositions, are a class of words used to express spatial or temporal relations or mark various semantic roles.

Lexical semantics is a subfield of linguistic semantics. The units of analysis in lexical semantics are lexical units, which include not only words but also sub-words or sub-units such as affixes, and even compound words and phrases. Lexical units make up the catalogue of words in a language, the lexicon. Lexical semantics looks at how the meaning of lexical units correlates with the structure of the language, or syntax; this is referred to as the syntax–semantics interface.

In grammar, a modifier is an optional element in phrase structure or clause structure. A modifier is so called because it is said to modify another element in the structure, on which it is dependent. Typically the modifier can be removed without affecting the grammar of the sentence. For example, in the English sentence This is a red ball, the adjective red is a modifier, modifying the noun ball. Removal of the modifier would leave This is a ball, which is grammatically correct and equivalent in structure to the original sentence.

In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.

In linguistics, valency or valence is the number of arguments controlled by a predicate, content verbs being typical predicates. Valency is related, though not identical, to subcategorization and transitivity, which count only object arguments – valency counts all arguments, including the subject. The linguistic meaning of valency derives from the definition of valency in chemistry. The valency metaphor appeared first in linguistics in Charles Sanders Peirce's essay The logic of relatives in 1897, and it then surfaced in the works of a number of linguists decades later in the late 1940s and 1950s. Lucien Tesnière is credited most with having established the valency concept in linguistics.
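The difference between valency and object-only counts such as transitivity can be sketched with a few illustrative frames (the verb entries below are assumptions made for the example):

```python
# Sketch: valency counts every argument, subject included, whereas
# subcategorization/transitivity-style counts exclude the subject.
FRAMES = {
    "sleep": ["subject"],                               # monovalent
    "see":   ["subject", "object"],                     # divalent
    "give":  ["subject", "object", "indirect object"],  # trivalent
}

def valency(verb):
    """Number of all arguments the predicate controls."""
    return len(FRAMES[verb])

def object_count(verb):
    """Transitivity-style count: object arguments only."""
    return len([a for a in FRAMES[verb] if a != "subject"])

print(valency("give"), object_count("give"))  # 3 2
print(valency("see"), object_count("see"))    # 2 1
```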

In grammar, a complement is a word, phrase or clause that is necessary to complete the meaning of a given expression. Complements are often also arguments.

There are two competing notions of the predicate, generating confusion concerning the use of the term predicate in general. The first comes from traditional grammar, which tends to view a predicate as one of the two main parts of a sentence, the other part being the subject; the purpose of the predicate is to complete an idea about the subject, such as what it does or what it is like. For instance, in a sentence such as Frank likes cake, the subject is Frank and the predicate is likes cake. The second notion, influenced by predicate calculus, takes the predicate to be the main verb and any auxiliaries, an expression completed by its arguments; in this sense the predicate of Frank likes cake is likes, with Frank and cake as its arguments.

An adpositional phrase, in linguistics, is a syntactic category that includes prepositional phrases, postpositional phrases, and circumpositional phrases. Adpositional phrases contain an adposition as head and usually a complement such as a noun phrase. Language syntax treats adpositional phrases as units that act as arguments or adjuncts. Prepositional and postpositional phrases differ by the order of the words used. Languages that are primarily head-initial such as English predominantly use prepositional phrases whereas head-final languages predominantly employ postpositional phrases. Many languages have both types, as well as circumpositional phrases.

In linguistics, an adverbial phrase ("AdvP") is a multi-word expression operating adverbially: its syntactic function is to modify other expressions, including verbs, adjectives, adverbs, adverbials, and sentences. Adverbial phrases can be divided into two types: complement adverbs and modifier adverbs. For example, in the sentence She sang very well, the expression very well is an adverbial phrase, as it modifies the verb sang. More specifically, the adverbial phrase very well contains two adverbs, very and well: while well modifies the verb to convey information about the manner of singing, very is a degree modifier that conveys information about the degree to which the action of singing well was accomplished.

In linguistics, dative shift is a pattern in which the subcategorization of a verb can take on two alternating forms. In the oblique dative (OD) form, the verb takes a noun phrase (NP) and a dative prepositional phrase (PP), the second of which is not a core argument.

In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Tesnière (1959).

In linguistics, in the study of syntax, an empty category is an element that does not have any phonological content and is therefore unpronounced. Empty categories may also be referred to as covert categories, in contrast to overt categories which are pronounced. When representing empty categories in tree structures, linguists use a null symbol to depict the idea that there is a mental category at the level being represented, even if the word(s) are being left out of overt speech. The phenomenon was named and outlined by Noam Chomsky in his 1981 LGB framework, and serves to address apparent violations of locality of selection — there are different types of empty categories that each appear to account for locality violations in different environments.

In certain theories of linguistics, thematic relations, also known as semantic roles, are the various roles that a noun phrase may play with respect to the action or state described by a governing verb, commonly the sentence's main verb. For example, in the sentence "Susan ate an apple", Susan is the doer of the eating, so she is an agent; the apple is the item that is eaten, so it is a patient. While most modern linguistic theories make reference to such relations in one form or another, the general term, as well as the terms for specific relations, varies: 'participant role', 'semantic role', and 'deep case' have also been employed with similar sense.

In linguistics, inversion is any of several grammatical constructions where two expressions switch their canonical order of appearance, that is, they invert. The most frequent type of inversion in English is subject–auxiliary inversion, in which an auxiliary verb changes places with its subject; it often occurs in questions, such as Are you coming?, where the subject you switches places with the auxiliary are. In many other languages, especially those with a freer word order than English, inversion can take place with a variety of verbs and with other syntactic categories as well.

In linguistics, locality refers to the proximity of elements in a linguistic structure. Constraints on locality limit the span over which rules can apply to a particular structure. Theories of transformational grammar use syntactic locality constraints to explain restrictions on argument selection, syntactic binding, and syntactic movement.

References

  1. "Adverbial Complements". Thefreedictionary.com. Retrieved 2016-11-23.