Argument (linguistics)

In linguistics, an argument is an expression that helps complete the meaning of a predicate,[1] the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate.[2] Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Lucien Tesnière (1959).

The area of grammar that explores the nature of predicates, their arguments, and adjuncts is called valency theory. Predicates have a valence; they determine the number and type of arguments that can or must appear in their environment. The valence of predicates is also investigated in terms of subcategorization.

Arguments and adjuncts

The basic analysis of the syntax and semantics of clauses relies heavily on the distinction between arguments and adjuncts. The clause predicate, which is often a content verb, demands certain arguments. That is, the arguments are necessary in order to complete the meaning of the verb. The adjuncts that appear, in contrast, are not necessary in this sense. The subject phrase and object phrase are the two most frequently occurring arguments of verbal predicates.[3] For instance:

Jill likes Jack.
Sam fried the meat.
The old man helped the young man.

Each of these sentences contains two arguments (in bold), the first noun (phrase) being the subject argument, and the second the object argument. Jill, for example, is the subject argument of the predicate likes, and Jack is its object argument. Verbal predicates that demand just a subject argument (e.g. sleep, work, relax) are intransitive, verbal predicates that demand an object argument as well (e.g. like, fry, help) are transitive, and verbal predicates that demand two object arguments are ditransitive (e.g. give, lend).
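The intransitive/transitive/ditransitive classification above can be pictured as a lookup from a verb to the arguments it demands. The following is a minimal sketch of such a valence lexicon; the dictionary layout and the `transitivity` helper are illustrative conveniences, not a standard linguistic formalism, and the verbs listed are just those from the examples above.

```python
# A toy valence lexicon: each verb maps to the list of arguments it demands.
# The classification mirrors the examples in the text; the data structure
# itself is purely illustrative.
VALENCE = {
    "sleep": ["subject"],                       # intransitive
    "work":  ["subject"],
    "relax": ["subject"],
    "like":  ["subject", "object"],             # transitive
    "fry":   ["subject", "object"],
    "help":  ["subject", "object"],
    "give":  ["subject", "object", "object2"],  # ditransitive
    "lend":  ["subject", "object", "object2"],
}

def transitivity(verb: str) -> str:
    """Classify a verb by the number of arguments it demands."""
    n = len(VALENCE[verb])
    return {1: "intransitive", 2: "transitive", 3: "ditransitive"}[n]
```

On this sketch, `transitivity("like")` yields `"transitive"` because like demands both a subject and an object argument.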

When additional information is added to these three example sentences, the added material consists of adjuncts, e.g.

Jill really likes Jack.
Jill likes Jack most of the time.
Jill likes Jack when the sun shines.
Jill likes Jack because he's friendly.

The added phrases (in bold) are adjuncts; they provide additional information that is not necessary to complete the meaning of the predicate likes. One key difference between arguments and adjuncts is that the appearance of a given argument is often obligatory, whereas adjuncts appear optionally. While typical verb arguments are subject or object nouns or noun phrases as in the examples above, they can also be prepositional phrases (PPs) (or even other categories). The PPs in bold in the following sentences are arguments:

Sam put the pen on the chair.
Larry does not put up with that.
Bill is getting on my case.

We know that these PPs are (or contain) arguments because when we attempt to omit them, the result is unacceptable:

*Sam put the pen.
*Larry does not put up.
*Bill is getting.

Subject and object arguments are known as core arguments; core arguments can be suppressed, added, or exchanged in different ways, using voice operations like passivization, antipassivization, applicativization, incorporation, etc. Prepositional arguments, which are also called oblique arguments, however, do not tend to undergo the same processes.

Psycholinguistics (arguments vs. adjuncts)

Psycholinguistic theories must explain how syntactic representations are built incrementally during sentence comprehension. One view that has sprung from psycholinguistics is the argument structure hypothesis (ASH), which explains the distinct cognitive operations for argument and adjunct attachment: arguments are attached via the lexical mechanism, but adjuncts are attached using general (non-lexical) grammatical knowledge that is represented as phrase structure rules or the equivalent.

Argument status determines the cognitive mechanism by which a phrase is attached to the developing syntactic representation of a sentence. Psycholinguistic evidence supports a formal distinction between arguments and adjuncts, for any questions about the argument status of a phrase are, in effect, questions about learned mental representations of the lexical heads.[citation needed]

Syntactic vs. semantic arguments

An important distinction acknowledges both syntactic and semantic arguments. Content verbs determine the number and type of syntactic arguments that can or must appear in their environment; they impose specific syntactic functions (e.g. subject, object, oblique, specific preposition, possessor, etc.) onto their arguments. These syntactic functions will vary as the form of the predicate varies (e.g. active verb, passive participle, gerund, nominal, etc.). In languages that have morphological case, the arguments of a predicate must appear with the correct case markings (e.g. nominative, accusative, dative, genitive, etc.) imposed on them by their predicate. The semantic arguments of the predicate, in contrast, remain consistent, e.g.

Jack is liked by Jill.
Jill's liking Jack
Jack's being liked by Jill
the liking of Jack by Jill
Jill's like for Jack

The predicate 'like' appears in various forms in these examples, which means that the syntactic functions of the arguments associated with Jack and Jill vary. The object of the active sentence, for instance, becomes the subject of the passive sentence. Despite this variation in syntactic functions, the arguments remain semantically consistent. In each case, Jill is the experiencer (= the one doing the liking) and Jack is the one being experienced (= the one being liked). In other words, the syntactic arguments are subject to syntactic variation in terms of syntactic functions, whereas the thematic roles of the arguments of the given predicate remain consistent as the form of that predicate changes.

The syntactic arguments of a given verb can also vary across languages. For example, the verb put in English requires three syntactic arguments: subject, object, locative (e.g. He put the book into the box). These syntactic arguments correspond to the three semantic arguments agent, theme, and goal. The Japanese verb oku 'put', in contrast, has the same three semantic arguments, but Japanese does not require the locative to be expressed syntactically, so it is correct to say Kare ga hon o oita ("He put the book"). The equivalent sentence in English is ungrammatical without the required locative argument, as the examples involving put above demonstrate. For this reason, a slight paraphrase is required to render the nearest grammatical equivalent in English: He positioned the book or He deposited the book.
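The cross-linguistic point can be summarized as a pair of valence frames that share their semantic arguments but differ in their required syntactic arguments. The sketch below is hypothetical and illustrative only; the frame layout and the `shared_semantics` helper are not drawn from any standard formalism.

```python
# Hypothetical valence frames for English 'put' and Japanese 'oku'.
# Both share the semantic arguments agent, theme, goal, but English
# additionally requires a syntactic locative, as discussed in the text.
FRAMES = {
    ("English", "put"): {
        "semantic": ("agent", "theme", "goal"),
        "syntactic_required": ("subject", "object", "locative"),
    },
    ("Japanese", "oku"): {
        "semantic": ("agent", "theme", "goal"),
        "syntactic_required": ("subject", "object"),
    },
}

def shared_semantics(frame1: dict, frame2: dict) -> tuple:
    """Return the semantic arguments two frames have in common."""
    return tuple(a for a in frame1["semantic"] if a in frame2["semantic"])
```

Comparing the two frames shows identical semantic arguments alongside a mismatch in the syntactic arguments each language requires.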

Distinguishing between arguments and adjuncts

Arguments vs. adjuncts

A large body of literature has been devoted to distinguishing arguments from adjuncts.[4] Numerous syntactic tests have been devised for this purpose. One such test is the relative clause diagnostic. If the test constituent can appear after the combination which occurred/happened in a relative clause, it is an adjunct, not an argument, e.g.

Bill left on Tuesday. → Bill left, which happened on Tuesday. – on Tuesday is an adjunct.
Susan stopped due to the weather. → Susan stopped, which occurred due to the weather. – due to the weather is an adjunct.
Fred tried to say something twice. → Fred tried to say something, which occurred twice. – twice is an adjunct.

The same diagnostic results in unacceptable relative clauses (and sentences) when the test constituent is an argument, e.g.

Bill left home. → *Bill left, which happened home. – home is an argument.
Susan stopped her objections. → *Susan stopped, which occurred her objections. – her objections is an argument.
Fred tried to say something. → *Fred tried to say, which happened something. – something is an argument.

This test succeeds in identifying prepositional arguments as well:

We are waiting for Susan. → *We are waiting, which is happening for Susan. – for Susan is an argument.
Tom put the knife in the drawer. → *Tom put the knife, which occurred in the drawer. – in the drawer is an argument.
We laughed at you. → *We laughed, which occurred at you. – at you is an argument.

The utility of the relative clause test is, however, limited. It incorrectly suggests, for instance, that modal adverbs (e.g. probably, certainly, maybe) and manner expressions (e.g. quickly, carefully, totally) are arguments. If a constituent passes the relative clause test, however, one can be sure that it is not an argument.

Obligatory vs. optional arguments

A further division blurs the line between arguments and adjuncts. Many arguments behave like adjuncts with respect to another diagnostic, the omission diagnostic. Adjuncts can always be omitted from the phrase, clause, or sentence in which they appear without rendering the resulting expression unacceptable. Some arguments (obligatory ones), in contrast, cannot be omitted. There are many other arguments, however, that are identified as arguments by the relative clause diagnostic but that can nevertheless be omitted, e.g.

a. She cleaned the kitchen.
b. She cleaned. – the kitchen is an optional argument.
a. We are waiting for Larry.
b. We are waiting. – for Larry is an optional argument.
a. Susan was working on the model.
b. Susan was working. – on the model is an optional argument.

The relative clause diagnostic would identify the constituents in bold as arguments. The omission diagnostic here, however, demonstrates that they are not obligatory arguments. They are, rather, optional. The insight, then, is that a three-way division is needed. On the one hand, one distinguishes between arguments and adjuncts, and on the other hand, one allows for a further division between obligatory and optional arguments.
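The three-way division above combines the outcomes of the two diagnostics. As a minimal sketch (the category names and the `classify` helper are illustrative, not standard terminology), each dependent of a predicate can be sorted by whether it passes the relative clause test and whether it can be omitted:

```python
# A sketch of the three-way division: every dependent of a predicate is
# an adjunct, an obligatory argument, or an optional argument.
from enum import Enum

class Status(Enum):
    ADJUNCT = "adjunct"
    OBLIGATORY_ARGUMENT = "obligatory argument"
    OPTIONAL_ARGUMENT = "optional argument"

def classify(is_argument: bool, omissible: bool) -> Status:
    """Combine the relative clause diagnostic (argument status) with
    the omission diagnostic (omissibility)."""
    if not is_argument:
        return Status.ADJUNCT
    return Status.OPTIONAL_ARGUMENT if omissible else Status.OBLIGATORY_ARGUMENT

# 'the kitchen' in "She cleaned the kitchen" fails the relative clause
# test (so it is an argument) but can be omitted:
print(classify(is_argument=True, omissible=True).value)   # optional argument
```

Under this sketch, on Tuesday in "Bill left on Tuesday" would come out as an adjunct, while home in "Bill left home" would be an optional or obligatory argument depending on the omission diagnostic.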

Arguments and adjuncts in noun phrases

Most work on the distinction between arguments and adjuncts has been conducted at the clause level and has focused on arguments and adjuncts to verbal predicates. The distinction is crucial for the analysis of noun phrases as well, however. If it is altered somewhat, the relative clause diagnostic can also be used to distinguish arguments from adjuncts in noun phrases, e.g.

Bill's bold reading of the poem after lunch
*bold reading of the poem after lunch that was Bill's – Bill's is an argument.
Bill's reading of the poem after lunch that was bold – bold is an adjunct.
*Bill's bold reading after lunch that was of the poem – of the poem is an argument.
Bill's bold reading of the poem that was after lunch – after lunch is an adjunct.

The diagnostic identifies Bill's and of the poem as arguments, and bold and after lunch as adjuncts.

Representing arguments and adjuncts

The distinction between arguments and adjuncts is often indicated in the tree structures used to represent syntactic structure. In phrase structure grammars, an adjunct is "adjoined" to a projection of its head predicate in a manner that distinguishes it from the arguments of that predicate. The distinction is quite visible in theories that employ the X-bar schema, e.g.

[Image: Argument1.jpg – X-bar schema tree showing complement, specifier, and adjunct positions]

The complement argument appears as a sister of the head X, and the specifier argument appears as a daughter of XP. The optional adjuncts appear in one of a number of positions adjoined to a bar-projection of X or to XP.

Theories of syntax that acknowledge n-ary branching structures and hence construe syntactic structure as being flatter than the layered structures associated with the X-bar schema must employ some other means to distinguish between arguments and adjuncts. In this regard, some dependency grammars employ an arrow convention. Arguments receive a "normal" dependency edge, whereas adjuncts receive an arrow edge.[5] In the following tree, an arrow points away from an adjunct toward the governor of that adjunct:

[Image: Argument2.jpg – dependency tree in which arrow edges mark adjuncts]

The arrow edges in the tree identify four constituents (= complete subtrees) as adjuncts: At one time, actually, in congress, and for fun. The normal dependency edges (= non-arrows) identify the other constituents as arguments of their heads. Thus Sam, a duck, and to his representative in congress are identified as arguments of the verbal predicate wanted to send.

Relevant theories

Argumentation theory focuses on how logical reasoning leads to conclusions through an internal structure built of premises, a method of reasoning, and a conclusion. Many varieties of argumentation relate to this theory, including conversational, mathematical, scientific, interpretive, legal, and political argumentation.

Grammar theory, specifically the family of functional theories of grammar, approaches linguistics by relating grammatical elements to their functions and purposes in language.

A variety of theories exist regarding the structure of syntax, including generative grammar, categorial grammar, and dependency grammar.

Modern theories of semantics include formal semantics, lexical semantics, and computational semantics. Formal semantics focuses on truth conditions; lexical semantics delves into word meanings in relation to their context; and computational semantics uses algorithms and architectures to investigate linguistic meanings.

Valence is the number and type of arguments linked to a predicate, in particular to a verb. Unlike some traditional accounts, valency theory counts the argument expressed by the subject among a verb's arguments.

History of argument linguistics

The notion of argument structure was first conceived in the 1980s by researchers working in the government–binding framework to help address controversies about arguments.[6]

Importance

The distinction between arguments and adjuncts is crucial to most theories of syntax and grammar. Arguments behave differently from adjuncts in numerous ways. Theories of binding, coordination, discontinuities, ellipsis, etc. must acknowledge and build on the distinction. When one examines these areas of syntax, one finds that arguments consistently behave differently from adjuncts and that without the distinction, our ability to investigate and understand these phenomena would be seriously hindered. In everyday language the distinction often goes unnoticed; it is the difference between phrases a sentence requires and phrases that merely embellish it. For instance, in "Tim punched the stuffed animal", the phrase the stuffed animal is an argument because the verb requires it. In "Tim punched the stuffed animal with glee", the phrase with glee is an adjunct because it merely enhances the sentence, which can stand alone without it.[7]

Notes

  1. Most grammars define the argument in this manner, i.e. it is an expression that helps complete the meaning of a predicate (a verb). See for instance Tesnière (1969: 128).
  2. Concerning the completion of a predicate's meaning via its arguments, see for instance Kroeger (2004:9ff.).
  3. Geeraerts, Dirk; Cuyckens, Hubert (2007). The Oxford Handbook of Cognitive Linguistics. Oxford University Press US. ISBN 978-0-19-514378-2.
  4. For instance, see the essays on valency theory in Ágel et al. (2003/6).
  5. See Eroms (2000) and Osborne and Groß (2012) in this regard.
  6. Levin, Beth (2013-05-28). "Argument Structure". Linguistics. doi:10.1093/obo/9780199772810-0099. ISBN 978-0-19-977281-0. Retrieved 2019-03-05.
  7. Damon Tutunjian; Julie E. Boland. "Do we need a distinction between arguments and adjuncts? Evidence from psycholinguistic studies of comprehension" (PDF). University of Michigan.
