Topicalization is a mechanism of syntax that establishes an expression as the sentence or clause topic by having it appear at the front of the sentence or clause (as opposed to in a canonical position further to the right). What is fronted is a phrasal constituent, such as a noun phrase or prepositional phrase, rather than an individual word. Topicalization often results in a discontinuity and is thus one of a number of established discontinuity types, the other three being wh-fronting, scrambling, and extraposition. Topicalization is also used as a constituency test; an expression that can be topicalized is deemed a constituent. The topicalization of arguments in English is rare, whereas circumstantial adjuncts are often topicalized. Most languages allow topicalization, and in some languages, topicalization occurs much more frequently and/or in a much less marked manner than in English. Topicalization in English has also received attention in the pragmatics literature.
Typical cases of topicalization are illustrated with the following examples:
Assuming that the a-sentences represent canonical word order, the b-sentences contain instances of topicalization. The constituent in bold is fronted to establish it as topic. The first two examples, which use topicalized adjuncts, are typical, but the last two examples with topicalized object arguments are comparatively rare. The appearance of the demonstrative determiners that and those is important since without them, topicalization of an argument seems less acceptable: A pizza I won't eat.
Topicalization can occur across long distances:
Topicalization is similar to wh-movement insofar as the constituents that can be wh-fronted can also be topicalized:
Also, topicalization is similar to wh-fronting insofar as the islands and barriers to wh-fronting are also islands and barriers to topicalization:
Those examples illustrate the similar behavior of topicalization and wh-fronting. Further data, which will not be produced here, could show, however, that topicalization is unlike the other two major discontinuity types: scrambling and extraposition.
The theoretical analysis of topicalization can vary greatly depending in part on the theory of sentence structure that one adopts. If one assumes the layered structures associated with many phrase structure grammars, all instances of topicalization will involve a discontinuity. If, in contrast, less layered structures are assumed, as for example in dependency grammar, then many instances of topicalization do not involve a discontinuity, but rather just inversion. This point is illustrated here first using flatter structures that lack a finite VP constituent (which means the entire sentence has the status of a large VP). Both constituency- and dependency-based analyses are given. The example itself is a piece of Yoda wisdom (as he speaks to Anakin), and its word order is certainly of questionable acceptability. It is, however, perfectly understandable:
The upper two trees show the analysis using flat constituency-based structures that lack a finite VP constituent, and the lower two trees are dependency-based; dependency grammar inherently rejects the existence of finite VP constituents. The noteworthy aspect of these examples is that topicalization does not result in a discontinuity, since there are no crossing lines in the trees. What this means is that such cases can be analyzed purely in terms of inversion: the topicalized expression simply "inverts" to the other side of its head.
Instead of the flat trees just examined, most constituency grammars posit more layered structures that include a finite VP constituent. These more layered structures are likely to address topicalization in terms of movement or copying, as illustrated with the following two trees:
Tree a. shows the canonical word order again, and tree b. illustrates what is known as the movement or copying analysis. The topicalized expression is first generated in its canonical position but is then copied to the front of the sentence, the original then being deleted.
The movement analysis of discontinuities is one possible way to address those instances of topicalization that cannot be explained in terms of inversion. An alternative explanation is feature passing. One assumes that the topicalized expression is not moved or copied to the clause-initial position, but rather is base-generated there. Instead of movement, feature passing occurs: a link of a sort is established between the topicalized expression and its governor. The link is the path along which information about the topicalized expression is passed to the governor of that expression. A piece of Yoda wisdom is again used for illustration, the full sentence being Careful you must be when sensing the future, Anakin:
The nodes in red mark the path of feature passing. Features (=information) about the topicalized expression are passed rightward through (and down) the tree structure to the governor of that expression. This path is present in both analyses, i.e. in the constituency-based a-analysis on the left and in the dependency-based b-analysis on the right. Since topicalization can occur over long distances, feature passing must also occur over long distances. The final example shows a dependency-based analysis of a sentence where the feature passing path is quite long:
Information about the topicalized such nonsense is passed along the path marked in red down to the governor of the topicalized expression, spouting. The words corresponding to the nodes in red form a catena (Latin for 'chain', plural catenae). A theory of topicalization is then built up in part by examining the nature of these catenae for feature passing.
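The notion of a catena as a connected stretch of a dependency tree can be made concrete in code. The sketch below is a minimal illustration, not drawn from the analyses above: it represents a dependency tree as a parent map and checks whether a set of words is connected with respect to dominance, which is the defining property of a catena. The example sentence and its head assignments are hypothetical.

```python
def is_catena(words, head):
    """Return True if `words` forms a catena, i.e. is connected
    with respect to dominance in the dependency tree.
    `head` maps each word to its governor (the root maps to None).
    In a tree, a node set is connected exactly when just one of
    its members has a governor outside the set."""
    words = set(words)
    roots = [w for w in words if head[w] not in words]
    return len(roots) == 1

# Hypothetical dependency analysis of "That pizza I will not eat":
# every word depends on "eat" except "that", which depends on "pizza".
head = {"eat": None, "will": "eat", "I": "eat",
        "not": "eat", "pizza": "eat", "that": "pizza"}

print(is_catena({"pizza", "eat"}, head))  # True: pizza depends on eat
print(is_catena({"that", "eat"}, head))   # False: pizza intervenes
```

A feature-passing path, being a chain of governor-to-dependent links, always satisfies this connectedness condition.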
In linguistics, syntax is the set of rules, principles, and processes that govern the structure of sentences in a given language, usually including word order. The term syntax is also used to refer to the study of such principles and processes. The goal of many syntacticians is to discover the syntactic rules common to all languages.
In everyday speech, a phrase is any group of words, often carrying a special idiomatic meaning; in this sense it is synonymous with expression. In linguistic analysis, a phrase is a group of words that functions as a constituent in the syntax of a sentence, a single unit within a grammatical hierarchy. A phrase typically appears within a clause, but it is possible also for a phrase to be a clause or to contain a clause within it. There are also types of phrases like noun phrase and prepositional phrase.
Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.
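How phrase structure rules break a sentence into constituent parts can be sketched with a small recognizer. The following is a toy illustration under assumed rules: a hypothetical mini-grammar in Chomsky normal form, not one proposed in the text, run through the classic CYK chart-parsing algorithm.

```python
# Toy phrase structure rules in Chomsky normal form (hypothetical
# mini-grammar, chosen only for illustration):
binary = {("NP", "VP"): "S", ("V", "NP"): "VP"}
lexical = {"Mary": {"NP"}, "beer": {"NP"}, "drinks": {"V"}}

def categories(words):
    """CYK chart recognition: chart[i][j] holds every category
    that the rules derive for the span words[i:j]."""
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexical.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # split point between sub-spans
                for left in chart[i][k]:
                    for right in chart[k][j]:
                        if (left, right) in binary:
                            chart[i][j].add(binary[left, right])
    return chart[0][n]

print("S" in categories(["Mary", "drinks", "beer"]))  # True
```

Each entry the chart assigns to a span corresponds to a constituent of that category, so the completed chart is a record of the sentence's constituent structure under the given rules.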
A noun phrase, or nominal (phrase), is a phrase that has a noun as its head or performs the same grammatical function as a noun. Noun phrases are very common cross-linguistically, and they may be the most frequently occurring phrase type.
In language, a clause is a part of a sentence that constitutes or contains a predicate. A typical clause consists of a subject and a predicate, the latter typically a verb phrase: a verb together with any objects and other modifiers. However, the subject is sometimes left unexpressed; this is often the case in null-subject languages when the subject is retrievable from context, but it also occurs in other languages such as English.
A parse tree or parsing tree or derivation tree or concrete syntax tree is an ordered, rooted tree that represents the syntactic structure of a string according to some context-free grammar. The term parse tree itself is used primarily in computational linguistics; in theoretical syntax, the term syntax tree is more common.
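As a data structure, a parse tree can be represented directly in code. The sketch below uses nested (label, children) pairs for an assumed toy constituency analysis of "Mary drinks beer" and recovers the tree's yield, i.e. the original string.

```python
# A parse tree as a nested (label, children) structure; leaves
# are plain word strings (toy analysis of "Mary drinks beer"):
tree = ("S", [("NP", ["Mary"]),
              ("VP", [("V", ["drinks"]),
                      ("NP", ["beer"])])])

def leaves(node):
    """Return the tree's yield: its words, left to right."""
    _label, children = node
    out = []
    for child in children:
        out.extend([child] if isinstance(child, str) else leaves(child))
    return out

print(leaves(tree))  # ['Mary', 'drinks', 'beer']
```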
Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that while it can identify phrases it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech or Warlpiri.
In syntactic analysis, a constituent is a word or a group of words that function as a single unit within a hierarchical structure. The constituent structure of sentences is identified using tests for constituents. These tests apply to a portion of a sentence, and the results provide evidence about the constituent structure of the sentence. Many constituents are phrases. A phrase is a sequence of one or more words built around a head lexical item and working as a unit within a sentence. A word sequence is shown to be a phrase/constituent if it exhibits one or more of the behaviors discussed below. The analysis of constituent structure is associated mainly with phrase structure grammars, although dependency grammars also allow sentence structure to be broken down into constituent parts.
In linguistics, wh-movement is the formation of syntactic dependencies involving interrogative words. An example in English is the dependency formed between what and the object position of doing in "What are you doing?". Interrogative forms are known within English linguistics as wh-words such as what, when, where, who, and why, but also include interrogative words like how. This kind of dependency has been used as a diagnostic tool in syntactic studies as it is subject to a number of interacting grammatical constraints.
In traditional grammar and syntax, the predicate is the portion of a sentence which makes a claim about the subject. For instance, in "Mary drinks beer", the predicate would be the verb phrase "drinks beer". Some syntactic frameworks view the predicate as a constituent while others do not. In semantics, the term refers to a function which takes an argument and returns a truth value. Words or phrases which denote such functions are also sometimes referred to as "predicates" regardless of syntactic framework.
In linguistics, ellipsis or an elliptical construction is the omission from a clause of one or more words that are nevertheless understood in the context of the remaining elements. There are numerous distinct types of ellipsis acknowledged in theoretical syntax. This article provides an overview of them. Theoretical accounts of ellipsis can vary greatly depending in part upon whether a constituency-based or a dependency-based theory of syntactic structure is pursued.
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Tesnière (1959).
Antecedent-contained deletion (ACD), also called antecedent-contained ellipsis, is a phenomenon whereby an elided verb phrase appears to be contained within its own antecedent. For instance, in the sentence "I read every book that you did", the verb phrase in the main clause appears to license ellipsis inside the relative clause which modifies its object. ACD is a classic puzzle for theories of the syntax-semantics interface, since it threatens to introduce an infinite regress. It is commonly taken as motivation for syntactic transformations such as quantifier raising, though some approaches explain it using semantic composition rules or by adopting more flexible notions of what it means to be a syntactic unit.
Syntactic movement is the means by which some theories of syntax address discontinuities. Movement was first postulated by structuralist linguists who expressed it in terms of discontinuous constituents or displacement. Certain constituents appear to have been displaced from the position where they receive important features of interpretation. The concept of movement is controversial; it is associated with so-called transformational or derivational theories of syntax. Representational theories, in contrast, reject the notion of movement, often addressing discontinuities in terms of feature passing or persistent structural identities instead.
Scrambling is a syntactic phenomenon wherein sentences can be formulated using a variety of different word orders without any change in meaning. Scrambling often results in a discontinuity since the scrambled expression can end up at a distance from its head. Scrambling does not occur in English, but it is frequent in languages with freer word order, such as German, Russian, Persian and Turkic languages. The term was coined by Haj Ross in his 1967 dissertation and is widely used in present-day work, particularly within the generative tradition.
In linguistics, the catena is a unit of syntax and morphology, closely associated with dependency grammars. It is a more flexible and inclusive unit than the constituent and may therefore be better suited than the constituent to serve as the fundamental unit of syntactic and morphosyntactic analysis.
In syntax, shifting occurs when two or more constituents appearing on the same side of their common head exchange positions, in a sense, to obtain a non-canonical order. The most widely acknowledged type of shifting is heavy NP shift, but shifting involving a heavy NP is just one manifestation of the shifting mechanism. Shifting occurs in most if not all European languages, and it may in fact be possible in all natural languages. Shifting is not inversion, and inversion is not shifting, but the two mechanisms are similar insofar as they are both present in languages like English that have relatively strict word order. The theoretical analysis of shifting varies in part depending on the theory of sentence structure that one adopts. If one assumes relatively flat structures, shifting does not result in a discontinuity. Shifting is often motivated by the relative weight of the constituents involved. The weight of a constituent is determined by a number of factors: e.g., number of words, contrastive focus, and semantic content.
In linguistics, a discontinuity occurs when a given word or phrase is separated from another word or phrase that it modifies in such a manner that a direct connection cannot be established between the two without incurring crossing lines in the tree structure. The terminology that is employed to denote discontinuities varies depending on the theory of syntax at hand. The terms discontinuous constituent, displacement, long distance dependency, unbounded dependency, and projectivity violation are largely synonymous with the term discontinuity. There are various types of discontinuities, the most prominent and widely studied of these being topicalization, wh-fronting, scrambling, and extraposition.
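The "crossing lines" characterization of a discontinuity corresponds to non-projectivity in dependency parsing, and it can be checked mechanically. The sketch below is a minimal illustration with hypothetical head assignments: two arcs cross when exactly one endpoint of one arc lies strictly between the endpoints of the other, and the root is linked to a virtual node left of the sentence so that material fronted over the root is also detected.

```python
def is_projective(heads):
    """heads[d] is the index of word d's governor; the root's
    head is -1, a virtual node to the left of the sentence.
    A dependency tree is projective (discontinuity-free) when
    no two of its arcs cross."""
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads)]
    for i in range(len(arcs)):
        for j in range(i + 1, len(arcs)):
            (a, b), (c, d) = arcs[i], arcs[j]
            # Arcs cross when exactly one endpoint of one arc
            # falls strictly inside the span of the other.
            if a < c < b < d or c < a < d < b:
                return False
    return True

# Hypothetical analysis of "He stopped spouting such nonsense":
# he->stopped, stopped->root, spouting->stopped, such->nonsense,
# nonsense->spouting.
print(is_projective([1, -1, 1, 4, 2]))  # canonical order: True
# Topicalized "Such nonsense he stopped spouting": the arc from
# nonsense to spouting now crosses the root's arc to stopped.
print(is_projective([1, 4, 3, -1, 3]))  # False
```

The quadratic pairwise check suffices for illustration; practical parsers use equivalent linear-time projectivity tests.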
Extraposition is a mechanism of syntax that alters word order in such a manner that a relatively "heavy" constituent appears to the right of its canonical position. Extraposing a constituent results in a discontinuity and in this regard, it is unlike shifting, which does not generate a discontinuity. The extraposed constituent is separated from its governor by one or more words that dominate its governor. Two types of extraposition are acknowledged in theoretical syntax: standard cases where extraposition is optional and it-extraposition where extraposition is obligatory. Extraposition is motivated in part by a desire to reduce center embedding by increasing right-branching and thus easing processing, center-embedded structures being more difficult to process. Extraposition occurs frequently in English and related languages.
Subject–verb inversion in English is a type of inversion where the subject and verb switch their canonical order of appearance so that the subject follows the verb(s), e.g. A lamp stood beside the bed → Beside the bed stood a lamp. Subject–verb inversion is distinct from subject–auxiliary inversion because the verb involved is not an auxiliary verb.