Dynamic antisymmetry

Dynamic antisymmetry is a theory of syntactic movement presented in Andrea Moro's 2000 monograph Dynamic Antisymmetry, building on the work presented in Richard S. Kayne's 1994 monograph The Antisymmetry of Syntax.

A premise: the antisymmetry of syntax

The crux of Kayne's theory is that hierarchical structure in natural language maps universally onto a particular surface linearization, namely specifier-head-complement branching order. To understand what is meant by hierarchical structure, consider the sentence The King of England likes apples. We can replace it with He likes apples. Since the phrase the King of England can be replaced by a pronoun, we say that it constitutes a hierarchical unit, called a constituent. Further constituency tests reveal the phrase likes apples to be a constituent as well. Hierarchical units are built up according to the principles of phrase structure into a branching tree formation rather than into a linear order. Older theories of linearization posited various algorithms for translating the hierarchical structure into a linear order; Antisymmetry, by contrast, holds that linear order falls out from the hierarchical relationships among the constituents. In this particular case, there is a relation of asymmetric c-command between the constituent the King of England and likes apples, so the first constituent is ordered linearly before the second. Further tests ultimately give rise to a linear order for the internal parts of these constituents.

The theory of the antisymmetry of syntax has two aims. On the one hand, it derives a version of X-bar theory, a formal theory of phrase structure in transformational generative grammar, from a single principle: the Linear Correspondence Axiom (LCA). According to this principle (simplifying somewhat), a word W precedes a word W' if and only if W is contained in a node Q that asymmetrically c-commands a node R containing W'. It follows that no two nodes can mutually c-command each other unless one of them contains another node; otherwise, the words contained in the two nodes could not be linearized. On the other hand, the same principle captures the fact that many structures and derivations found in certain languages have no mirror-image counterparts in other languages. Kayne hypothesized that all phrases whose surface order is not specifier-head-complement have undergone movements that disrupt this underlying order. Subsequently, there have also been attempts at deriving specifier-complement-head as the basic word order.
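
For concreteness, the linearization procedure can be rendered as a small program. The following Python sketch is purely illustrative and is not drawn from Kayne (1994): it assumes a toy binary-branching tree, defines c-command in the simplified sister-based form used above, and orders one word before another whenever some node containing the first asymmetrically c-commands some node containing the second.

```python
# A minimal sketch of LCA-style linearization, assuming a toy binary tree.
# All names (Node, linearize, ...) are illustrative, not from Kayne (1994).
import functools

class Node:
    def __init__(self, label, children=(), word=None):
        self.label, self.children, self.word = label, list(children), word

def nodes(t):
    yield t
    for c in t.children:
        yield from nodes(c)

def dominates(a, b):
    return any(c is b or dominates(c, b) for c in a.children)

def c_commands(a, b, root):
    # a c-commands b iff a's sister is b or dominates b
    parent = next((n for n in nodes(root) if a in n.children), None)
    return parent is not None and any(
        s is b or dominates(s, b) for s in parent.children if s is not a)

def asymmetrically_c_commands(a, b, root):
    return c_commands(a, b, root) and not c_commands(b, a, root)

def precedes(w1, w2, root):
    # simplified LCA: w1 precedes w2 iff some node containing w1
    # asymmetrically c-commands some node containing w2
    covers = lambda n, w: n is w or dominates(n, w)
    return any(asymmetrically_c_commands(a, b, root)
               for a in nodes(root) if covers(a, w1)
               for b in nodes(root) if covers(b, w2))

def linearize(root):
    terms = [n for n in nodes(root) if n.word is not None]
    order = lambda x, y: -1 if precedes(x, y, root) else (
        1 if precedes(y, x, root) else 0)
    return [t.word for t in sorted(terms, key=functools.cmp_to_key(order))]

# "He likes apples" as [TP [DP he] [T' [T likes] [NP [N apples]]]]
tree = Node("TP", [
    Node("DP", word="he"),
    Node("T'", [Node("T", word="likes"),
                Node("NP", [Node("N", word="apples")])]),
])
print(linearize(tree))  # ['he', 'likes', 'apples']
```

Run on the tree for He likes apples, the pairwise checks reproduce the surface order with no separate linearization rule, which is the point of the LCA.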

Dynamic antisymmetry and linearization

Dynamic Antisymmetry is a weak version of the theory of antisymmetry, developed by Andrea Moro, that allows the generation of structures which are not LCA-compatible (points of symmetry) before the hierarchical structure is linearized at Phonetic Form. The LCA is active only when required; in other words, universal grammar is more parsimonious than in the stronger model, in that it does not impose restrictions where they cannot be detected, i.e., before linearization at the articulatory-perceptual interface. In fact, Dynamic Antisymmetry considers movement a way to rescue structures from a crash at the articulatory-perceptual interface: the unwanted structures are rescued by movement, since deleting the phonetic content of the moved element in its base position neutralizes the linearization problem. From this perspective, Dynamic Antisymmetry aims at unifying movement and phrase structure, which would otherwise be two independent properties characterizing all human language grammars.
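
Under the same toy assumptions, the sketch above can be extended to detect points of symmetry and to show how movement rescues them. The small-clause structure below is illustrative only, not an example taken from Moro (2000).

```python
# Continuing the sketch above: a "point of symmetry" is a pair of words
# that the simplified LCA cannot order in either direction.
def points_of_symmetry(root):
    terms = [n for n in nodes(root) if n.word is not None]
    return [(x.word, y.word) for i, x in enumerate(terms) for y in terms[i + 1:]
            if not precedes(x, y, root) and not precedes(y, x, root)]

# Two phrases merged symmetrically, as in a bare small clause: the sisters
# mutually c-command each other, so neither word can be ordered first.
sc = Node("SC", [Node("DP1", word="this"), Node("DP2", word="it")])
print(points_of_symmetry(sc))       # [('this', 'it')]

# Movement rescues the structure: one DP raises, and the phonetic content
# of its lower copy is deleted (modeled as a wordless node), so the copy
# no longer needs to be linearized.
rescued = Node("XP", [
    Node("DP1", word="this"),
    Node("X'", [Node("X", word="is"),
                Node("SC", [Node("DP1", word=None),   # silent lower copy
                            Node("DP2", word="it")])]),
])
print(points_of_symmetry(rescued))  # []
print(linearize(rescued))           # ['this', 'is', 'it']
```

In the first structure the two DPs mutually c-command each other, so no asymmetric pair exists and linearization fails; once one DP moves and its lower copy is left unpronounced, every pair of pronounced words can be ordered.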

Dynamic antisymmetry and labelling

The principle of Dynamic Antisymmetry has also been interpreted in computational terms. More specifically, when two XPs are merged and neither one projects, the structure cannot be computed unless one of them moves, thereby forcing the other to project; the lower copy left by movement is invisible to labelling, because a single copy is only one link of a bigger chain. This proposal was formulated in a paper now collected in Moro 2013; see Chomsky 2013 for the proposal to generalise this principle and include it in the standard theory.
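
A toy rendering of this idea, which is illustrative and not Moro's or Chomsky's formal algorithm, treats labelling as a partial function that fails on {XP, YP} unless one member is a head or is the unpronounced lower copy of movement:

```python
# A toy labelling function (illustrative, not Moro's or Chomsky's formal
# algorithm): a head projects, and the lower copy of a moved phrase is
# invisible to labelling, so its sister projects instead.
def label(a, b):
    visible = [x for x in (a, b) if not x.get("is_lower_copy", False)]
    if len(visible) == 1:                 # one member has moved away
        return visible[0]["label"]
    heads = [x for x in visible if x["type"] == "head"]
    if len(heads) == 1:                   # head-phrase merge
        return heads[0]["label"]
    raise ValueError("{XP, YP}: unlabelable point of symmetry; one XP must move")

v = {"label": "V", "type": "head"}
dp = {"label": "DP", "type": "phrase"}
pp = {"label": "PP", "type": "phrase"}

print(label(v, dp))                              # 'V': the head projects
print(label(dp, {**pp, "is_lower_copy": True}))  # 'DP': the copy is invisible
# label(dp, pp) would raise: two phrases, and neither can project
```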

Related Research Articles

In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning. There are numerous approaches to syntax which differ in their central assumptions and goals.

Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.

In linguistics, X-bar theory is a theory of syntactic category formation that was first proposed by Chomsky (1970) and further developed by Jackendoff (1977), along the lines of the theory of generative grammar put forth in the 1950s by Noam Chomsky. It attempts to capture the structure of phrasal categories with a single uniform structure, the X-bar schema, based on the assumption that any phrase in natural language is an XP headed by a given syntactic category X. It played a significant role in resolving problems with phrase structure rules, notably the proliferation of grammatical rules, which runs against the thesis of generative grammar.

In linguistics, the minimalist program (MP) is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.

In generative linguistics, Distributed Morphology is a theoretical framework introduced in 1993 by Morris Halle and Alec Marantz. The central claim of Distributed Morphology is that there is no divide between the construction of words and sentences. The syntax is the single generative engine that forms sound-meaning correspondences, both complex phrases and complex words. This approach challenges the traditional notion of the Lexicon as the unit where derived words are formed and idiosyncratic word-meaning correspondences are stored. In Distributed Morphology there is no unified Lexicon as in earlier generative treatments of word-formation. Rather, the functions that other theories ascribe to the Lexicon are distributed among other components of the grammar.

In linguistics, branching refers to the shape of the parse trees that represent the structure of sentences. Assuming that the language is being written or transcribed from left to right, parse trees that grow down and to the right are right-branching, and parse trees that grow down and to the left are left-branching. The direction of branching reflects the position of heads in phrases, and in this regard, right-branching structures are head-initial, whereas left-branching structures are head-final. English has both right-branching (head-initial) and left-branching (head-final) structures, although it is more right-branching than left-branching. Some languages such as Japanese and Turkish are almost fully left-branching (head-final). Some languages are mostly right-branching (head-initial).

Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that while it can identify phrases it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech or Warlpiri.

In generative grammar and related frameworks, a node in a parse tree c-commands its sister node and all of its sister's descendants. In these frameworks, c-command plays a central role in defining and constraining operations such as syntactic movement, binding, and scope. Tanya Reinhart introduced c-command in 1976 as a key component of her theory of anaphora. The term is short for "constituent command".
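
The definition can be stated directly as a function over trees. In the sketch below, trees are nested tuples of the form (label, child, child); the representation and names are illustrative only.

```python
# The definition rendered directly: a node c-commands its sister and all
# of the sister's descendants. Trees are nested tuples; labels assumed unique.
def subtrees(t):
    yield t
    if isinstance(t, tuple):
        for child in t[1:]:
            yield from subtrees(child)

def c_commanded_by(x, root):
    """Everything x c-commands: its sister(s) plus all their descendants."""
    for t in subtrees(root):
        if isinstance(t, tuple) and any(c == x for c in t[1:]):  # t is x's parent
            return [s for sib in t[1:] if sib != x for s in subtrees(sib)]
    return []

tree = ("TP", "he", ("T'", "likes", ("NP", "apples")))
print(c_commanded_by("he", tree))     # the sister T' and everything under it
print(c_commanded_by("likes", tree))  # [('NP', 'apples'), 'apples']
```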

In linguistics, antisymmetry is a theory of syntactic linearization presented in Richard S. Kayne's 1994 monograph The Antisymmetry of Syntax. It asserts that hierarchical structure in natural language maps universally onto a particular surface linearization, namely specifier-head-complement branching order. The theory derives a version of X-bar theory. Kayne hypothesizes that all phrases whose surface order is not specifier-head-complement have undergone syntactic movements that disrupt this underlying order. Subsequently, others have attempted to derive specifier-complement-head as the basic word order.

In linguistics, head directionality is a proposed parameter that classifies languages according to whether they are head-initial or head-final. The head is the element that determines the category of a phrase: for example, in a verb phrase, the head is a verb. Head-initial languages are therefore "VO" languages, while head-final languages are "OV" languages.

The projection principle is a stipulation proposed by Noam Chomsky as part of the phrase structure component of generative-transformational grammar. The projection principle is used in the derivation of phrases under the auspices of the principles and parameters theory.

In the field of linguistics, specifically in syntax, phonetic form (PF), also known as phonological form or the articulatory-perceptual (A-P) system, is a certain level of mental representation of a linguistic expression, derived from surface structure, and related to Logical Form. Phonetic form is the level of representation wherein expressions, or sentences, are assigned a phonetic representation, which is then pronounced by the speaker. Phonetic form takes surface structure as its input, and outputs an audible, pronounced sentence.

Heavy NP shift is an operation that involves re-ordering (shifting) a "heavy" noun phrase (NP) to a position to the right of its canonical position under certain circumstances. The heaviness of the NP is determined by its grammatical complexity; whether or not shifting occurs can impact the grammaticality of the sentence.

Andrea Carlo Moro is an Italian linguist, neuroscientist and novelist.

Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, when two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may apply to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
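
As a sketch, Merge can be modeled as recursive set formation over lexical items (here plain strings), directly mirroring Chomsky's G = {A, B}:

```python
# Merge as recursive set formation, following Chomsky's G = {A, B}
# (a sketch; lexical items are modeled as plain strings).
def merge(a, b):
    return frozenset({a, b})

vp = merge("likes", "apples")  # {'likes', 'apples'}
tp = merge("he", vp)           # {'he', {'likes', 'apples'}}: Merge applies to its own output
print(tp)
```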

In linguistics, locality refers to the proximity of elements in a linguistic structure. Constraints on locality limit the span over which rules can apply to a particular structure. Theories of transformational grammar use syntactic locality constraints to explain restrictions on argument selection, syntactic binding, and syntactic movement.

In linguistics, immediate constituent analysis or IC analysis is a method of sentence analysis that was first mentioned by Leonard Bloomfield and developed further by Rulon Wells. The process reached a full-blown strategy for analyzing sentence structure in the early works of Noam Chomsky. The practice is now widespread. Most tree structures employed to represent the syntactic structure of sentences are products of some form of IC-analysis. The process and result of IC-analysis can, however, vary greatly based upon whether one chooses the constituency relation of phrase structure grammars or the dependency relation of dependency grammars as the underlying principle that organizes constituents into hierarchical structures.

In syntax, shifting occurs when two or more constituents appearing on the same side of their common head exchange positions in a sense to obtain non-canonical order. The most widely acknowledged type of shifting is heavy NP shift, but shifting involving a heavy NP is just one manifestation of the shifting mechanism. Shifting occurs in most if not all European languages, and it may in fact be possible in all natural languages including sign languages. Shifting is not inversion, and inversion is not shifting, but the two mechanisms are similar insofar as they are both present in languages like English that have relatively strict word order. The theoretical analysis of shifting varies in part depending on the theory of sentence structure that one adopts. If one assumes relatively flat structures, shifting does not result in a discontinuity. Shifting is often motivated by the relative weight of the constituents involved. The weight of a constituent is determined by a number of factors: e.g., number of words, contrastive focus, and semantic content.

In linguistics, a discontinuity occurs when a given word or phrase is separated from another word or phrase that it modifies in such a manner that a direct connection cannot be established between the two without incurring crossing lines in the tree structure. The terminology that is employed to denote discontinuities varies depending on the theory of syntax at hand. The terms discontinuous constituent, displacement, long distance dependency, unbounded dependency, and projectivity violation are largely synonymous with the term discontinuity. There are various types of discontinuities, the most prominent and widely studied of these being topicalization, wh-fronting, scrambling, and extraposition.

In linguistics, a node is a point in a tree diagram or syntactic tree that can be assigned a syntactic category label.
