Syntax

In linguistics, syntax (/ˈsɪntæks/) [1] [2] is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, agreement, the nature of crosslinguistic variation, and the relationship between form and meaning. There are numerous approaches to syntax which differ in their central assumptions and goals.

Etymology

The word syntax comes from Ancient Greek σύνταξις "coordination", which consists of σύν (syn, "together") and τάξις (táxis, "an ordering").

Sequencing of subject, verb, and object

One basic description of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, these surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations.
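Since this classification is purely combinatorial, the six possible orders are just the 3! permutations of the three elements. The following minimal Python sketch enumerates them for an illustrative sentence; the example words are hypothetical, and the labels merely restate the frequency figures given above:

    from itertools import permutations

    # The six logically possible basic word orders are the 3! = 6
    # permutations of subject (S), verb (V), and object (O).
    words = {"S": "Sam", "V": "ate", "O": "oranges"}

    # Labels summarize the text above: SVO and SOV together account for
    # over 85% of languages; VOS, OVS, and OSV are rare.
    note = {"SOV": "common", "SVO": "common", "VSO": "less common",
            "VOS": "rare", "OVS": "rare", "OSV": "rare"}

    for order in permutations("SVO"):
        label = "".join(order)
        sentence = " ".join(words[slot] for slot in order)
        print(f"{label}: {sentence}  ({note[label]})")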

Early history

The Aṣṭādhyāyī of Pāṇini (c. 4th century BC, in Ancient India) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory, since works on grammar were written long before modern syntax came about. [3] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.

For centuries, a framework known as grammaire générale (first expounded in 1660 by Antoine Arnauld in a book of the same title) dominated work in syntax. Its basic premise was the assumption that language is a direct reflection of thought processes, so that there is a single, most natural way to express a thought.[citation needed]

However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.[citation needed]

The Port-Royal grammar had modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale. [4]) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.

The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001). [5] )

Theories of syntax

There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton, [6] sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, regarding syntax as the study of an abstract formal system. [7] Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device for reaching broad generalizations across languages.

Syntacticians have attempted to explain the causes of word-order variation both within individual languages and cross-linguistically. Much of this work has been done within frameworks of generative grammar, which assume that the core of syntax depends on a genetic structure common to all humans. However, typological research on the languages of the world has found few absolute universals, leading some to conclude that no aspect of syntax needs to be directly genetic.

Alternative explanations have been sought in language processing. It has been suggested that the brain finds it easier to parse syntactic patterns that are consistently right- or left-branching rather than mixed. The most widely held such approach is the performance–grammar correspondence hypothesis of John A. Hawkins, who suggests that language is a non-innate adaptation to innate cognitive mechanisms. On this view, cross-linguistic tendencies are based on language users' preference for grammars that are organized efficiently and on their avoidance of word orderings that cause processing difficulty. Some languages, however, exhibit regular inefficient patterning. These include the VO languages Chinese, which places the adpositional phrase before the verb, and Finnish, which has postpositions; but there are few other profoundly exceptional languages. [8]

Syntactic models

Dependency grammar

Dependency grammar is an approach to sentence structure in which syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure, and all the other words in the clause are either directly or indirectly dependent on this root. Prominent dependency-based theories of syntax include meaning–text theory, functional generative description, and word grammar.

Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued vehemently against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and that remains at the core of most phrase structure grammars. In place of this division, he positioned the verb as the root of all clause structure. [9]
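A dependency analysis can be made concrete as a directed graph in which every word except the root verb points to its head. The following minimal Python sketch illustrates this; the sentence and relation labels are illustrative, not drawn from any particular treebank:

    sentence = ["Sam", "ate", "the", "oranges"]

    # (dependent index, head index, relation): directed links between
    # words; the finite verb "ate" is the root, marked with head index -1.
    dependencies = [
        (0, 1, "subject"),     # Sam     -> ate
        (1, -1, "root"),       # ate     (root of the clause)
        (2, 3, "determiner"),  # the     -> oranges
        (3, 1, "object"),      # oranges -> ate
    ]

    for dep, head, rel in dependencies:
        head_word = "ROOT" if head == -1 else sentence[head]
        print(f"{sentence[dep]:8} --{rel}--> {head_word}")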

Categorial grammar

Categorial grammar is an approach in which constituents combine as function and argument, according to combinatory possibilities specified in their syntactic categories. For example, where other approaches might posit a rule that combines a noun phrase (NP) and a verb phrase (VP), categorial grammar posits a syntactic category NP and another, NP\S, read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)". The syntactic category for an intransitive verb is thus a complex formula representing the fact that the verb acts as a function requiring an NP as an input and producing a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as ((NP\S)/NP), meaning "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) of category (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence".
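The two combination rules just described (backward application for \ and forward application for /) can be sketched directly. The following minimal Python sketch assumes a tiny illustrative lexicon; category tuples mirror the slash notation, so NP\S is ("NP", "\\", "S"):

    def backward(left, right):
        # X  X\Y  =>  Y : the right-hand category consumes an X on its left.
        if isinstance(right, tuple) and right[1] == "\\" and right[0] == left:
            return right[2]
        return None

    def forward(left, right):
        # Y/X  X  =>  Y : the left-hand category consumes an X on its right.
        if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
            return left[0]
        return None

    lexicon = {
        "Sam": "NP",
        "oranges": "NP",
        "slept": ("NP", "\\", "S"),              # intransitive verb: NP\S
        "ate": (("NP", "\\", "S"), "/", "NP"),   # transitive verb: (NP\S)/NP
    }

    # "Sam ate oranges": the verb first combines with its object,
    # yielding NP\S, which then combines with the subject to yield S.
    vp = forward(lexicon["ate"], lexicon["oranges"])  # ("NP", "\\", "S")
    s = backward(lexicon["Sam"], vp)                  # "S"
    print(vp, s)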

Tree-adjoining grammar is a categorial grammar that adds partial tree structures to the categories.

Stochastic/probabilistic grammars/network theories

Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
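As a minimal illustration of a grammar based on probability theory (here a probabilistic context-free grammar rather than a neural network), the following Python sketch samples sentences from toy rules; the rules and probabilities are illustrative, not estimated from any corpus:

    import random

    # Each nonterminal maps to a list of (expansion, probability) pairs.
    PCFG = {
        "S":  [(("NP", "VP"), 1.0)],
        "NP": [(("Sam",), 0.5), (("the", "oranges"), 0.5)],
        "VP": [(("slept",), 0.4), (("ate", "NP"), 0.6)],
    }

    def sample(symbol):
        # Symbols with no rules are terminal words.
        if symbol not in PCFG:
            return [symbol]
        expansions = [rhs for rhs, _ in PCFG[symbol]]
        weights = [p for _, p in PCFG[symbol]]
        rhs = random.choices(expansions, weights=weights)[0]
        return [word for part in rhs for word in sample(part)]

    print(" ".join(sample("S")))  # e.g. "Sam ate the oranges"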

Functional grammars

Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis.

Generative syntax

Generative syntax is the study of syntax within the overarching framework of generative grammar. Generative theories of syntax typically propose analyses of grammatical patterns using formal tools such as phrase structure grammars augmented with additional operations such as syntactic movement. Their goal in analyzing a particular language is to specify rules which generate all and only the expressions which are well-formed in that language. In doing so, they seek to identify innate domain-specific principles of linguistic cognition, in line with the wider goals of the generative enterprise. Generative syntax is among the approaches that adopt the principle of the autonomy of syntax, assuming that meaning and communicative intent are determined by the syntax rather than the other way around.
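The "all and only" criterion can be made concrete with a toy grammar whose rules generate a finite set of strings; a string is well-formed just in case the grammar generates it. The following minimal Python sketch uses illustrative rules and, unlike real generative analyses, involves no movement operations:

    # Nonterminals map to lists of alternative expansions.
    GRAMMAR = {
        "S":  [["NP", "VP"]],
        "NP": [["Sam"], ["Lee"]],
        "VP": [["slept"], ["saw", "NP"]],
    }

    def expand(symbols):
        # Rewrite the leftmost nonterminal until only words remain,
        # collecting every possible result.
        if not symbols:
            return [[]]
        first, rest = symbols[0], symbols[1:]
        if first not in GRAMMAR:  # terminal word
            return [[first] + tail for tail in expand(rest)]
        results = []
        for expansion in GRAMMAR[first]:
            results.extend(expand(expansion + rest))
        return results

    # The grammar generates all and only these six sentences.
    print(sorted(" ".join(s) for s in expand(["S"])))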

Generative syntax was proposed in the late 1950s by Noam Chomsky, building on earlier work by Zellig Harris and Louis Hjelmslev, among others. Since then, numerous theories have been proposed under its umbrella, including transformational grammar, [10] government and binding theory, [11] and the minimalist program. [12]

Other theories that find their origin in the generative paradigm include head-driven phrase structure grammar, lexical functional grammar, and relational grammar.

Cognitive and usage-based grammars

The cognitive linguistics framework stems from generative grammar but adheres to evolutionary rather than Chomskyan linguistics. Cognitive models often recognise the generative assumption that the object belongs to the verb phrase. Cognitive frameworks include cognitive grammar and construction grammar.

See also

  • Syntactic terms

Related Research Articles

A syntactic category is a syntactic unit that theories of syntax assume. Word classes, largely corresponding to traditional parts of speech, are syntactic categories. In phrase structure grammars, the phrasal categories are also syntactic categories. Dependency grammars, however, do not acknowledge phrasal categories.

In syntax and grammar, a phrase is a group of words which act together as a grammatical unit. For instance, the English expression "the very happy squirrel" is a noun phrase which contains the adjective phrase "very happy". Phrases can consist of a single word or a complete sentence. In theoretical linguistics, phrases are often analyzed as units of syntactic structure such as a constituent.

Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.
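A leftmost derivation makes the rewriting process concrete: starting from S, one nonterminal is expanded per step until only words remain. The following minimal Python sketch uses illustrative deterministic rules:

    # Each symbol rewrites to exactly one sequence (deterministic rules,
    # chosen so the derivation trace is easy to read).
    RULES = {
        "S":   ["NP", "VP"],
        "NP":  ["Det", "N"],
        "VP":  ["V"],
        "Det": ["the"],
        "N":   ["squirrel"],
        "V":   ["slept"],
    }

    def derive(symbols):
        # Print each step of a leftmost derivation.
        print(" ".join(symbols))
        for i, sym in enumerate(symbols):
            if sym in RULES:  # expand the leftmost nonterminal
                derive(symbols[:i] + RULES[sym] + symbols[i + 1:])
                return
        # No nonterminals remain: the derivation has terminated.

    derive(["S"])
    # S / NP VP / Det N VP / the N VP / the squirrel VP / the squirrel V
    # / the squirrel slept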

In linguistics, X-bar theory is a theory of syntactic category formation. It embodies two independent claims: one, that phrases may contain intermediate constituents projected from a head X; and two, that this system of projected constituency may be common to more than one category.

In linguistics, a verb phrase (VP) is a syntactic unit composed of at least one verb and its dependents—objects, complements and other modifiers—but not always including the subject. Thus in the sentence A fat man put the money quickly in the box, the words put the money quickly in the box are a verb phrase; it consists of the verb put and its dependents, but not the subject a fat man. A verb phrase is similar to what is considered a predicate in more traditional grammars.

Generative grammar

Generative grammar is a concept in generative linguistics, a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of structuralist theories, deriving ultimately from glossematics. Generative grammar considers grammar to be a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It differs from structural and functional models in that the object is placed into the verb phrase. This purportedly cognitive structure is thought of as being part of a universal grammar, a syntactic structure caused by a genetic mutation in humans.

In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.

In linguistics, branching refers to the shape of the parse trees that represent the structure of sentences. Assuming that the language is being written or transcribed from left to right, parse trees that grow down and to the right are right-branching, and parse trees that grow down and to the left are left-branching. The direction of branching reflects the position of heads in phrases, and in this regard, right-branching structures are head-initial, whereas left-branching structures are head-final. English has both right-branching (head-initial) and left-branching (head-final) structures, although it is more right-branching than left-branching. Some languages such as Japanese and Turkish are almost fully left-branching (head-final). Some languages are mostly right-branching (head-initial).
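Branching direction can be sketched by representing constituents as nested pairs and measuring how deeply a tree nests down each side. In the following minimal Python sketch the phrases are illustrative (the left-branching example glosses over the English possessive marker):

    # Trees are either a word (string) or a (left, right) pair.
    def depth(tree, side):
        # Count how deeply the tree nests down one side (0 for a bare word).
        if isinstance(tree, str):
            return 0
        left, right = tree
        branch = left if side == "left" else right
        return 1 + depth(branch, side)

    # Right-branching (head-initial) clausal embedding, as in English:
    rb = ("claim", ("that", ("Sam", ("said", ("that", ("Lee", "slept"))))))
    # Left-branching (head-final) possessor stacking:
    # "Lee's friend's brother's house".
    lb = ((("Lee", "friend"), "brother"), "house")

    print(depth(rb, "right"), depth(rb, "left"))  # 6 1: deep on the right
    print(depth(lb, "left"), depth(lb, "right"))  # 3 1: deep on the left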

Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. Dependency grammar differs from phrase structure grammar in that while it can identify phrases it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech or Warlpiri.

The term phrase structure grammar was originally introduced by Noam Chomsky as the term for the grammars studied previously by Emil Post and Axel Thue. Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.

In generative grammar, non-configurational languages are languages characterized by a flat phrase structure, which allows syntactically discontinuous expressions, and a relatively free word order.

In grammar, a complement is a word, phrase, or clause that is necessary to complete the meaning of a given expression. Complements are often also arguments.

The term predicate is used in one of two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other views it as just the main content verb or associated predicative expression of a clause. Thus, by the first definition the predicate of the sentence Frank likes cake is likes cake. By the second definition, the predicate of the same sentence is just the content verb likes, whereby Frank and cake are the arguments of this predicate. Differences between these two definitions can lead to confusion.

Lucien Tesnière

Lucien Tesnière was a prominent and influential French linguist. He was born in Mont-Saint-Aignan on May 13, 1893. As a maître de conférences in Strasbourg (1924), and later professor in Montpellier (1937), he published many papers and books on Slavic languages. However, his importance in the history of linguistics is based mainly on his development of an approach to the syntax of natural languages that would become known as dependency grammar. He presented his theory in his book Éléments de syntaxe structurale, published posthumously in 1959. In the book he proposes a sophisticated formalization of syntactic structures, supported by many examples from a diversity of languages. Tesnière died in Montpellier on December 6, 1954.

In linguistics, head directionality is a proposed parameter that classifies languages according to whether they are head-initial or head-final. The head is the element that determines the category of a phrase: for example, in a verb phrase, the head is a verb.

In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Tesnière (1959).

In linguistics, a small clause consists of a subject and its predicate, but lacks an overt expression of tense. Small clauses have the semantic subject-predicate characteristics of a clause, and have some, but not all, the properties of a constituent. Structural analyses of small clauses vary according to whether a flat or layered analysis is pursued. The small clause is related to the phenomena of raising-to-object, exceptional case-marking, accusativus cum infinitivo, and object control.

The term linguistic performance was used by Noam Chomsky in 1960 to describe "the actual use of language in concrete situations". It is used to describe both the production, sometimes called parole, as well as the comprehension of language. Performance is defined in opposition to "competence"; the latter describes the mental knowledge that a speaker or listener has of language.

Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, when two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may apply to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
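Following the quoted definition, Merge can be sketched as recursive set formation: combining two syntactic objects A and B yields the set {A, B}, and the output can itself be merged again. A minimal Python sketch, with illustrative lexical items as strings (frozenset is used so that outputs are hashable and can be re-merged):

    def merge(a, b):
        # Merge(A, B) = {A, B}; frozenset lets the result feed back into
        # merge, modeling the recursive property described above.
        return frozenset({a, b})

    # Build {Sam, {ate, oranges}} bottom-up:
    vp = merge("ate", "oranges")
    clause = merge("Sam", vp)
    print(clause)  # e.g. frozenset({'Sam', frozenset({'ate', 'oranges'})})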

In linguistics, subcategorization denotes the ability or need of lexical items to require or allow the presence and types of the syntactic arguments with which they co-occur. The notion of subcategorization is similar to the notion of valency, although the two concepts stem from different traditions in the study of syntax and grammar.

References

Citations

  1. "syntax". Oxford Dictionaries UK Dictionary. Oxford University Press . Retrieved 2016-01-22.
  2. "syntax". Merriam-Webster Dictionary .
  3. Fortson IV, Benjamin W. (2004). Indo-European Language and Culture: An Introduction. Blackwell. p. 186. ISBN   978-1405188968. [The Aṣṭādhyāyī] is a highly precise and thorough description of the structure of Sanskrit somewhat resembling modern generative grammar...[it] remained the most advanced linguistic analysis of any kind until the twentieth century.
  4. Arnauld, Antoine (1683). La logique (5th ed.). Paris: G. Desprez. p. 137. Nous avons emprunté...ce que nous avons dit...d'un petit Livre...sous le titre de Grammaire générale.
  5. Giorgio, Graffi (2001). 200 Years of Syntax: A Critical Survey (googlebook preview). John Benjamins Publishing. ISBN   9789027284570.
  6. See Bickerton, Derek (1990). Language and Species. University of Chicago Press. ISBN   0-226-04610-9. and, for more recent advances, Derek Bickerton; Eörs Szathmáry, eds. (2009). Biological foundations and origin of syntax. MIT Press. ISBN   978-0-262-01356-7.
  7. Ted Briscoe, 2 May 2001, Interview with Gerald Gazdar Archived 2005-11-22 at the Wayback Machine . Retrieved 2008-06-04.
  8. Song, Jae Jung (2012). Word Order. Cambridge University Press. ISBN   9781139033930.
  9. Concerning Tesnière's rejection of the binary division of the clause into subject and predicate and in favor of the verb as the root of all structure, see Tesnière (1969:103–105).
  10. Chomsky, Noam. 1957. Syntactic Structures. The Hague/Paris: Mouton, p. 15.
  11. Chomsky, Noam (1981/1993). Lectures on Government and Binding: The Pisa Lectures. Mouton de Gruyter.
  12. Chomsky, Noam (1995). The Minimalist Program. MIT Press.

Sources

  • Brown, Keith; Miller, Jim, eds. (1996). Concise Encyclopedia of Syntactic Theories. New York: Elsevier Science. ISBN 0-08-042711-1.
  • Carnie, Andrew (2006). Syntax: A Generative Introduction (2nd ed.). Oxford: Wiley-Blackwell. ISBN 1-4051-3384-8.
  • Freidin, Robert; Lasnik, Howard, eds. (2006). Syntax. Critical Concepts in Linguistics. New York: Routledge. ISBN 0-415-24672-5.
  • Graffi, Giorgio (2001). 200 Years of Syntax: A Critical Survey. Studies in the History of the Language Sciences 98. Amsterdam: Benjamins. ISBN 90-272-4587-8.
  • Talasiewicz, Mieszko (2009). Philosophy of Syntax: Foundational Topics. Springer. ISBN 978-90-481-3287-4. An interdisciplinary essay on the interplay between logic and linguistics in syntactic theories.
  • Tesnière, Lucien (1969). Éléments de syntaxe structurale (2nd ed.). Paris: Klincksieck.
