Generative grammar

Generative grammar is a linguistic theory that regards grammar as a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. Noam Chomsky first used the term in relation to the theoretical linguistics of grammar that he developed in the late 1950s. [1] Linguists who follow the generative approach have been called generativists. The generative school has focused on the study of syntax, but it has also addressed other aspects of a language's structure, including morphology and phonology.

Theoretical linguistics, or general linguistics, is the branch of linguistics which inquires into the nature of language itself and seeks to answer fundamental questions: what language is; how it works; how universal grammar (UG), as a domain-specific mental organ, operates, if it exists at all; what its unique properties are; how language relates to other cognitive processes; and so on. Theoretical linguists are most concerned with constructing models of linguistic knowledge, and ultimately with developing a linguistic theory.

In linguistics, grammar is the set of structural rules governing the composition of clauses, phrases and words in a natural language. The term refers also to the study of such rules and this field includes phonology, morphology and syntax, often complemented by phonetics, semantics and pragmatics.

Language is the development, acquisition, maintenance and use of complex systems of communication, particularly the human ability to do so; a language is any specific example of such a system.

Early versions of Chomsky's theory were called transformational grammar, a term still used to include his subsequent theories, [2] the most recent of which is the minimalist program. Chomsky and other generativists have argued that many of the properties of a generative grammar arise from a universal grammar that is innate to the human brain, rather than being learned from the environment (see the poverty of the stimulus argument).

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.

In linguistics, the minimalist program (MP) is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.

Universal grammar (UG), in linguistics, is the theory of the genetic component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that a certain set of structural rules are innate to humans, independent of sensory experience. With more linguistic stimuli received in the course of psychological development, children then adopt specific syntactic rules that conform to UG. It is sometimes known as "mental grammar", and stands contrasted with other "grammars", e.g. prescriptive, descriptive and pedagogical. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued that languages are so diverse that such universality is rare. It is a matter of empirical investigation to determine precisely what properties are universal and what linguistic capacities are innate.

There are a number of versions of generative grammar currently practiced within linguistics.

A contrasting approach is that of constraint-based grammars. Where a generative grammar attempts to list all the rules that result in all well-formed sentences, constraint-based grammars allow anything that is not otherwise constrained. Constraint-based frameworks that have been proposed include certain versions of dependency grammar, head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar. In stochastic grammar, grammatical correctness is taken as a probabilistic variable, rather than a discrete (yes or no) property.
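
To illustrate the stochastic view, the following minimal Python sketch (the rules and probability values are hypothetical, chosen only for exposition) treats each context-free production as carrying a probability, so that a derivation is scored by the product of the probabilities of the rules it uses rather than being judged simply well-formed or ill-formed.

    # Hypothetical probabilistic context-free rules; for each left-hand side,
    # the probabilities of its alternatives sum to 1.0 (illustrative numbers).
    PCFG = {
        ("S",  ("NP", "VP")): 1.0,
        ("NP", ("D", "N")):   0.7,
        ("NP", ("N",)):       0.3,
        ("VP", ("V", "NP")):  0.6,
        ("VP", ("V",)):       0.4,
    }

    def derivation_probability(rules_used):
        """Probability of a derivation = product of its rule probabilities."""
        p = 1.0
        for rule in rules_used:
            p *= PCFG[rule]
        return p

    # Score the derivation S -> NP VP, NP -> D N, VP -> V NP, NP -> D N
    # (e.g. for a sentence like "the dog ate the bone").
    print(derivation_probability([
        ("S",  ("NP", "VP")),
        ("NP", ("D", "N")),
        ("VP", ("V", "NP")),
        ("NP", ("D", "N")),
    ]))  # 1.0 * 0.7 * 0.6 * 0.7 = 0.294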

Constraint-based grammars can perhaps be best understood in contrast to generative grammars. Whereas a generative grammar lists all the transformations, merges, movements, and deletions that can result in all well-formed sentences, constraint-based grammars take the opposite approach: allowing anything that is not otherwise constrained.

Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. DGs are distinct from phrase structure grammars, since DGs lack phrasal nodes, although they acknowledge phrases. Structure is determined by the relation between a word and its dependents. Dependency structures are flatter than phrase structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech, Slovak, and Warlpiri.
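
As a minimal illustration of the dependency view (the sentence, word positions and relation labels here are chosen for exposition, not drawn from Tesnière), every word depends on exactly one head, the finite verb serves as the root, and no phrasal nodes are posited:

    # A sketch of a dependency analysis of "the dog ate the bone".
    # Words are identified by position; each non-root word has exactly one head.
    words = ["the", "dog", "ate", "the", "bone"]   # positions 0..4
    dependencies = [
        # (dependent position, head position, relation) -- labels illustrative
        (0, 1, "determiner"),   # the  -> dog
        (1, 2, "subject"),      # dog  -> ate
        (3, 4, "determiner"),   # the  -> bone
        (4, 2, "object"),       # bone -> ate
    ]
    root = 2  # "ate", the structural center of the clause

    # Print each dependency as "dependent -[relation]-> head".
    for dep, head, rel in dependencies:
        print(f"{words[dep]} -[{rel}]-> {words[head]}")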

Head-driven phrase structure grammar (HPSG) is a highly lexicalized, constraint-based grammar developed by Carl Pollard and Ivan Sag. It is a type of phrase structure grammar, as opposed to a dependency grammar, and it is the immediate successor to generalized phrase structure grammar. HPSG draws from other fields such as computer science and uses Ferdinand de Saussure's notion of the sign. It uses a uniform formalism and is organized in a modular way which makes it attractive for natural language processing.

Frameworks

There are a number of different approaches to generative grammar. Common to all is the effort to formulate a set of rules or principles that formally defines each and every member of the set of well-formed expressions of a natural language. The term generative grammar has been associated with at least the following schools of linguistics:

In neuropsychology, linguistics, and the philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation. Natural languages can take different forms, such as speech or signing. They are distinguished from constructed and formal languages such as those used to program computers or to study logic.

Government and binding is a theory of syntax and a phrase structure grammar in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s. This theory is a radical revision of his earlier theories and was later revised in The Minimalist Program (1995) and several subsequent papers, the latest being Three Factors in Language Design (2005). Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.

In linguistics, relational grammar (RG) is a syntactic theory which argues that primitive grammatical relations provide the ideal means to state syntactic rules in universal terms. Relational grammar began as an alternative to transformational grammar.

Categorial grammar is a term used for a family of formalisms in natural language syntax motivated by the principle of compositionality and organized according to the view that syntactic constituents should generally combine as functions or according to a function-argument relationship. Most versions of categorial grammar analyze sentence structure in terms of constituencies and are therefore phrase structure grammars.
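
A minimal Python sketch of this function-argument view (the category encoding and the tiny lexicon are invented here purely for illustration) assigns each word a category and derives a sentence by forward and backward application:

    # Categories are atoms ("N", "NP", "S") or functions:
    #   ("/",  result, argument)  looks for its argument to the right (X/Y),
    #   ("\\", result, argument)  looks for its argument to the left  (X\Y).
    LEXICON = {
        "the":    ("/",  "NP", "N"),   # NP/N : combines with a noun on its right
        "dog":    "N",
        "barked": ("\\", "S",  "NP"),  # S\NP : combines with an NP on its left
    }

    def combine(left, right):
        """Forward application X/Y Y => X; backward application Y X\\Y => X."""
        if isinstance(left, tuple) and left[0] == "/" and left[2] == right:
            return left[1]
        if isinstance(right, tuple) and right[0] == "\\" and right[2] == left:
            return right[1]
        return None

    np = combine(LEXICON["the"], LEXICON["dog"])   # "the dog"        -> NP
    s  = combine(np, LEXICON["barked"])            # "the dog barked" -> S
    print(np, s)                                   # NP S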

Historical development of models of transformational grammar

Although Leonard Bloomfield, whose work Chomsky rejects, saw the ancient Indian grammarian Pāṇini as an antecedent of structuralism, [3] [4] Chomsky, in an award acceptance speech delivered in India in 2001, claimed "The first generative grammar in the modern sense was Panini's grammar".

Generative grammar has been under development since the late 1950s, and has undergone many changes in the types of rules and representations that are used to predict grammaticality. In tracing the historical development of ideas within generative grammar, it is useful to refer to various stages in the development of the theory.

Standard theory (1957–1965)

The so-called standard theory corresponds to the original model of generative grammar laid out by Chomsky in 1965.

A core aspect of standard theory is the distinction between two different representations of a sentence, called deep structure and surface structure. The two representations are linked to each other by transformational rules.

Extended standard theory (1965–1973)

The so-called extended standard theory was formulated in the late 1960s and early 1970s. Its features include:

  • syntactic constraints
  • generalized phrase structures (X-bar theory)

Revised extended standard theory (1973–1976)

The so-called revised extended standard theory was formulated between 1973 and 1976.

Relational grammar (ca. 1975–1990)

An alternative model of syntax based on the idea that notions like subject, direct object, and indirect object play a primary role in grammar.

Government and binding/Principles and parameters theory (1981–1990)

This period is marked by Chomsky's Lectures on Government and Binding (1981) and Barriers (1986).

Minimalist program (1990–present)

Context-free grammars

Generative grammars can be described and compared with the aid of the Chomsky hierarchy (proposed by Chomsky in the 1950s). This sets out a series of types of formal grammars with increasing expressive power. Among the simplest types are the regular grammars (type 3); Chomsky argues that these are not adequate as models for human language, because all natural human languages allow the center-embedding of strings within strings, which regular grammars cannot capture.
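
The role of center-embedding can be sketched abstractly. Nested dependencies of the form a^n b^n (an idealization of center-embedded clauses such as "the dog the cat the rat bit chased barked") require matching an unbounded number of opening elements with the same number of closing elements. The minimal Python sketch below (an illustration, not taken from the article or its sources) shows the context-free generation rule and a recognizer that needs an unbounded counter, which is precisely the memory a finite-state (regular) device lacks.

    # Center-embedding idealized as the language a^n b^n, generated by the
    # context-free rule S -> a S b | "" (each new "a" must later be matched
    # by a corresponding "b", like nested subject ... verb dependencies).

    def generate(n):
        """Apply S -> a S b exactly n times, then S -> ""."""
        return "a" * n + "b" * n

    def recognize(s):
        """Accept exactly the strings a^n b^n; note the unbounded counter,
        which no finite-state (regular) recognizer can simulate."""
        count = 0
        i = 0
        while i < len(s) and s[i] == "a":
            count += 1
            i += 1
        while i < len(s) and s[i] == "b":
            count -= 1
            i += 1
        return i == len(s) and count == 0

    print(generate(3))           # aaabbb
    print(recognize("aaabbb"))   # True
    print(recognize("aabbb"))    # False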

At a higher level of complexity are the context-free grammars (type 2). The derivation of a sentence by such a grammar can be depicted as a derivation tree. Linguists working within generative grammar often view such trees as a primary object of study. According to this view, a sentence is not merely a string of words. Instead, adjacent words are combined into constituents, which can then be further combined with other words or constituents to create a hierarchical tree-structure.

The derivation of a simple tree-structure for the sentence "the dog ate the bone" proceeds as follows. The determiner the and noun dog combine to create the noun phrase the dog. A second noun phrase the bone is created with determiner the and noun bone. The verb ate combines with the second noun phrase, the bone, to create the verb phrase ate the bone. Finally, the first noun phrase, the dog, combines with the verb phrase, ate the bone, to complete the sentence: the dog ate the bone. The following tree diagram illustrates this derivation and the resulting structure:

[Figure: Basic english syntax tree.svg, a tree diagram of the structure derived for "the dog ate the bone"]

Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form (though the result is less easy to read); in this format the above sentence would be rendered as:
[S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]
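
The same derivation can be reproduced programmatically. The short Python sketch below (an illustration; the helper names are invented here) builds the constituents bottom-up, exactly as described above, and prints the text-form phrase marker:

    # Each constituent is a (label, children) pair; leaves are plain strings.
    def leaf(label, word):
        return (label, [word])

    def phrase(label, *children):
        return (label, list(children))

    def bracketed(node):
        """Render a tree as a labeled bracketing, e.g. [NP [D the ] [N dog ] ]."""
        label, children = node
        parts = [bracketed(c) if isinstance(c, tuple) else c for c in children]
        return "[" + label + " " + " ".join(parts) + " ]"

    # Build the tree bottom-up, following the derivation in the text.
    np_subject = phrase("NP", leaf("D", "The"), leaf("N", "dog"))
    np_object  = phrase("NP", leaf("D", "the"), leaf("N", "bone"))
    vp         = phrase("VP", leaf("V", "ate"), np_object)
    sentence   = phrase("S", np_subject, vp)

    print(bracketed(sentence))
    # [S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]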

Chomsky has argued that phrase structure grammars are also inadequate for describing natural languages, and formulated the more complex system of transformational grammar. [5]

Music

Generative grammar has been used to a limited extent in music theory and analysis since the 1980s. [6] [7] The best-known approaches were developed by Mark Steedman [8] as well as Fred Lerdahl and Ray Jackendoff, [9] who formalized and extended ideas from Schenkerian analysis. [10] More recently, such early generative approaches to music were further developed and extended by various scholars. [11] [12] [13] [14] The theory of generative grammar has also been drawn on by the Sun Ra Revival Post-Krautrock Archestra in the development of their post-structuralist lyrics, particularly in their song "Sun Ra Meets Terry Lee".[ citation needed ] The French composer Philippe Manoury applied the methods of generative grammar to the field of contemporary classical music.[ citation needed ]

Related Research Articles

In linguistics, syntax is the set of rules, principles, and processes that govern the structure of sentences in a given language, usually including word order. The term syntax is also used to refer to the study of such principles and processes. The goal of many syntacticians is to discover the syntactic rules common to all languages.

Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, being first proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.

A noun phrase or nominal phrase is a phrase that has a noun as its head or performs the same grammatical function as such a phrase. Noun phrases are very common cross-linguistically, and they may be the most frequently occurring phrase type.

A parse tree or parsing tree or derivation tree or concrete syntax tree is an ordered, rooted tree that represents the syntactic structure of a string according to some context-free grammar. The term parse tree itself is used primarily in computational linguistics; in theoretical syntax, the term syntax tree is more common.

X-bar theory is a theory of syntactic category formation. It embodies two independent claims: one, that phrases may contain intermediate constituents projected from a head X; and two, that this system of projected constituency may be common to more than one category.

Deep structure and surface structure are concepts used in linguistics, specifically in the study of syntax in the Chomskyan tradition of transformational generative grammar.

In X-bar theory in linguistics, specifiers, head words, complements and adjuncts together form phrases. Specifiers differ from complements and adjuncts because they are non-recursive: a phrase can contain only one specifier. They are not sisters of the head, but rather sisters of the phrase formed by the head and the complement or adjunct.

Ray Jackendoff is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed to both the existence of an innate universal grammar and to giving an account of language that is consistent with the current understanding of the human mind and cognition.

In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure—the number and type of noun phrases—required syntactically by a particular verb. For example, the verb put requires three arguments.

In linguistics, branching refers to the shape of the parse trees that represent the structure of sentences. Assuming that the language is being written or transcribed from left to right, parse trees that grow down and to the right are right-branching, and parse trees that grow down and to the left are left-branching. The direction of branching reflects the position of heads in phrases, and in this regard, right-branching structures are head-initial, whereas left-branching structures are head-final. English has both right-branching (head-initial) and left-branching (head-final) structures, although it is more right-branching than left-branching. Some languages such as Japanese and Turkish are almost fully left-branching (head-final). Some languages are mostly right-branching (head-initial).

The term phrase structure grammar was originally introduced by Noam Chomsky as the term for grammar studied previously by Emil Post and Axel Thue. Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.

Syntactic Structures is a major work in linguistics by American linguist Noam Chomsky. It was first published in 1957. It introduced the idea of transformational generative grammar. This approach to syntax was fully formal. At its base, this method uses phrase structure rules, which break down sentences into smaller parts. Chomsky then combined these with a new kind of rule called "transformations". This procedure gives rise to different sentence structures. Chomsky aimed to show that this limited set of rules "generates" all and only the grammatical sentences of a given language, which are unlimited in number.

Linguistic competence is the system of linguistic knowledge possessed by native speakers of a language. It is distinguished from linguistic performance, which is the way a language system is used in communication. Noam Chomsky introduced this concept in his elaboration of generative grammar, where it has been widely adopted and where competence is the only level of language that is studied.

Generative semantics is the name of a research program within linguistics, initiated by the work of various early students of Noam Chomsky: John R. Ross, Paul Postal, and later James McCawley. George Lakoff and Pieter Seuren were also instrumental in developing and advocating the theory.

The projection principle is a stipulation proposed by Noam Chomsky as part of the phrase structure component of generative-transformational grammar. The projection principle is used in the derivation of phrases under the auspices of the principles and parameters theory.

Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.

Lectures on Government and Binding: The Pisa Lectures (LGB) is a book by American linguist Noam Chomsky, published in 1981. It is based on the lectures Chomsky gave at the GLOW conference and workshop held at the Scuola Normale Superiore in Pisa, Italy, in 1979. In this book, Chomsky presented his government and binding theory of syntax. It had great influence on syntactic research in the early 1980s, especially among linguists working within the transformational grammar framework.

References

  1. "Tool Module: Chomsky's Universal Grammar". thebrain.mcgill.ca. Retrieved 2017-08-28.
  2. "Mod 4 Lesson 4.2.3 Generative-Transformational Grammar Theory". www2.leeward.hawaii.edu. Retrieved 2017-02-02.
  3. Bloomfield, Leonard, 1929, 274; cited in Rogers, David, 1987, 88
  4. Hockett, Charles, 1987, 41
  5. Chomsky, Noam (1956). "Three models for the description of language" (PDF). IRE Transactions on Information Theory. 2 (3): 113–124. doi:10.1109/TIT.1956.1056813. Archived from the original (PDF) on 2010-09-19.
  6. Baroni, M., Maguire, S., and Drabkin, W. (1983). The Concept of Musical Grammar. Music Analysis, 2:175–208.
  7. Baroni, M. and Callegari, L. (1982) Eds., Musical grammars and computer analysis. Leo S. Olschki Editore: Firenze, 201–218.
  8. Steedman, M.J. (1989). "A Generative Grammar for Jazz Chord Sequences". Music Perception. 2 (1): 52–77. doi:10.2307/40285282. JSTOR   40285282.
  9. Lerdahl, Fred; Ray Jackendoff (1996). A Generative Theory of Tonal Music. Cambridge: MIT Press. ISBN   978-0-262-62107-6.
  10. Heinrich Schenker, Free Composition (Der freie Satz), translated and edited by Ernst Oster. New York: Longman, 1979.
  11. Tojo, O. Y. & Nishida, M. (2006). Analysis of chord progression by HPSG. In Proceedings of the 24th IASTED international conference on Artificial intelligence and applications, 305–310.
  12. Rohrmeier, Martin (2007). A generative grammar approach to diatonic harmonic structure. In Spyridis, Georgaki, Kouroupetroglou, Anagnostopoulou (Eds.), Proceedings of the 4th Sound and Music Computing Conference, 97–100. http://smc07.uoa.gr/SMC07%20Proceedings/SMC07%20Paper%2015.pdf
  13. Giblin, Iain (2008). Music and the generative enterprise. Doctoral dissertation. University of New South Wales.
  14. Katz, Jonah; David Pesetsky (2009) "The Identity Thesis for Language and Music". http://ling.auf.net/lingBuzz/000959
