| Author | Noam Chomsky |
|---|---|
| Country | United States |
| Language | English |
| Subject | Linguistics |
| Published | 1964 |
| Media type | Print (hardcover and paperback) |
| Pages | 119 |
| ISBN | 978-9027907004 |
Current Issues in Linguistic Theory is a 1964 book by American linguist Noam Chomsky. It is a revised and expanded version of "The Logical Basis of Linguistic Theory", a paper Chomsky presented at the Ninth International Congress of Linguists, held in Cambridge, Massachusetts, in 1962. It is a short monograph of about a hundred pages, similar to Chomsky's earlier Syntactic Structures (1957). Chomsky presents many of its ideas in a more elaborate manner in Aspects of the Theory of Syntax (1965).
Chomsky places emphasis on the capacity of human languages to produce an unlimited number of new sentences. To him, this creativity is an essential characteristic of languages in general. Chomsky boldly proclaims that this creativity is the "central fact to which any significant linguistic theory must address itself".[1] He adds that any "theory of language that neglects this 'creative' aspect is of only marginal interest".[2] Chomsky then calls the existing structuralist linguistic theory of his time a "taxonomic" enterprise that narrowly limited itself to compiling an "inventory of elements", not an inventory of underlying rules. In doing so, this "far too oversimplified" linguistic model "seriously underestimates the richness of structure of language and the generative processes that underlie it".[3] After dismissing the existing theories, Chomsky attempts to show that his newly invented "transformational generative grammar" model is "much closer to the truth".[4]
Chomsky defines three levels of success for any linguistic theory. These are "observational adequacy" (i.e. correctly picking out the valid linguistic data that linguists must work on), "descriptive adequacy" (i.e. assigning clear structural descriptions to the elements of sentences) and "explanatory adequacy" (i.e. justifying, on a principled basis, the selection of a descriptive grammar for a language).
Chomsky states that much of modern structural linguistics in the first half of the 20th century was preoccupied with observational adequacy. He also states that descriptive adequacy could technically be achieved by a set of structural descriptions (like a computer program) that covers all linguistic data in an ad hoc manner, but for Chomsky this still gives little insight into the nature of linguistic structure. Therefore, comprehensive coverage of all data at the "observational adequacy" level, or of all structural descriptions at the "descriptive adequacy" level, would be neither worthwhile nor interesting. A successful linguistic theory must achieve the higher level of "explanatory adequacy", describing the distinctive features of a natural language as opposed to those of an arbitrary set of structural descriptions. For Chomsky, given the stage linguistics was at at the time, such depth of analysis seemed more important than ever-broadening scope.
According to the British linguist John Earl Joseph, the paper "secured [Chomsky's] international reputation in linguistics". [5]
The following provides an overview of concepts and works in linguistics related to the book:
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones. The method is commonly associated with American linguist Noam Chomsky.
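A minimal sketch in Python of the idea of "defined operations producing new sentences from existing ones": a single toy transformation, subject-auxiliary inversion, that derives a yes/no question from a declarative. The flat constituent representation and all names here are invented for illustration and are not Chomsky's formalism.

```python
from typing import List, Tuple

# Hypothetical toy representation: a sentence is a list of
# (category, words) pairs, e.g. ("NP", "the child").
Sentence = List[Tuple[str, str]]

def invert_subject_aux(sentence: Sentence) -> Sentence:
    """Derive a yes/no question from a declarative by fronting the auxiliary."""
    cats = [cat for cat, _ in sentence]
    if cats[:3] == ["NP", "AUX", "VP"]:
        np, aux, vp = sentence[0], sentence[1], sentence[2]
        # The transformation reorders existing constituents;
        # it adds and removes no lexical material.
        return [aux, np, vp] + sentence[3:]
    return sentence  # structural description not met: rule does not apply

declarative = [("NP", "the child"), ("AUX", "can"), ("VP", "read")]
print(invert_subject_aux(declarative))
# [('AUX', 'can'), ('NP', 'the child'), ('VP', 'read')]
```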
Deep structure and surface structure are concepts used in linguistics, specifically in the study of syntax in the Chomskyan tradition of transformational generative grammar.
Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving ultimately from glossematics. Generative grammar considers grammar to be a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may apply repeatedly to generate an indefinite number of sentences, which can be arbitrarily long. The difference from structural and functional models is that in generative grammar the object is base-generated within the verb phrase. This purportedly cognitive structure is thought of as being part of a universal grammar, a syntactic structure caused by a genetic mutation in humans.
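A minimal sketch of "explicit rules applying repeatedly": a toy context-free grammar whose recursive rule (S → S "and" S) licenses sentences of unbounded length. The grammar and vocabulary are invented for illustration, not drawn from any published analysis.

```python
import random

RULES = {
    "S":  [["NP", "VP"], ["S", "and", "S"]],   # recursion lives here
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["linguist"], ["grammar"], ["rule"]],
    "V":  [["studies"], ["generates"]],
}

def generate(symbol: str = "S", depth: int = 0) -> str:
    """Expand a symbol by repeatedly applying rewrite rules."""
    if symbol not in RULES:          # terminal: an actual word
        return symbol
    # Restrict to the first (non-recursive) option past a depth
    # cap so random sampling always terminates.
    options = RULES[symbol] if depth < 3 else RULES[symbol][:1]
    expansion = random.choice(options)
    return " ".join(generate(s, depth + 1) for s in expansion)

print(generate())
# e.g. "the linguist studies the grammar and the rule generates the linguist"
```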
Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that are set one way or the other for particular languages. For example, whether a language is head-initial or head-final is regarded as such a binary parameter, determining the position of heads in phrases. Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
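A minimal sketch of the head-directionality example: one general principle (a phrase combines a head with its complement) and one binary parameter (head-initial vs. head-final) set per language. The per-language settings below are illustrative simplifications.

```python
def build_phrase(head: str, complement: str, head_initial: bool) -> str:
    """One principle, parameterized by the linear order of head and complement."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

# Illustrative parameter settings, not a full typological claim.
HEAD_INITIAL = {"English": True, "Japanese": False}

# English verb phrase: the verb precedes its object; Japanese: it follows.
print(build_phrase("read", "the book", HEAD_INITIAL["English"]))   # read the book
print(build_phrase("yomu", "hon o", HEAD_INITIAL["Japanese"]))     # hon o yomu
```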
Syntactic Structures is an influential work in linguistics by American linguist Noam Chomsky, originally published in 1957. It is an elaboration of his teacher Zellig Harris's model of transformational generative grammar. A short monograph of about a hundred pages, Chomsky's presentation is recognized as one of the most significant studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning. Thus, Chomsky argued for the independence of syntax from semantics.
In linguistics, linguistic competence is the system of unconscious knowledge that one has of a language when one knows it. It is distinguished from linguistic performance, which includes all the other factors that allow one to use one's language in practice.
Poverty of the stimulus (POS) is the controversial argument from linguistics that children are not exposed to rich enough data within their linguistic environments to acquire every feature of their language. This is considered evidence contrary to the empiricist idea that language is learned solely through experience. The claim is that the sentences children hear while learning a language do not contain the information needed to develop a thorough understanding of the grammar of the language.
The term Cartesian linguistics was coined by Noam Chomsky in his book Cartesian Linguistics: A Chapter in the History of Rationalist Thought (1966). The adjective "Cartesian" pertains to René Descartes, a prominent 17th-century philosopher. As well as Descartes, Chomsky surveys other examples of rationalist thought in 17th-century linguistics, in particular the Port-Royal Grammar (1660), which foreshadows some of his own ideas concerning universal grammar.
In his work Aspects of the Theory of Syntax (1965), Noam Chomsky introduces a hierarchy of levels of adequacy for evaluating grammars and metagrammars.
Langue and parole is a theoretical linguistic dichotomy distinguished by Ferdinand de Saussure in his Course in General Linguistics.
The linguistics wars were a protracted academic dispute inside American theoretical linguistics that took place mostly in the 1960s and 1970s, stemming from an intellectual falling-out between Noam Chomsky and some of his early colleagues and doctoral students. The debate began in 1967, when the linguists Paul Postal, "Haj" Ross, George Lakoff, and James McCawley, self-dubbed the "Four Horsemen of the Apocalypse", proposed an approach to the relationship between syntax and semantics which treated deep structures as meanings rather than syntactic objects. While Chomsky and other generative grammarians argued that the meaning of a sentence was derived from its syntax, the generative semanticists argued that syntax was derived from meaning.
Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, by which two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may apply to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
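A minimal sketch of Merge as set formation G = {A, B}, following the definition quoted above: the operation applies both to lexical items and to its own previous outputs. The lexical items are illustrative; frozenset is used only so that merged objects are hashable and can themselves be merged.

```python
def merge(a, b):
    """Combine two syntactic objects into the new object {a, b}."""
    return frozenset([a, b])

# Merge applies to lexical items...
vp = merge("read", "books")            # {read, books}
# ...and, recursively, to objects it has already built.
tp = merge("will", vp)                 # {will, {read, books}}
s  = merge("Mary", tp)                 # {Mary, {will, {read, books}}}
print(s)
```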
Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.
Lectures on Government and Binding: The Pisa Lectures (LGB) is a book by the linguist Noam Chomsky, published in 1981. It is based on the lectures Chomsky gave at the GLOW conference and workshop held at the Scuola Normale Superiore in Pisa, Italy in 1979. In this book, Chomsky presented his government and binding theory of syntax. It had great influence on syntactic research in the early 1980s, especially among linguists working within the transformational grammar framework.
In linguistics, the term formalism is used with a variety of meanings that relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, as well as the likes of Functional Discourse Grammar, which builds on predicate logic.
In linguistics, transformational syntax is a derivational approach to syntax that developed from the extended standard theory of generative grammar originally proposed by Noam Chomsky in his books Syntactic Structures and Aspects of the Theory of Syntax. It emerged from a need to improve on approaches to grammar in structural linguistics.
Integrational Linguistics (IL) is a general approach to linguistics that has been developed by the German linguist Hans-Heinrich Lieb and others since the late 1960s. The term "Integrational Linguistics" as a name for this approach has been used in publications since 1977 and antedates the use of the same term for integrationism, an unrelated approach developed by Roy Harris. Integrational Linguistics continues to be developed by an open group of linguists from various countries.
Distributionalism was a general theory of language and a discovery procedure for establishing elements and structures of language based on observed usage. It can be seen as an elaboration of structuralism but takes a more computational approach. Originally applied mostly to understanding phonological processes and phonotactics, distributional methods were also applied to work on lexical semantics and provide the basis for the distributional hypothesis for meaning. Current computational approaches that learn the semantics of words from text in the form of word embeddings using machine learning are based on distributional theory.
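A minimal sketch of the distributional hypothesis: words that occur in similar contexts receive similar representations. Here each word is represented by counts of its immediate neighbours in a tiny invented corpus; real word embeddings are learned from vastly larger data, but rest on the same idea.

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count each word's neighbours within a one-word window on each side.
contexts = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            contexts[word][corpus[j]] += 1

# "cat" and "dog" share the contexts {the, sat}, so their count vectors
# overlap: the distributional basis for treating them as similar.
print(contexts["cat"], contexts["dog"])
```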