Levels of adequacy

In his work Aspects of the Theory of Syntax (1965), Noam Chomsky introduces a hierarchy of levels of adequacy for evaluating grammars (theories of specific languages) and metagrammars (theories of grammars).

These levels constitute a taxonomy of theories (a grammar of a natural language being an example of such a theory) according to their potency. This taxonomy might be extended to scientific theories in general, and from there even stretched into the field of the aesthetics of art. [1] The present article's use of the phrase as a terminus technicus should not be confused with its everyday uses.

Motivation

The "potency" criterion alluded to in the preceding section is somewhat ill-defined, but may include "exhaustiveness", "effectiveness", and an affective component as well. (Arguably, the taxonomy is also motivated by considerations of "elegance"; this should not be confused with the application of the taxonomy in the field of aesthetics.) As a metatheory, or "theory of theories", it becomes a concept of epistemology in the philosophy of science, rather than a mere tool or methodology of scientific linguistics. As Chomsky put it in an earlier work:

The theory of linguistic structure must be distinguished clearly from a manual of helpful procedures for the discovery of grammars. [2]

The levels

  1. Observational adequacy
    • The theory achieves an exhaustive and discrete enumeration of the data points.
    • There is a pigeonhole for each observation.
  2. Descriptive adequacy
    • The theory formally specifies rules accounting for all observed arrangements of the data.
    • The rules produce all and only the well-formed constructs (relations) of the protocol space.

    ...the grammar gives a correct account of the linguistic intuition of the native speaker, and specifies the observed data (in particular) in terms of significant generalizations that express underlying regularities in the language. [3]

  3. Explanatory adequacy
    • The theory provides a principled choice between competing descriptions.
    • It addresses the deepest level of underlying structure.
    • It has predictive power.

    A linguistic theory that aims for explanatory adequacy is concerned with the internal structure of the device [i.e. grammar]; that is, it aims to provide a principled basis, independent of any particular language, for the selection of the descriptively adequate grammar of each language. [4]

Theories which do not achieve the third level of adequacy are said to "account for the observations", rather than to "explain the observations."
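
To make the contrast between the levels concrete, consider a rough, editor-supplied illustration (not an example from Chomsky's text): a toy "language" whose well-formed strings have the shape a^n b^n. An observationally adequate account merely lists the attested strings, one pigeonhole per observation, whereas a descriptively adequate grammar states an explicit rule that generates all and only the well-formed strings, including those not yet observed. The Python sketch below is purely illustrative; the corpus and function names are invented for the example.

```python
# Illustrative sketch only: a toy "language" whose well-formed strings are
# a^n b^n (n >= 1), standing in for the grammatical sentences of a language.

# Observational adequacy: an exhaustive, discrete enumeration of the data
# actually attested, one "pigeonhole" per observation.
attested_corpus = {"ab", "aabb", "aaabbb"}

def observationally_adequate(string: str) -> bool:
    """Accept exactly the attested data points and nothing else."""
    return string in attested_corpus

# Descriptive adequacy: an explicit rule (S -> a S b | a b) that produces all
# and only the well-formed strings, capturing the underlying regularity, so
# that well-formed but unattested strings are also accepted.
def descriptively_adequate(string: str) -> bool:
    """Accept a^n b^n for any n >= 1, per the rule S -> a S b | a b."""
    n = len(string) // 2
    return n >= 1 and string == "a" * n + "b" * n

if __name__ == "__main__":
    for s in ["ab", "aaabbb", "aaaabbbb", "abab"]:
        print(s, observationally_adequate(s), descriptively_adequate(s))
    # "aaaabbbb" is well formed but unattested: only the rule-based grammar
    # accepts it; "abab" is ill formed and both reject it.
```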

The second and third levels include the assumption of Ockhamist parsimony. This is related to the Minimalist requirement, [5] which is elaborated as a corollary of the levels, but which is actually employed as an axiom.

Precursors in the philosophy of science

It has been suggested that the system of levels proposed by Chomsky in Aspects of the Theory of Syntax has its antecedents in the works of Descartes, Kant, Carnap, Quine, and others. Certainly the criterion of adequacy found in rationalism, specifically rational empiricism, bears some resemblance to Chomsky's formulation.

One of the key issues Chomsky treats in Aspects is the supposition of a congenital endowment of the language faculty in humans, so the topic ramifies into questions of innateness and a priori knowledge, since it is by reference to those questions that the third level of adequacy is to be sought.

Note

This concept should not be confused with the "causal adequacy principle", which Descartes invokes in his arguments for the existence of God in the Meditations on First Philosophy.

Bibliography

  • Chomsky, Noam (1957). Syntactic Structures. The Hague: Mouton.
  • Chomsky, Noam (1964). Current Issues in Linguistic Theory. The Hague: Mouton.
  • Chomsky, Noam (1965). Aspects of the Theory of Syntax. Cambridge, MA: MIT Press.
  • Chomsky, Noam (1995). The Minimalist Program. Cambridge, MA: MIT Press.

Related Research Articles

Syntax: System responsible for combining morphemes into complex structures

In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.

Universal grammar: Theory of the biological component of the language faculty

Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued languages are so diverse that such universality is rare, and the theory of universal grammar remains controversial among linguists.

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.

In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation that was first proposed by Noam Chomsky in 1970, reformulating the ideas of Zellig Harris (1951), and further developed by Ray Jackendoff, along the lines of the theory of generative grammar put forth in the 1950s by Chomsky. It attempts to capture the structure of phrasal categories with a single uniform structure called the X-bar schema, based on the assumption that any phrase in natural language is an XP that is headed by a given syntactic category X. It played a significant role in resolving problems with phrase structure rules, most notably the proliferation of grammatical rules, which runs against the thesis of generative grammar.
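
As an informal, editor-supplied illustration of the uniform schema just described (assuming the common formulation in which XP consists of an optional specifier and an X-bar level, which in turn consists of the head X and an optional complement), the following sketch models any phrase as a projection of its head. The class and attribute names are invented for the example and are not part of X-bar theory itself.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of the X-bar schema: every phrase XP is a projection of
# a head of category X, with an optional specifier and an optional complement:
#   XP -> (specifier) X'
#   X' -> X (complement)

@dataclass
class Head:
    category: str  # e.g. "N", "V", "P"
    word: str

@dataclass
class XP:
    head: Head
    specifier: Optional["XP"] = None
    complement: Optional["XP"] = None

    @property
    def label(self) -> str:
        # The phrase inherits its category from its head: a V projects a VP.
        return self.head.category + "P"

# "read the book": a VP headed by V "read", taking an NP complement.
np = XP(head=Head("N", "book"), specifier=XP(head=Head("D", "the")))
vp = XP(head=Head("V", "read"), complement=np)
print(vp.label)             # VP
print(vp.complement.label)  # NP
```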

Deep structure and surface structure are concepts used in linguistics, specifically in the study of syntax in the Chomskyan tradition of transformational generative grammar.

Government and binding is a theory of syntax and a phrase structure grammar in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s. This theory is a radical revision of his earlier theories and was later revised in The Minimalist Program (1995) and several subsequent papers, the latest being Three Factors in Language Design (2005). Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.

Generative grammar: Theory in linguistics

Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving from logical syntax and glossematics. Generative grammar considers grammar as a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may apply repeatedly to generate an indefinite number of sentences which can be as long as one wants them to be. The difference from structural and functional models is that the object is base-generated within the verb phrase in generative grammar. This purportedly cognitive structure is thought of as being a part of a universal grammar, a syntactic structure which is caused by a genetic mutation in humans.

In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.

Conceptual semantics is a framework for semantic analysis developed mainly by Ray Jackendoff in 1976. Its aim is to provide a characterization of the conceptual elements by which a person understands words and sentences, and thus to provide an explanatory semantic representation. Explanatory in this sense refers to the ability of a given linguistic theory to describe how a component of language is acquired by a child.

Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that for particular languages are either turned on or off. For example, the position of heads in phrases is determined by a parameter: whether a language is head-initial or head-final is a setting that is fixed one way or the other for each particular language. Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
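
As a purely illustrative sketch (the function and parameter names below are the editor's own, not part of the framework), the head-directionality parameter mentioned above can be pictured as a single switch, set once per language, that fixes the linear order of a head and its complement.

```python
# Illustrative sketch: the head-directionality parameter as a binary switch
# that is set once per language and then fixes head-complement order globally.

def order_phrase(head: str, complement: str, head_initial: bool) -> list[str]:
    """Linearize a head and its complement under one parameter setting."""
    return [head, complement] if head_initial else [complement, head]

# English is head-initial: the verb precedes its object.
print(order_phrase("read", "the book", head_initial=True))   # ['read', 'the book']

# Japanese is head-final: the verb follows its object.
print(order_phrase("yonda", "hon o", head_initial=False))    # ['hon o', 'yonda']
```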

Syntactic Structures: Book by Noam Chomsky

Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax from semantics.

In the field of linguistics, specifically in syntax, phonetic form (PF), also known as phonological form or the articulatory-perceptual (A-P) system, is a certain level of mental representation of a linguistic expression, derived from surface structure, and related to Logical Form. Phonetic form is the level of representation wherein expressions, or sentences, are assigned a phonetic representation, which is then pronounced by the speaker. Phonetic form takes surface structure as its input, and outputs an audible, pronounced sentence.

Cartesian linguistics

The term Cartesian linguistics was coined by Noam Chomsky in his book Cartesian Linguistics: A Chapter in the History of Rationalist Thought (1966). The adjective "Cartesian" pertains to René Descartes, a prominent 17th-century philosopher. As well as Descartes, Chomsky surveys other examples of rationalist thought in 17th-century linguistics, in particular the Port-Royal Grammar (1660), which foreshadows some of his own ideas concerning universal grammar.

Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, whereby two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may apply to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
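
The set-forming definition quoted above can be sketched in a few illustrative lines (an editor-supplied toy, not an implementation from the literature): Merge builds the two-membered set {A, B}, and because its output is itself a syntactic object, it can re-apply to what it has built.

```python
# Illustrative sketch of Merge as quoted above: Merge(A, B) = {A, B}.
# frozensets are used so that the output of one Merge is hashable and can
# itself be merged, showing the recursive property.

def merge(a, b) -> frozenset:
    """Combine two syntactic objects into the new object {a, b}."""
    return frozenset({a, b})

# Lexical items as atomic syntactic objects.
the, book, read = "the", "book", "read"

dp = merge(the, book)   # {the, book}
vp = merge(read, dp)    # {read, {the, book}}: Merge applying to its own output
print(vp)
```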

Aspects of the Theory of Syntax

Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.

Lectures on Government and Binding

Lectures on Government and Binding: The Pisa Lectures (LGB) is a book by the linguist Noam Chomsky, published in 1981. It is based on the lectures Chomsky gave at the GLOW conference and workshop held at the Scuola Normale Superiore in Pisa, Italy, in 1979. In this book, Chomsky presented his government and binding theory of syntax. It had great influence on syntactic research in the early 1980s, especially among linguists working within the transformational grammar framework.

Formalism (linguistics): Concept in linguistics

In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar which are especially designed to produce grammatically correct strings of words; or the likes of Functional Discourse Grammar which builds on predicate logic.

In linguistics, transformational syntax is a derivational approach to syntax that developed from the extended standard theory of generative grammar originally proposed by Noam Chomsky in his books Syntactic Structures and Aspects of the Theory of Syntax. It emerged from a need to improve on approaches to grammar in structural linguistics.

Current Issues in Linguistic Theory: 1964 book by Noam Chomsky

Current Issues in Linguistic Theory is a 1964 book by American linguist Noam Chomsky. It is a revised and expanded version of "The Logical Basis of Linguistic Theory", a paper that Chomsky presented at the Ninth International Congress of Linguists held in Cambridge, Massachusetts, in 1962. It is a short monograph of about a hundred pages, similar to Chomsky's earlier Syntactic Structures (1957). In Aspects of the Theory of Syntax (1965), Chomsky presents many of its ideas in a more elaborate manner.

In formal syntax, a node is a point in a tree diagram or syntactic tree that can be assigned a syntactic category label.

References

  1. An example of the application of the levels to aesthetics may be found in a discussion archived 2006-09-27 at archive.today, accessed 2006-04-19.
  2. Chomsky 1957, p. 106.
  3. Chomsky 1964, p. 63.
  4. Chomsky 1964, p. 63.
  5. Chomsky 1995.