Autonomy of syntax

In linguistics, the autonomy of syntax is the assumption that syntax is arbitrary and self-contained with respect to meaning, semantics, pragmatics, discourse function, and other factors external to language.[1] The autonomy of syntax is advocated by linguistic formalists, and in particular by generative linguistics, whose approaches have hence been called autonomist linguistics.

The autonomy of syntax is at the center of the debates between formalist and functionalist linguistics,[1][2][3] and since the 1980s research on the syntax–semantics interface has been conducted within functionalist approaches, aimed at finding instances of semantically determined syntactic structures that would disprove the formalist argument for the autonomy of syntax.[4]

The principle of iconicity is contrasted, in some scenarios, with that of the autonomy of syntax. The weaker version of the argument for the autonomy of syntax (or for the autonomy of grammar) includes only the principle of arbitrariness, while the stronger version also includes the claim of self-containedness.[1] The principle of arbitrariness is in fact accepted by most functionalist linguists, and the real dispute between functionalists and generativists is over the claim of self-containedness of grammar or syntax.[5]

History

The assumption of the autonomy of syntax can be traced back to the neglect of the study of semantics by American structuralists such as Leonard Bloomfield and Zellig Harris in the 1940s. Their position was based on a neo-positivist, anti-psychologist stance: since it is presumably impossible to study how the brain works, linguists should ignore all cognitive and psychological aspects of language and focus on the only objective data, namely how language appears in its exterior form. This paralleled the distinction between the two approaches in psychology of the time: behaviorism, the dominant approach up until the 1940s, and cognitivism.

Over the decades, multiple cases have been found in which syntactic structures are actually determined or influenced by semantic traits, and some formalists and generativists have reacted by shrinking the portion of syntax that they consider autonomous.[1] Likewise, across the successive revisions Noam Chomsky has made to his generative formulation, there has been a shift from a claim for the autonomy of syntax to one for the autonomy of grammar.[1]

Functionalist linguistics vs. formalist linguistics

The assumption of the autonomy of syntax has been a highly controversial topic between functionalist and formalist linguists. Linguistic functionalists argue that semantics plays a role in syntax, while linguistic formalists maintain that, although syntax and semantics interact, neither is determined by the other.[1][2][3] A common example used by formalists to illustrate the autonomy of syntax is "Colorless green ideas sleep furiously", which demonstrates that a sentence need not be coherent or meaningful in any way in order to be syntactically well-formed.[6]
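The point can be made concrete with a toy recognizer. The sketch below is purely illustrative: the rules and vocabulary are invented for the example and stand in for no particular formalism. It accepts Chomsky's sentence on formal grounds alone while rejecting the same words in scrambled order.

```python
# A toy context-free grammar: well-formedness is checked against
# form alone, with no reference to meaning. Rules and vocabulary
# are invented for this illustration.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Adj", "NP"], ["N"]],
    "VP": [["V", "Adv"]],
}
LEXICON = {
    "colorless": "Adj", "green": "Adj",
    "ideas": "N", "sleep": "V", "furiously": "Adv",
}

def derives(symbol, words):
    """Return True if `symbol` can derive exactly the sequence `words`."""
    if symbol not in RULES:
        # Preterminal: must cover exactly one word of the right category.
        return len(words) == 1 and LEXICON.get(words[0]) == symbol
    for rhs in RULES[symbol]:
        if len(rhs) == 1 and derives(rhs[0], words):
            return True
        if len(rhs) == 2 and any(
            derives(rhs[0], words[:i]) and derives(rhs[1], words[i:])
            for i in range(1, len(words))
        ):
            return True
    return False

print(derives("S", "colorless green ideas sleep furiously".split()))  # True
print(derives("S", "furiously sleep ideas green colorless".split()))  # False
```

The recognizer consults only categories and rewrite rules; no notion of meaning enters the computation, which is exactly the property the example is meant to exhibit.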

Various grammar models have been developed both supporting and rejecting the autonomy of syntax. The main grammatical model in support of the autonomy of syntax is generative grammar, created by Noam Chomsky. Examples of models that argue against it include Construction Grammar, Head-Driven Phrase Structure Grammar, and Generalized Phrase Structure Grammar.[7]

Notes and references

  1. Croft, William (1995). "Autonomy and Functionalist Linguistics". Language. 71 (3): 490–532.
  2. Butler, Christopher S. & Gonzálvez-García, Francisco (2014). Exploring Functional-Cognitive Space (Vol. 157). John Benjamins Publishing Company, Introduction, pp. 6–17.
  3. Van Valin, Robert D. Jr. (2003). "Functional linguistics". Ch. 13 in The Handbook of Linguistics, pp. 319–336.
  4. Levin, Beth & Rappaport Hovav, Malka (1995). Unaccusativity: At the Syntax–Lexical Semantics Interface. Cambridge, MA: MIT Press.
  5. Croft (1995), pp. 509–510.
  6. "Generative Grammar: Theory, Types & Examples | Vaia". Hello Vaia. Retrieved 2023-09-15.
  7. Croft, William (2004). "Syntactic theories and syntactic methodology: a reply to Seuren". Journal of Linguistics. 40 (3): 637–654. doi:10.1017/s0022226704002798. ISSN 0022-2267.


Related Research Articles

Functional linguistics

Functional linguistics is an approach to the study of language characterized by systematically taking into account the speaker's and the hearer's side, and the communicative needs of the speaker and of the given language community. Linguistic functionalism emerged in the 1920s and 1930s from Ferdinand de Saussure's systematic structuralist approach to language (1916).

Syntax

In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.

In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation that was first proposed by Noam Chomsky in 1970, reformulating the ideas of Zellig Harris (1951), and further developed by Ray Jackendoff, along the lines of the theory of generative grammar put forth in the 1950s by Chomsky. It attempts to capture the structure of phrasal categories with a single uniform structure called the X-bar schema, based on the assumption that any phrase in natural language is an XP that is headed by a given syntactic category X. It played a significant role in resolving problems with phrase structure rules, most notably the proliferation of grammatical rules, which runs against the thesis of generative grammar.
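As an informal illustration, the uniform schema lends itself to a single recursive data type. The sketch below is a simplified rendering; the field names, the example phrase, and the choice to place the determiner in the specifier position follow one classical textbook presentation rather than any specific formulation of the theory.

```python
# One template for every phrase, after the X-bar schema:
# XP -> (Specifier) X'; X' -> X (Complement). Illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class XBar:
    category: str                  # the head's category X (N, V, P, ...)
    head: str                      # the head word itself
    complement: Optional["XBar"] = None
    specifier: Optional["XBar"] = None

# "the student of physics": an N-headed phrase whose head takes a
# PP complement, with the determiner in the specifier slot.
pp = XBar("P", "of", complement=XBar("N", "physics"))
np = XBar("N", "student", complement=pp, specifier=XBar("D", "the"))
```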

Generative grammar

Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving from logical syntax and glossematics. Generative grammar considers grammar as a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may apply repeatedly to generate an indefinite number of sentences which can be as long as one wants them to be. The difference from structural and functional models is that the object is base-generated within the verb phrase in generative grammar. This purportedly cognitive structure is thought of as being a part of a universal grammar, a syntactic structure which is caused by a genetic mutation in humans.
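The phrase "apply repeatedly" can be given a deliberately minimal illustration. In the sketch below, which is a toy rather than an actual generative analysis, a single recursive rewrite yields sentences of any desired length:

```python
# A single recursive rule, S -> "Mary thinks that" S, applied any
# number of times to a base sentence. Vocabulary is invented for
# this illustration.
def sentence(embeddings: int) -> str:
    s = "the dog barks"
    for _ in range(embeddings):
        s = "Mary thinks that " + s
    return s

for n in range(3):
    print(sentence(n))
# the dog barks
# Mary thinks that the dog barks
# Mary thinks that Mary thinks that the dog barks
```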

In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky.

Theta roles are the names of the participant roles associated with a predicate: the predicate may be a verb, an adjective, a preposition, or a noun. If an object is in motion or in a steady state as the speaker perceives it, or if it is the topic of discussion, it is called a theme. The participant is usually said to be an argument of the predicate. In generative grammar, a theta role or θ-role is the formal device for representing syntactic argument structure, that is, the number and type of noun phrases required syntactically by a particular verb. For example, the verb put requires three arguments.

Construction grammar is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words, morphemes, fixed expressions and idioms, and abstract grammatical rules such as the passive voice or the ditransitive. Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form.

Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that are either turned on or off for particular languages. For example, whether a language is head-initial or head-final is regarded as a parameter that is set one way or the other for that language. Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
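Schematically, such a parameter amounts to a binary switch fixing the relative order of a head and its complement. The following sketch is an informal rendering, with the example words chosen purely for illustration:

```python
# A head-direction parameter as a binary switch: the same head and
# complement linearize differently depending on its setting.
def linearize(head: str, complement: str, head_initial: bool) -> list:
    """Order a head and its complement according to the parameter."""
    return [head, complement] if head_initial else [complement, head]

# English is head-initial: the verb precedes its object.
print(linearize("read", "the book", head_initial=True))   # ['read', 'the book']
# Japanese is head-final: the verb follows its object.
print(linearize("yonda", "hon o", head_initial=False))    # ['hon o', 'yonda']
```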

Syntactic Structures

Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax from semantics.

Charles J. Fillmore

Charles J. Fillmore was an American linguist and Professor of Linguistics at the University of California, Berkeley. He received his Ph.D. in Linguistics from the University of Michigan in 1961. Fillmore spent ten years at Ohio State University and a year as a Fellow at the Center for Advanced Study in the Behavioral Sciences at Stanford University before joining Berkeley's Department of Linguistics in 1971. Fillmore was extremely influential in the areas of syntax and lexical semantics.

In linguistics, linguistic competence is the system of unconscious knowledge that speakers have of their language. It is distinguished from linguistic performance, which includes all other factors that allow one to use one's language in practice.

Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".

In certain theories of linguistics, thematic relations, also known as semantic roles, are the various roles that a noun phrase may play with respect to the action or state described by a governing verb, commonly the sentence's main verb. For example, in the sentence "Susan ate an apple", Susan is the doer of the eating, so she is an agent; an apple is the item that is eaten, so it is a patient.

The linguistics wars were a protracted academic dispute inside American theoretical linguistics that took place mostly in the 1960s and 1970s, stemming from an intellectual falling out between Noam Chomsky and some of his early colleagues and doctoral students. The debate began in 1967, when linguists Paul Postal, "Haj" Ross, George Lakoff, and James McCawley—self-dubbed the "Four Horsemen of the Apocalypse" —proposed an approach to the relationship between syntax and semantics, which treated deep structures as meanings rather than syntactic objects. While Chomsky and other generative grammarians argued that the meaning of a sentence was derived from its syntax, the generative semanticists argued that syntax was derived from meaning.

Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, by which two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may apply to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
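Chomsky's characterization translates almost directly into code. The sketch below, with invented lexical items, renders Merge as unordered set formation and shows it applying to its own output:

```python
# Merge as set formation: two syntactic objects A and B combine
# into the new object G = {A, B}; outputs can be merged again.
def merge(a, b):
    return frozenset({a, b})

# Lexical items are the atoms; Merge is recursive over its output.
vp = merge("ate", "apples")   # {ate, apples}
tp = merge("Susan", vp)       # {Susan, {ate, apples}}
print(tp)  # a nested unordered set (element order in the printout may vary)
```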

Aspects of the Theory of Syntax

Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.

Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.
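As a toy illustration of this compositionality, the sketch below computes the meaning of a sentence by applying the meaning of one part to the other; the two-entity domain and the denotations assigned to the words are invented stand-ins for a real semantic fragment:

```python
# Meanings as functions, composition as function application.
DOMAIN = {"John", "Mary"}

smokes = lambda x: x in {"John"}                 # [[smokes]]: entity -> truth value
everyone = lambda p: all(p(x) for x in DOMAIN)   # a generalized quantifier

print(smokes("John"))    # True:  [[John smokes]]
print(everyone(smokes))  # False: [[everyone smokes]], since Mary does not smoke
```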

Formalism (linguistics)

In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, or the likes of Functional Discourse Grammar, which builds on predicate logic.

In linguistics, the syntax–semantics interface is the interaction between syntax and semantics. Its study encompasses phenomena that pertain to both syntax and semantics, with the goal of explaining correlations between form and meaning. Specific topics include scope, binding, and lexical semantic properties such as verbal aspect and nominal individuation, semantic macroroles, and unaccusativity.