In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since the early 1990s, starting with a 1993 paper by Noam Chomsky. [1]
Following Imre Lakatos's distinction, Chomsky presents minimalism as a program, understood as a mode of inquiry that provides a conceptual framework which guides the development of linguistic theory. As such, it is characterized by a broad and diverse range of research directions. For Chomsky, there are two basic minimalist questions—What is language? and Why does it have the properties it has?—but the answers to these two questions can be framed in any theory. [2]
Minimalism is an approach developed with the goal of understanding the nature of language. It models a speaker's knowledge of language as a computational system with one basic operation, namely Merge. Merge combines expressions taken from the lexicon in a successive fashion to generate representations that characterize I-Language, understood to be the internalized intensional knowledge state as represented in individual speakers. By hypothesis, I-language—also called universal grammar—corresponds to the initial state of the human language faculty in individual human development.
Minimalism is reductive in that it aims to identify which aspects of human language—as well as the computational system that underlies it—are conceptually necessary. This is sometimes framed as questions relating to perfect design (Is the design of human language perfect?) and optimal computation (Is the computational system for human language optimal?). [2] According to Chomsky, a human natural language is not optimal when judged based on how it functions, since it often contains ambiguities, garden paths, etc. However, it may be optimal for interaction with the systems that are internal to the mind. [3]
Such questions are informed by a set of background assumptions, some of which date back to the earliest stages of generative grammar: [4]
Minimalism develops the idea that human language ability is optimal in its design and exquisite in its organization, and that its inner workings conform to a very simple computation. On this view, universal grammar instantiates a perfect design in the sense that it contains only what is necessary. Minimalism further develops the notion of economy, which came to the fore in the early 1990s, though it was still peripheral to transformational grammar. Economy of derivation requires that movements (i.e., transformations) occur only if necessary, and specifically to satisfy feature-checking, whereby an interpretable feature is matched with a corresponding uninterpretable feature. (See discussion of feature-checking below.) Economy of representation requires that grammatical structures exist for a purpose. The structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.
Within minimalism, economy—recast in terms of the strong minimalist thesis (SMT)—has acquired increased importance. [6] The 2016 book entitled Why Only Us—co-authored by Noam Chomsky and Robert Berwick—defines the strong minimalist thesis as follows:
The optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT).
— Why Only Us? MIT Press. 2016. Page 94.
Under the strong minimalist thesis, language is a product of inherited traits as developmentally enhanced through intersubjective communication and social exposure to individual languages (amongst other things). This reduces to a minimum the "innate" component of the language faculty (the genetically inherited component), which has been criticized over many decades, and keeps it separate from the developmental psychology component.
Intrinsic to the syntactic model (e.g. the Y/T-model) is the fact that social and other factors play no role in the computation that takes place in narrow syntax; what Chomsky, Hauser and Fitch refer to as faculty of language in the narrow sense (FLN), as distinct from faculty of language in the broad sense (FLB). Thus, narrow syntax only concerns itself with interface requirements, also called legibility conditions. SMT can be restated as follows: syntax, narrowly defined, is a product of the requirements of the interfaces and nothing else. This is what is meant by "Language is an optimal solution to legibility conditions" (Chomsky 2001:96).
Interface requirements force deletion of features that are uninterpretable at a particular interface, a necessary consequence of Full Interpretation. A PF object must only consist of features that are interpretable at the articulatory-perceptual (A-P) interface; likewise a LF object must consist of features that are interpretable at the conceptual-intentional (C-I) interface. The presence of an uninterpretable feature at either interface will cause the derivation to crash.
Narrow syntax proceeds as a set of operations—Merge, Move and Agree—carried out upon a numeration (a selection of features, words etc., from the lexicon) with the sole aim of removing all uninterpretable features before being sent via Spell-Out to the A-P and C-I interfaces. The result of these operations is a hierarchical syntactic structure that captures the relationships between the component features.
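The flow just described—operations applied to a numeration with the aim of eliminating uninterpretable features before Spell-Out—can be sketched in a toy Python model. This is an illustrative assumption of the author of this sketch, not a formalism from the minimalist literature; the "u"-prefix feature encoding and all names are invented for exposition.

```python
# Toy sketch of Full Interpretation: a derivation converges only if
# every uninterpretable feature (prefixed "u") has been checked against
# a matching interpretable feature before Spell-Out.

def check_features(numeration):
    """Return the uninterpretable features that survive feature-checking:
    a uF feature is deleted iff some item supplies a matching F."""
    features = [f for item in numeration for f in item["features"]]
    interpretable = {f for f in features if not f.startswith("u")}
    return [f for f in features
            if f.startswith("u") and f[1:] not in interpretable]

def spell_out(numeration):
    """Send the derived object to the A-P and C-I interfaces; any
    unchecked uninterpretable feature causes the derivation to crash."""
    leftovers = check_features(numeration)
    if leftovers:
        return f"crash: {leftovers}"
    return "converges"

# T bears an uninterpretable phi-feature; the DP supplies a matching
# interpretable phi-feature, so the derivation converges.
print(spell_out([{"item": "T", "features": ["uphi"]},
                 {"item": "DP", "features": ["phi"]}]))   # converges
# With no matching DP, uphi survives to the interface: crash.
print(spell_out([{"item": "T", "features": ["uphi"]}]))
```

The sketch deliberately collapses Merge, Move, and Agree into a single checking pass; its only point is the convergence/crash logic of the interfaces.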
The exploration of minimalist questions has led to several radical changes in the technical apparatus of transformational generative grammatical theory. Some of the most important are: [7]
Early versions of minimalism posit two basic operations: Merge and Move. Earlier theories of grammar—as well as early minimalist analyses—treat phrasal and movement dependencies differently than current minimalist analyses. In the latter, Merge and Move are different outputs of a single operation. Merge of two syntactic objects (SOs) is called "external Merge". Move is defined as an instance of "internal Merge", and involves the re-merge of an already merged SO with another SO. [8] How Move should be formulated remains a matter of active debate, but the differences between current proposals are relatively minute.
More recent versions of minimalism recognize three operations: Merge (i.e. external Merge), Move (i.e. internal Merge), and Agree. The emergence of Agree as a basic operation is related to the mechanism which forces movement, which is mediated by feature-checking.
In its original formulation, Merge is a function that takes two objects (α and β) and merges them into an unordered set with a label, either α or β. In more recent treatments, the possibility of the derived syntactic object being un-labelled is also considered; this is called "simple Merge" (see Label section).
Merge(α,β) → {α, {α, β}} (label α) | Merge(α,β) → {β, {α, β}} (label β) | Merge(α,β) → {α, β} (simple Merge, unlabelled) |
In the version of Merge which generates a label, the label identifies the properties of the phrase. Merge will always occur between two syntactic objects: a head and a non-head. [9] For example, Merge can combine the two lexical items drink and water to generate drink water. In the Minimalist Program, the phrase is identified with a label. In the case of drink water, the label is drink since the phrase acts as a verb. This can be represented in a typical syntax tree as follows, with the name of the derived syntactic object (SO) determined either by the lexical item (LI) itself, or by the category label of the LI:
Merge(drink, water) → {drink, {drink, water}} | Merge(drinkV, waterN) → {V, {drinkV, waterN}} |
Merge can operate on already-built structures; in other words, it is a recursive operation. If Merge were not recursive, then this would predict that only two-word utterances are grammatical. (This is relevant for child language acquisition, where children are observed to go through a so-called "two-word" stage. This is discussed below in the implications section.) As illustrated in the accompanying tree structure, if a new head (here γ) is merged with a previously formed syntactic object (a phrase, here {α, {α, β} }), the function has the form Merge (γ, {α, {α, β}}) → {γ, {γ, {α, {α, β}}}}. Here, γ is the head, so the output label of the derived syntactic object is γ.
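The labeled, recursive behaviour of Merge described above can be sketched as a small Python function. This is a sketch under the assumption that a syntactic object is either a lexical item (a string) or a previously derived labeled set; the dictionary encoding is illustrative, not part of any minimalist formalism.

```python
# Minimal sketch of labeled external Merge: Merge(α, β) yields the set
# {label, {α, β}}, where the label is projected from the head. Because
# the output can itself be an input, Merge is recursive.

def label_of(so):
    """A lexical item labels itself; a derived object carries a label."""
    return so["label"] if isinstance(so, dict) else so

def merge(head, nonhead):
    """External Merge of two syntactic objects; the head projects."""
    return {"label": label_of(head), "members": (head, nonhead)}

vp = merge("drink", "water")   # {drink, {drink, water}}
# Recursion: merge a new head γ with the already-built phrase,
# Merge(γ, {drink, {drink, water}}) → {γ, {γ, {drink, {drink, water}}}}.
gp = merge("γ", vp)
print(vp["label"])  # drink
print(gp["label"])  # γ
```

If Merge could not take its own output as input, only the two-word object `vp` could be built, mirroring the point about recursion in the text.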
Chomsky's earlier work defines each lexical item as a syntactic object that is associated with both categorical features and selectional features. [10] Features—more precisely formal features—participate in feature-checking, which takes as input two expressions that share the same feature, and checks them off against each other in a certain domain. [11] In some but not all versions of minimalism, projection of selectional features proceeds via feature-checking, as required by locality of selection: [12] [13] [14]
Selection as projection: As illustrated in the bare phrase structure tree for the sentence The girl ate the food, a notable feature is the absence of distinct labels (see Labels below). Relative to Merge, the selectional features of a lexical item determine how it participates in Merge:
Feature-checking: When a feature is "checked", it is removed.
Locality of selection (LOS) is a principle that forces selectional features to participate in feature checking. LOS states that a selected element must combine with the head that selects it either as complement or specifier. Selection is local in the sense that there is a maximum distance that can occur between a head and what it selects: selection must be satisfied with the projection of the head. [12]
bare phrase structure | projection as feature-checking |
Move arises via "internal Merge".
Movement as feature-checking: The original formulation of the extended projection principle states that clauses must contain a subject in the specifier position of TP/IP. [15] In the tree above, there is an EPP feature. This is a strong feature which forces re-Merge—also called internal Merge—of the DP the girl. The EPP feature in the tree above is a subscript to the T head, indicating that T needs a subject in its specifier position. This causes the movement of <the girl> to the specifier position of T. [12]
uninterpretable EPP feature forces Move | uninterpretable case feature forces Move |
declarative clause | content question: subject | content question: object |
A substantial body of literature in the minimalist tradition focuses on how a phrase receives a proper label. [16] The debate about labeling reflects the deeper aspirations of the minimalist program, which is to remove all redundant elements in favour of the simplest analysis possible. [17] While earlier proposals focus on how to distinguish adjunction from substitution via labeling, more recent proposals attempt to eliminate labeling altogether, but they have not been universally accepted.
Adjunction and substitution: Chomsky's 1995 monograph The Minimalist Program outlines two methods of forming structure: adjunction and substitution. The standard properties of segments, categories, adjuncts, and specifiers are easily constructed. In the general form of a structured tree for adjunction and substitution, α is an adjunct to X, or α is substituted into the SPEC, X position. α can raise to target the Xmax position, building a new position that is either adjoined to [Y-X] or is SPEC, X; the position raised to is termed the 'target'. At the bottom of the tree, the minimal domain includes SPEC Y and Z, along with a new position formed by the raising of α, which is either contained within Z or is Z. [18]
Adjunction: Before the introduction of bare phrase structure, adjuncts did not alter information about the bar-level, the category, or the head of the target (the element in the adjoined structure). [19] An example of adjunction using X-bar theory notation is given below for the sentence Luna bought the purse yesterday. Observe that the adverbial modifier yesterday is sister to VP and dominated by VP. Thus, the addition of the modifier does not change information about the bar-level: in this case the maximal projection VP. In the minimalist program, adjuncts are argued to exhibit a different, perhaps more simplified, structure. Chomsky (1995) proposes that adjunction forms a two-segment object/category consisting of: (i) the head of a label; (ii) a label distinct from the head of the label. The label L is not considered a term in the structure that is formed, because it is not identical to the head S, although it is derived from it. If α adjoins to S, and S projects, then the resulting structure is L = {<H(S), H(S)>, {α, S}}, where the label is formed from the head of S. The head is what projects, so it can itself be the label or can determine the label. [18] In the account developed in bare phrase structure, the properties of the head are no longer preserved in adjunction structures: the segment of XP to which an adjunct attaches is non-maximal after adjunction, as shown in the figure below illustrating adjunction in BPS. Such an account extends to XPs that host multiple adjunctions. [19]
Adjunction in X-bar theory | Adjunction in bare phrase structure |
Substitution forms a new category consisting of a head (H), which is the label, and the element being projected. Ambiguities may arise if the raised element, in this case α, contains the entire head and the head is also XMAX. [18]
Labeling algorithm (LA): Merge is a function that takes two objects (α and β) and merges them into an unordered set with a label (either α or β), where the label indicates the kind of phrase that is built via merge. But this labeling technique is too unrestricted since the input labels make incorrect predictions about which lexical categories can merge with each other. Consequently, a different mechanism is needed to generate the correct output label for each application of Merge in order to account for how lexical categories combine; this mechanism is referred to as the labeling algorithm (LA). [20]
Recently, the suitability of a labeling algorithm has been questioned, as syntacticians have identified a number of limitations associated with what Chomsky has proposed. [21] It has been argued that two kinds of phrases pose a problem. The labeling algorithm proposes that labelling occurs via minimal search, a process where a single lexical item within a phrasal structure acts as a head and provides the label for the phrase. [22] It has been noted that minimal search cannot account for the following two possibilities: [21]
In each of these cases, there is no lexical item acting as a prominent element (i.e. a head). Given this, it is not possible through minimal search to extract a label for the phrase. While Chomsky has proposed solutions for these cases, it has been argued that the fact that such cases are problematic suggests that the labeling algorithm violates the tenets of the minimalist program, as it departs from conceptual necessity.
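The failure mode just described can be made concrete with a small Python sketch of labeling by minimal search. This is a deliberate simplification of Chomsky's proposal, assuming lexical items are strings and phrases are tuples: search finds a unique lexical head in {H, XP} and projects it, but in {XP, YP} no single head is most prominent, so no label is returned.

```python
# Sketch of the labeling algorithm as minimal search: the closest
# lexical item in the merged set provides the label; when both members
# are phrases ({XP, YP}), the search finds no unique head and fails.

def minimal_search(so):
    """Return the label of a merged set, or None when no single
    lexical head is found (the problematic {XP, YP} configuration)."""
    heads = [m for m in so if isinstance(m, str)]  # lexical items only
    if len(heads) == 1:
        return heads[0]   # unique head projects its label
    return None           # {XP, YP}: minimal search cannot label

vp = ("see", ("the", "cake"))                 # {H, XP}: 'see' labels it
xp_yp = (("the", "girl"), ("ate", "food"))    # {XP, YP}: two phrases
print(minimal_search(vp))      # see
print(minimal_search(xp_yp))   # None
```

The `None` case is exactly the configuration the criticisms target: without a prominent lexical item, minimal search alone cannot extract a label.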
Other linguistic phenomena that create instances where Chomsky's labeling algorithm cannot assign labels include predicate fronting, embedded topicalization, scrambling (free movement of constituents), and stacked structures (which involve multiple specifiers).
Given these criticisms of Chomsky's labeling algorithm, it has been recently argued that the labeling algorithm theory should be eliminated altogether and replaced by another labeling mechanism. The symmetry principle has been identified as one such mechanism, as it provides an account of labeling that assigns the correct labels even when phrases are derived through complex linguistic phenomena. [21]
Starting in the early 2000s, attention turned from feature-checking as a condition on movement to feature-checking as a condition on agreement. This line of inquiry was initiated in Chomsky (2000), and formulated as follows:
Many recent analyses assume that Agree is a basic operation, on par with Merge and Move. This is currently a very active area of research, and there remain numerous open questions: [23]
Co-indexation as feature checking: co-indexation markers such as {k, m, o, etc.} [12]
co-indexation |
A phase is a syntactic domain first hypothesized by Noam Chomsky in 1998. [24] It is a domain where all derivational processes operate and where all features are checked. [25] A phase consists of a phase head and a phase domain. Once any derivation reaches a phase and all the features are checked, the phase domain is sent to transfer and becomes invisible to further computations. [25] The literature shows three trends relative to what is generally considered to be a phase:
A simple sentence can be decomposed into two phases, CP and vP. Chomsky considers CP and vP to be strong phases because of their propositional content, as well as their interaction with movement and reconstruction. [26]
Propositional content: CP and vP are both propositional units, but for different reasons. [30] CP is considered a propositional unit because it is a full clause that has tense and force: example (1) shows that the complementizer that in the CP phase conditions finiteness (here past tense) and force (here, affirmative) of the subordinate clause. vP is considered a propositional unit because all the theta roles are assigned in vP: in (2) the verb ate in the vP phase assigns the Theme theta role to the DP the cake and the Agent theta-role to the DP Mary. [12]
(1) John said [CP that Mary will eat the cake].
(2) [CP Mary [vP <Mary> ate the cake]].
structure of simple sentence | CP phase: that Mary will eat the cake | vP phase: Mary ate the cake |
Movement: CP and vP can be the focus of pseudo-cleft movement, showing that CP and vP form syntactic units: this is shown in (3) for the CP constituent that John is bringing the dessert, and in (4) for the vP constituent arrive tomorrow. [30]
(3) a. Mary said [CP that John is bringing the dessert]. b. What Mary said was [CP that John is bringing the dessert].
(4) a. Alice will [vP arrive tomorrow]. b. What Alice will do is [vP arrive tomorrow].
Reconstruction. When a moved constituent is interpreted in its original position to satisfy binding principles, this is called reconstruction. [31] Evidence from reconstruction is consistent with the claim that the moved phrase stops at the left edge of CP and vP phases. [28]
(5) a. [Which picture of himselfk] did Johnk think ___ Fredj liked __? b. [Which picture of himselfj] did Johnk think ___ Fredj liked __?
(6) [Which of the papers that hek gave Maryj] did every studentk __ ask herj to read __ carefully?
reconstruction at left edge of CP phase | reconstruction at left edge of vP phase |
Chomsky theorized that syntactic operations must obey the phase impenetrability condition (PIC) which essentially requires that movement be from the left-edge of a phase. The PIC has been variously formulated in the literature. The extended projection principle feature that is on the heads of phases triggers the intermediate movement steps to phase edges. [30]
Movement of a constituent out of a phase is (in the general case) only permitted if the constituent has first moved to the left edge of the phase (XP).
The edge of a head X is defined as the residue outside of X′, i.e. either specifiers of X or adjuncts to XP. [32]
English successive cyclic wh-movement obeys the PIC. [30] Sentence (7) has two phases: vP and CP. Relative to the application of movement, who moves from the (lower) vP phase to the (higher) CP phase in two steps:
(7) [CP Who did you [vP see <who>]]?
Step 1: wh-phrase moves to edge of vP phase | Step 2: wh-phrase moves to edge of CP phase |
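The two-step derivation can be sketched as a short Python model of the PIC. The encoding (phase heads listed bottom-up, each movement step landing at a phase edge) is an illustrative assumption; the point is only that extraction must transit every intervening phase edge.

```python
# Sketch of successive-cyclic movement under the phase impenetrability
# condition: an item leaving a phase must first reach that phase's edge,
# so long-distance movement stops at the edge of every phase it crosses.

PHASE_HEADS = {"v", "C"}   # strong phases per the text: vP and CP

def move_out(phases, item):
    """Move `item` through the edge of each phase, lowest first, and
    return its landing sites; a non-phase head is rejected."""
    steps = []
    for head in phases:                   # e.g. ["v", "C"], bottom-up
        if head not in PHASE_HEADS:
            raise ValueError(f"{head} is not a phase head")
        steps.append(f"edge of {head}P")  # PIC: stop at the phase edge
    return steps

# 'who' in "Who did you see?" transits the vP edge before reaching CP.
print(move_out(["v", "C"], "who"))
# ['edge of vP', 'edge of CP']
```

A derivation that skipped the vP edge would correspond to deleting the first element of the returned list, i.e. a PIC violation.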
Another example of PIC can be observed when analyzing A'-agreement in Medumba. A'-agreement is a term used for the morphological reflex of A'-movement of an XP. [31] In Medumba, when the moved phrase reaches a phase edge, a high low tonal melody is added to the head of the complement of the phase head. Since A'-agreement in Medumba requires movement, the presence of agreement on the complements of phase heads shows that the wh-word moves to the edges of phases and obeys PIC. [31]
Example: [31]
The sentence in (2a) has a high-low tone on the verb nɔ́ʔ and the tense marker ʤʉ̀n, and is therefore grammatical.
(2a) [CP á wʉ́ Wàtɛ̀t nɔ́ɔ̀ʔ [vP ⁿ-ʤʉ́ʉ̀n á]]?
'Who did Watat see?'
The sentence in (2b) lacks the high-low tone on the verb nɔ́ʔ and the tense marker ʤʉ̀n, and is therefore ungrammatical.
(2b) *[CP á wʉ́ Wàtɛ̀t nɔ́ʔ [vP ⁿ-ʤʉ́n á]]?
*'Who did Watat see?'
To generate the grammatical sentence (2a), the wh-phrase á wʉ́ moves from the vP phase to the CP phase. To obey PIC, this movement must take two steps since the wh-phrase needs to move to the edge of the vP phase in order to move out of the lower phase.
One can confirm that A' agreement only occurs with movement by examining sentences where the wh-phrase does not move. In sentence (2c) below, one can observe that there is no high low tone melody on the verb nɔ́ʔ and tense fá since the wh-word does not move to the edge of the vP and CP phase. [31]
(2c) [m-ɛ́n nɔ́ʔ fá bɔ̀ á wʉ́ á]
'The child gave the bag to who?'
The spell-out of a string is assumed to be cyclic, but there is no consensus about how to implement this. Some analyses adopt an iterative spell-out algorithm, with spell-out applying after each application of Merge. Other analyses adopt an opportunistic algorithm, where spell-out applies only if it must. And yet others adopt a wait-til-the-end algorithm, with spell-out occurring only at the end of the derivation.
There is no consensus about the cyclicality of the Agree relation: it is sometimes treated as cyclic, sometimes as a-cyclic, and sometimes as counter-cyclic.
From a theoretical standpoint, and in the context of generative grammar, the Minimalist Program is an outgrowth of the principles and parameters (P&P) model, considered to be the ultimate standard theoretical model that generative linguistics developed from the early 1980s through to the early 1990s. [33] The Principles and Parameters model posits a fixed set of principles (held to be valid for all human languages) that—when combined with settings for a finite set of parameters—could describe the properties that characterize the language competence that a child eventually attains. One aim of the Minimalist Program is to ascertain how much of the Principles and Parameters model can be taken to result from the hypothesized optimal and computationally efficient design of the human language faculty. In turn, some aspects of the Principles and Parameters model provide technical tools and foundational concepts that inform the broad outlines of the Minimalist Program. [34]
X-bar theory—first introduced in Chomsky (1970) and elaborated in Jackendoff (1977) among other works—was a major milestone in the history of the development of generative grammar. It contains the following postulates: [35]
In the chapter "Phrase Structure" of The Handbook of Contemporary Syntactic Theory, Naoki Fukui identifies three kinds of syntactic relationships: (1) Dominance: the hierarchical organization of the lexical items and constituents of the structure; (2) Labeling: the syntactic category of each constituent; and (3) Linear order (or Precedence): the left-to-right order of the constituents (essentially the existence of the X-bar schemata). Whereas X-bar theory encoded all three relationships, bare phrase structure encodes only the first two. [15] Claims 1 and 2 have survived largely in their original forms through the development of grammatical theory, unlike Claim 3. Claim 1 would later be eliminated in favour of projection-less nodes. [35]
In the 1980s, the principles and parameters (P&P) approach emerged, marking a shift away from rule-based grammars toward multiple modules of UG such as X-bar theory, case theory, etc. During this time, phrase structure rules disappeared, since they proved redundant: they merely recapitulate what is in the lexicon. Transformational rules survived with a few amendments to how they are expressed: complex construction-specific rules need not be defined, and can be reduced to a general schema called Move-α, which allows anything to move anywhere. The only two sub-theories within P&P that withstood the test of time are X-bar theory and Move-α. Of the fundamental properties mentioned above, X-bar theory accounts for hierarchical structure and endocentricity, while Move-α accounts for unboundedness and non-local dependencies. A few years later, an effort was made to merge X-bar theory with Move-α by suggesting that structures are built from the bottom up (using adjunction or substitution depending on the target structure): [35]
X-bar theory had a number of weaknesses and was replaced by bare phrase structure, though some X-bar theory notions were carried over into BPS. [17] Labeling in bare phrase structure, in particular, was adapted from the conventions of X-bar theory; however, in order to obtain the "barest" phrase structures there are some dissimilarities. BPS differs from X-bar theory in the following ways: [15]
The main reasoning behind the transition from X-bar theory to BPS is the following:
The examples below show the progression of syntactic structure from X-bar theory (the theory preceding BPS) to specifier-less structure. BPS satisfies the principles of UG using, at minimum, two interfaces—the conceptual-intentional and sensorimotor systems—or a third factor not specific to language that nonetheless satisfies the conditions imposed by the interfaces. [35]
In linguistics, there are differing approaches taken to explore the basis of language: two of these approaches are formalism and functionalism. It has been argued that the formalist approach can be characterized by the belief that rules governing syntax can be analyzed independently from things such as meaning and discourse. In other words, according to formalists, syntax is an independent system (referred to as the autonomy of syntax). By contrast, functionalists believe that syntax is determined largely by the communicative function that it serves. Therefore, syntax is not kept separate from things such as meaning and discourse. [36]
Under functionalism, there is a belief that language evolved alongside other cognitive abilities, and that these cognitive abilities must be understood in order to understand language. In Chomsky's theories prior to MP, he had been interested exclusively in formalism, and had believed that language could be isolated from other cognitive abilities. However, with the introduction of MP, Chomsky considers aspects of cognition (e.g. the conceptual-intentional (CI) system and the sensory motor (SM) system) to be linked to language. Rather than arguing that syntax is a specialized model which excludes other systems, under MP, Chomsky considers the roles of cognition, production, and articulation in formulating language. Given that these cognitive systems are considered in an account of language under MP, it has been argued that in contrast to Chomsky's previous theories, MP is consistent with functionalism. [37]
There is a trend in minimalism that shifts from constituency-based to dependency-based structures. Minimalism falls under the dependency grammar umbrella by virtue of adopting bare phrase structure, label-less trees, and specifier-less syntax. [38] [39]
As discussed by Helen Goodluck and Nina Kazanin in their 2020 paper, certain aspects of the minimalist program provide insightful accounts for first language (L1) acquisition by children. [40]
In the late 1990s, David E. Johnson and Shalom Lappin published the first detailed critiques of Chomsky's minimalist program. [41] This technical work was followed by a lively debate with proponents of minimalism on the scientific status of the program. [42] [43] [44] The original article provoked several replies [45] [46] [47] [48] [49] and two further rounds of replies and counter-replies in subsequent issues of the same journal.
Lappin et al. argue that the minimalist program is a radical departure from earlier Chomskyan linguistic practice that is not motivated by any new empirical discoveries, but rather by a general appeal to perfection, which is both empirically unmotivated and so vague as to be unfalsifiable. They compare the adoption of this paradigm by linguistic researchers to other historical paradigm shifts in the natural sciences and conclude that the adoption of the minimalist program has been an "unscientific revolution", driven primarily by Chomsky's authority in linguistics. The several replies to the article in Natural Language and Linguistic Theory volume 18, number 4 (2000) mount a number of different defenses of the minimalist program. Some claim that it is not in fact revolutionary or not in fact widely adopted, while others agree with Lappin and Johnson on these points, but defend the vagueness of its formulation as not problematic in light of its status as a research program rather than a theory (see above).
Prakash Mondal has published a book-length critique of the minimalist model of grammar, arguing that there are a number of contradictions, inconsistencies and paradoxes within the formal structure of the system. In particular, his critique examines the consequences of adopting some rather innocuous and widespread assumptions or axioms about the nature of language as adopted in the Minimalist model of the language faculty. [50]
Developments in the minimalist program have also been critiqued by Hubert Haider, who has argued that minimalist studies routinely fail to meet standards of scientific rigour: data compatible with hypotheses are filed under confirmation, whereas crucial counter-evidence is largely ignored or shielded off by ad hoc auxiliary assumptions. Moreover, the supporting data are biased towards SVO languages and are often based on the linguist's introspection rather than on attempts to gather data in an unbiased manner by experimental means. Haider further points to the appeal to an authority figure in the field, with dedicated followers taking the core premises of minimalism for granted as if they were established facts. [51]
Much research has been devoted to the study of the consequences that arise when minimalist questions are formulated. The lists below, which are not exhaustive, are given in reverse chronological order.
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
A syntactic category is a syntactic unit that theories of syntax assume. Word classes, largely corresponding to traditional parts of speech, are syntactic categories. In phrase structure grammars, the phrasal categories are also syntactic categories. Dependency grammars, however, do not acknowledge phrasal categories.
Phrase structure rules are a type of rewrite rule used to describe a given language's syntax and are closely associated with the early stages of transformational grammar, proposed by Noam Chomsky in 1957. They are used to break down a natural language sentence into its constituent parts, also known as syntactic categories, including both lexical categories and phrasal categories. A grammar that uses phrase structure rules is a type of phrase structure grammar. Phrase structure rules as they are commonly employed operate according to the constituency relation, and a grammar that employs phrase structure rules is therefore a constituency grammar; as such, it stands in contrast to dependency grammars, which are based on the dependency relation.
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.
A noun phrase – or NP or nominal (phrase) – is a phrase that usually has a noun or pronoun as its head, and has the same grammatical functions as a noun. Noun phrases are very common cross-linguistically, and they may be the most frequently occurring phrase type.
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation first proposed by Noam Chomsky in 1970, reformulating ideas of Zellig Harris (1951), and further developed by Ray Jackendoff within the theory of generative grammar that Chomsky put forth in the 1950s. It attempts to capture the structure of phrasal categories with a single uniform template, the X-bar schema, based on the assumption that any phrase in natural language is an XP headed by a given syntactic category X. It played a significant role in resolving problems that phrase structure rules had, most notably the proliferation of grammatical rules, which runs counter to the thesis of generative grammar.
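The uniform X-bar schema (XP → (Specifier) X′; X′ → X (Complement)) can be sketched as a single constructor that projects any head X to a full XP. The tuple representation and the function name `project` are assumptions made for illustration:

```python
def project(x, spec=None, comp=None):
    """Project a head of category x (e.g. 'N', 'V') to XP via the
    uniform X-bar schema: XP -> (Spec) X'; X' -> X (Comp).
    Nodes are (label, children) tuples; a bare string is a head or word."""
    x_bar = (x + "'", [x] if comp is None else [x, comp])
    children = [x_bar] if spec is None else [spec, x_bar]
    return (x + "P", children)

# The same schema covers every category: an NP with no complement,
# then a VP headed by V taking that NP as its complement.
np = project("N")
vp = project("V", comp=np)
```

The point of the sketch is that one template replaces a proliferation of category-specific phrase structure rules: NP, VP, PP, and so on all instantiate the same shape.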
Government and binding is a theory of syntax and a phrase structure grammar in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s. This theory is a radical revision of his earlier theories and was later revised in The Minimalist Program (1995) and several subsequent papers, the latest being Three Factors in Language Design (2005). Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
In linguistics, branching refers to the shape of the parse trees that represent the structure of sentences. Assuming that the language is being written or transcribed from left to right, parse trees that grow down and to the right are right-branching, and parse trees that grow down and to the left are left-branching. The direction of branching reflects the position of heads in phrases: right-branching structures are head-initial, whereas left-branching structures are head-final. English has both right-branching (head-initial) and left-branching (head-final) structures, although it is more right-branching than left-branching. Some languages, such as Japanese and Turkish, are almost fully left-branching (head-final), while others are mostly right-branching (head-initial).
Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and with specific parameters that are set one way or the other for particular languages. For example, whether a language is head-initial or head-final is regarded as a parameter that is switched on or off for a given language. The framework was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within it, and for a period of time it was considered the dominant form of mainstream generative linguistics.
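The head-direction parameter just described can be sketched as a single binary switch that determines linearization. The function name and the example words are hypothetical, chosen only to illustrate the idea:

```python
def linearize(head, complement, head_initial=True):
    """Order a head and its complement according to a binary
    head-direction parameter: True = head-initial, False = head-final."""
    return [head, complement] if head_initial else [complement, head]

# English-like setting (head-initial): the verb precedes its object.
print(linearize("read", "the book", head_initial=True))   # ['read', 'the book']

# Japanese-like setting (head-final): the verb follows its object.
print(linearize("yonda", "hon o", head_initial=False))    # ['hon o', 'yonda']
```

On this view, the combinatorial principle (a head combines with a complement) is universal, while the single parameter accounts for the word-order difference between the two language types.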
In generative grammar, non-configurational languages are languages characterized by a flat phrase structure, which allows syntactically discontinuous expressions, and a relatively free word order.
In linguistics, the projection principle is a stipulation proposed by Noam Chomsky as part of the phrase structure component of generative-transformational grammar. The projection principle is used in the derivation of phrases under the auspices of the principles and parameters theory.
Biolinguistics can be defined as the study of the biology and evolution of language. It is highly interdisciplinary, drawing on fields such as biology, linguistics, psychology, anthropology, mathematics, and neurolinguistics to explain the formation of language, and it seeks to yield a framework by which we can understand the fundamentals of the faculty of language. The field was first introduced by Massimo Piattelli-Palmarini, professor of Linguistics and Cognitive Science at the University of Arizona, at a 1971 international meeting at the Massachusetts Institute of Technology (MIT).
In linguistics, an empty category, which may also be referred to as a covert category, is an element in the study of syntax that does not have any phonological content and is therefore unpronounced. Empty categories exist in contrast to overt categories which are pronounced. When representing empty categories in tree structures, linguists use a null symbol (∅) to depict the idea that there is a mental category at the level being represented, even if the word(s) are being left out of overt speech. The phenomenon was named and outlined by Noam Chomsky in his 1981 LGB framework, and serves to address apparent violations of locality of selection — there are different types of empty categories that each appear to account for locality violations in different environments. Empty categories are present in most of the world's languages, although different languages allow for different categories to be empty.
Merge is one of the basic operations in the Minimalist Program, a leading approach to generative syntax, whereby two syntactic objects are combined to form a new syntactic unit. Merge also has the property of recursion in that it may be applied to its own output: the objects combined by Merge are either lexical items or sets that were themselves formed by Merge. This recursive property of Merge has been claimed to be a fundamental characteristic that distinguishes language from other cognitive faculties. As Noam Chomsky (1999) puts it, Merge is "an indispensable operation of a recursive system ... which takes two syntactic objects A and B and forms the new object G={A,B}" (p. 2).
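The set-forming operation in the quotation above can be given a minimal sketch, with frozensets standing in for the unordered objects {A, B} (the representation is an assumption for illustration; Merge itself imposes no particular data structure):

```python
def merge(a, b):
    """Binary, set-forming Merge: combine two syntactic objects A and B
    into the new object {A, B}, modelled here as a frozenset."""
    return frozenset({a, b})

# Recursion: Merge applies to its own output.
# Build {read, {the, book}} from three lexical items.
dp = merge("the", "book")   # {the, book}
vp = merge("read", dp)      # {read, {the, book}}
assert dp in vp and "read" in vp
```

Using frozensets makes the two defining properties visible: the output is unordered (merge(a, b) == merge(b, a)) and, being hashable, it can itself serve as input to a further application of Merge.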
In linguistics, locality refers to the proximity of elements in a linguistic structure. Constraints on locality limit the span over which rules can apply to a particular structure. Theories of transformational grammar use syntactic locality constraints to explain restrictions on argument selection, syntactic binding, and syntactic movement.
In linguistics, subcategorization denotes the capacity of lexical items to require or allow particular types of syntactic arguments to co-occur with them. For example, the verb "walk", as in "X walks home", requires the noun phrase X to be animate.
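Subcategorization is often recorded in the lexicon as a frame listing the arguments a word requires. The frames, verbs, and helper name below are a hypothetical sketch of this idea, not drawn from any particular theory's notation:

```python
# Hypothetical subcategorization frames: the syntactic arguments each
# verb requires beyond its subject.
FRAMES = {
    "sleep": [],            # intransitive: "X sleeps"
    "read":  ["NP"],        # transitive: "X reads Y"
    "put":   ["NP", "PP"],  # "X puts Y on Z"
}

def satisfies(verb, args):
    """Check that a verb's required argument categories are all supplied,
    in order, where args is a list of (category, phrase) pairs."""
    return FRAMES[verb] == [cat for cat, _ in args]

print(satisfies("put", [("NP", "the book"), ("PP", "on the shelf")]))  # True
print(satisfies("put", [("NP", "the book")]))                          # False
```

A checker like this captures only the categorial side of subcategorization; selectional restrictions such as the animacy requirement on the subject of "walk" would need additional semantic features.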
In linguistics, transformational syntax is a derivational approach to syntax that developed from the extended standard theory of generative grammar originally proposed by Noam Chomsky in his books Syntactic Structures and Aspects of the Theory of Syntax. It emerged from a need to improve on approaches to grammar in structural linguistics.
In formal syntax, a node is a point in a tree diagram or syntactic tree that can be assigned a syntactic category label.