Universal grammar

Noam Chomsky is usually associated with the term universal grammar in the 20th and 21st centuries.

Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children adopt specific syntactic rules that conform to UG. [1] Advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and on the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued that languages are so diverse that such universality is rare, [2] and the theory of universal grammar remains controversial among linguists. [3]

Argument

The theory of universal grammar proposes that if human beings are brought up under normal conditions (not those of extreme sensory deprivation), then they will always develop language with certain properties (e.g., distinguishing nouns from verbs, or distinguishing function words from content words). The theory proposes that there is an innate, biologically determined language faculty that knows these rules, making it possible for children to learn to speak. [4] This faculty does not know the vocabulary of any particular language (so words and their meanings must be learned), and there remain several parameters which can vary freely among languages (such as whether adjectives come before or after nouns) which must also be learned. Evidence in favor of this idea can be found in studies like Valian (1986), which show that children of surprisingly young ages understand syntactic categories and their distribution before this knowledge shows up in production. [5]

As Chomsky puts it, "Evidently, development of language in the individual must involve three factors: genetic endowment, which sets limits on the attainable languages, thereby making language acquisition possible; external data, converted to the experience that selects one or another language within a narrow range; [and] principles not specific to the Faculty of Language." [6]

Occasionally, aspects of universal grammar seem describable in terms of general facts about cognition. For example, if a predisposition to categorize events and objects as different classes of things is part of human cognition, and directly results in nouns and verbs showing up in all languages, then this aspect of universal grammar would not be specific to language but would instead be part of human cognition generally. To distinguish properties of languages that can be traced to other facts about cognition from properties that cannot, the abbreviation UG* can be used. Chomsky often uses "UG" for those aspects of the human brain which cause language to be the way that it is (i.e., universal grammar in the sense used here); for the purposes of this discussion, UG* denotes the subset of those aspects that are furthermore specific to language (thus UG, as Chomsky uses it, is simply an abbreviation for universal grammar, while UG* as used here is a subset of universal grammar).

In the same article, Chomsky casts the theme of a larger research program in terms of the following question: "How little can be attributed to UG while still accounting for the variety of 'I-languages' attained, relying on third factor principles?" [6] (I-languages meaning internal languages, the brain states that correspond to knowing how to speak and understand a particular language, and third factor principles meaning "principles not specific to the Faculty of Language" in the previous quote). Chomsky has speculated that UG might be extremely simple and abstract, for example only a mechanism for combining symbols in a particular way, which he calls "merge". The following quote shows that Chomsky does not use the term "UG" in the narrow sense UG* suggested above:

"The conclusion that merge falls within UG holds whether such recursive generation is unique to FL (faculty of language) or is appropriated from other systems." [6]

In other words, merge is seen as part of UG because it causes language to be the way it is, universal, and is not part of the environment or general properties independent of genetics and environment. Merge is part of universal grammar whether it is specific to language, or whether, as Chomsky suggests, it is also used for example in mathematical thinking. The distinction is the result of the long history of argument about UG*: whereas some people working on language agree that there is universal grammar, many people assume that Chomsky means UG* when he writes UG (and in some cases he might actually mean UG* [though not in the passage quoted above]).

Some students of universal grammar study a variety of grammars to extract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a variety of traits, such as the phonemes found in languages, the word orders which different languages choose, and the reasons why children exhibit certain linguistic behaviors. Other linguists who have influenced this theory include Richard Montague, who developed his version of this theory as he considered issues of the argument from poverty of the stimulus to arise from the constructivist approach to linguistic theory. The application of the idea of universal grammar to the study of second language acquisition (SLA) is represented mainly in the work of McGill linguist Lydia White.
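Implicational universals of the form "If X holds true, then Y occurs" can be checked mechanically against typological data. A minimal sketch in Python, testing a Greenberg-style implication ("if a language has VSO basic word order, it has prepositions"); the language sample and its feature values below are invented for illustration, not real survey data:

```python
# Toy typological sample: each language is described by two features.
# The languages and feature values are illustrative placeholders.
sample = {
    "Lang A": {"order": "VSO", "adpositions": "prepositions"},
    "Lang B": {"order": "SOV", "adpositions": "postpositions"},
    "Lang C": {"order": "SVO", "adpositions": "prepositions"},
}

def holds(universal, languages):
    """Check an implicational universal 'if X then Y' against a sample,
    returning the names of any counterexamples (X true but Y false)."""
    x, y = universal
    return [
        name for name, feats in languages.items()
        if x(feats) and not y(feats)
    ]

# "If a language has VSO basic word order, then it has prepositions."
vso_implies_prepositions = (
    lambda f: f["order"] == "VSO",
    lambda f: f["adpositions"] == "prepositions",
)

print(holds(vso_implies_prepositions, sample))  # [] → no counterexamples
```

A universal of this form is falsified by a single counterexample language, which is what makes implicational universals empirically testable claims rather than mere tendencies.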

Syntacticians generally hold that there are parametric points of variation between languages, although heated debate occurs over whether UG constraints are essentially universal due to being "hard-wired" (Chomsky's principles and parameters approach), a logical consequence of a specific syntactic architecture (the generalized phrase structure approach) or the result of functional constraints on communication (the functionalist approach). [7]

Relation to the evolution of language

In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" [8] Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar.

The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication. This means that homologous aspects of the faculty of language exist in non-human animals.

The second hypothesis states that the FLb is a derived and uniquely human adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.

The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion has evolved recently, and solely in humans. [9] This hypothesis aligns most closely with the typical theory of universal grammar championed by Chomsky.

History

The term "universal grammar" predates Noam Chomsky, but pre-Chomskyan ideas of universal grammar are different. For Chomsky, UG is "[the] theory of the genetically based language faculty", [10] which makes UG a theory of language acquisition and part of the innateness hypothesis. Earlier grammarians and philosophers thought about universal grammar in the sense of a universally shared property or grammar of all languages. The closest late-20th-century analog to their understanding of universal grammar is Greenberg's set of linguistic universals.

Chomsky defined universal grammar as "the study of the conditions that must be met by the grammars of all human languages" or "the theory of language structure". [11]

The idea of a universal grammar can be traced back to Roger Bacon's observations in his c. 1245 Overview of Grammar and c. 1268 Greek Grammar that all languages are built upon a common grammar, even though it may undergo incidental variations, and to the 13th-century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th-century projects for philosophical languages. An influential work of that time was the Grammaire générale by Claude Lancelot and Antoine Arnauld, who tried to describe a general grammar for languages and came to the conclusion that grammar has to be universal. [12] A Scottish school of universal grammarians, distinct from the philosophical language projects, existed in the 18th century; it included the authors James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar".

This tradition was continued in the late 19th century by Wilhelm Wundt and in the early 20th century by the linguist Otto Jespersen. Jespersen disagreed with the early grammarians' formulation of "universal grammar", arguing that they tried to derive too much from Latin, and that a UG based on Latin was bound to fail given the breadth of worldwide linguistic variation. [13] He did not fully dispense with the idea of a "universal grammar", but reduced it to universal syntactic categories or super-categories, such as number, tense, etc. [14] Jespersen did not discuss whether these properties come from facts about general human cognition or from a language-specific endowment (which would be closer to the Chomskyan formulation). As his work predates molecular genetics, he did not discuss the notion of a genetically conditioned universal grammar.

During the rise of behaviorism, the idea of a universal grammar (in either sense) was discarded. In the early 20th century, language was usually understood from a behaviorist perspective, suggesting that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success. [15] In other words, children learned their mother tongue by simple imitation, through listening and repeating what adults said. For example, when a child says "milk", the mother smiles and gives her child milk; the child finds this outcome rewarding, which enhances the child's language development. [16] UG reemerged to prominence and influence in modern linguistics with the theories of Chomsky and Montague in the 1950s–1970s, as part of the "linguistics wars".

In their 2016 book Why Only Us, Berwick and Chomsky defined both the minimalist program and the strong minimalist thesis, updating their approach to UG theory. According to Berwick and Chomsky, "The optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT)." [17] The SMT shifts the previous emphasis on universal grammar to the operation Chomsky and Berwick call "merge", which they define as follows: "Every computational system has embedded within it somewhere an operation that applies to two objects X and Y already formed, and constructs from them a new object Z. Call this operation Merge." The SMT dictates that "Merge will be as simple as possible: it will not modify X or Y or impose any arrangement on them; in particular, it will leave them unordered, an important fact... Merge is therefore just set formation: Merge of X and Y yields the set {X, Y}." [18]
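Since Merge, as quoted above, is just set formation, the operation can be sketched in a few lines. The following Python sketch models the unordered objects Merge builds as frozensets; the choice of representation (and the toy lexical items) is illustrative, not part of the theory:

```python
def merge(x, y):
    """Merge two already-formed syntactic objects into the unordered
    set {X, Y}. Neither X nor Y is modified, and no order is imposed."""
    return frozenset([x, y])

# Building the object {read, {the, book}} bottom-up:
dp = merge("the", "book")   # {the, book}
vp = merge("read", dp)      # {read, {the, book}}
print(vp == frozenset(["read", frozenset(["the", "book"])]))  # True

# Merge imposes no order: the arguments are interchangeable.
print(merge("the", "book") == merge("book", "the"))  # True
```

Because the output of one Merge can itself be an input to the next, repeated application yields hierarchically nested, recursively structured objects from a single primitive operation.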

Chomsky's theory

Chomsky argued that the human brain contains a limited set of constraints for organizing language. This implies in turn that all languages have a common structural basis: the set of rules known as "universal grammar".

Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know these restrictions, since expressions that violate them are neither present in the input nor flagged as unacceptable. Chomsky argued that this poverty of the stimulus means that Skinner's behaviorist perspective cannot explain language acquisition. The absence of negative evidence (evidence that an expression is part of a class of ungrammatical sentences in a given language) is the core of his argument. [19] For example, in English, an interrogative pronoun like what cannot be related to a predicate within a relative clause:

*"What did John meet a man who sold?"

Such expressions are not available to language learners: they are, by hypothesis, ungrammatical, so speakers of the local language do not use them, nor do they point them out to learners as unacceptable. Universal grammar offers an explanation for how acquisition succeeds despite the poverty of the stimulus: certain restrictions are universal characteristics of human languages, so language learners are never tempted to generalize in an illicit fashion.[ citation needed ]

Presence of creole languages

The presence of creole languages is sometimes cited as further support for this theory, especially by Bickerton's controversial language bioprogram theory. Creoles are languages that develop and form when disparate societies with no common language come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole. Unlike pidgins, creoles have native speakers (those with acquisition from early childhood) and make use of a full, systematic grammar.

According to Bickerton, the idea of universal grammar is supported by creole languages because certain features are shared by virtually all in the category. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment, but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creoles can be seen in the fact that questions are created simply by changing the intonation of a declarative sentence, not its word order or content.

However, extensive work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, they tend to standardize the language that they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation (and in the real-life situation of a deaf child whose parents are or were disfluent signers), children systematize the language they hear based on the probability and frequency of forms, rather than on the basis of a universal grammar. [20] [21] Further, it seems to follow that creoles would share features with the languages from which they are derived, and thus look similar in terms of grammar.
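The contrast Hudson-Kam and Newport describe can be caricatured as two learning rules applied to the same inconsistent input: a probability-matching learner (adult-like) reproduces forms at roughly their input frequencies, while a regularizing learner (child-like) produces only the most frequent form. A rough sketch, with input frequencies invented purely for illustration:

```python
from collections import Counter
import random

# Inconsistent input: a grammatical marker is used 70% of the time,
# omitted 20%, and replaced by a variant 10% (frequencies invented).
input_forms = ["marker"] * 70 + ["omitted"] * 20 + ["variant"] * 10

def probability_matching(forms, n, rng):
    """Adult-like learner: sample forms at their input frequencies."""
    return [rng.choice(forms) for _ in range(n)]

def regularizing(forms, n):
    """Child-like learner: always produce the most frequent form."""
    majority, _ = Counter(forms).most_common(1)[0]
    return [majority] * n

rng = random.Random(0)
print(Counter(probability_matching(input_forms, 100, rng)))
print(Counter(regularizing(input_forms, 100)))  # Counter({'marker': 100})
```

On this picture, the systematic grammar of a creole emerges from a frequency-driven learning bias rather than from innately specified grammatical constraints.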

Many researchers of universal grammar argue against the concept of relexification, which holds that a language replaces its lexicon almost entirely with that of another. Such an account attributes a creole's grammar to an existing source language rather than to an innate universal grammar.[ citation needed ]

Evidence

There is no support in genetics or brain research for the claim that syntactic structures are innate. Although early brain lesion studies have been interpreted as pointing to a modular view of language, which could support Chomsky's idea of a language organ, the advent of neuroimaging has made it evident that language relates to several different functions of the brain. Chomsky's explanation that language stems from a single gene mutation is likewise incompatible with the data from comparative genomics: it is not possible for a single mutation to specify a whole cognitive function. [22]

By contrast, there is evidence from neuroimaging for the inverse idea that different languages shape the brain in different ways. Research using high-resolution diffusion-weighted MRI and tractography-based network statistics of the language connectome suggests that white-matter brain connectivity adapts to the characteristic syntactic processing demands of the person's native language. [23] ERP imaging has found that sentence processing is a matter of the interaction of syntax and semantics, rather than occurring along innate and arbitrary pathways. [24] [25] However, Chomsky rejects neuroplasticity and connectionism, which, according to him, constitute a denial of human freedom. According to Chomsky, his theory of an innate language organ and an innate moral organ oppose a philosophy of social control. [26]

Research in computational linguistics indicates the existence of unifying syntactic structures across all natural languages. [27] For instance, sentence structures of English and Japanese only appear to differ on the surface but possess the same structures underneath. This phenomenon, known as universal syntactic structures, suggests that languages are organized in the same way at a deeper level. Such findings, however, were already known to classical and logical grammar, [28] which proposed universal grammar as a logical necessity, and linguistic typology has found empirical evidence for numerous cross-linguistic tendencies underlying human languages. There is, however, no specific support for Chomsky's notion of genetically determined grammar.

Criticisms

Neurogeneticists Simon Fisher and Sonja Vernes consider Chomsky's "Universal Grammar" an example of a romantic simplification of genetics and neuroscience. According to them, the link from genes to grammar has not been consistently mapped by scientists; what has been established by research relates primarily to speech pathologies. The resulting lack of certainty has provided an audience for unconstrained speculations that have fed the myth of "so-called grammar genes". [22]

Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language. [29] [30] Similarly, Jeffrey Elman argues that the unlearnability of languages assumed by universal grammar is based on a too-strict, "worst-case" model of grammar, that is not in keeping with any actual grammar. In keeping with these points, James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial claim that languages are learnt by humans, and thus, that the LAD is less a theory than an explanandum looking for theories. [31]

Morten H. Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate universal grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics". [32]

Wolfram Hinzen has summarized the most common criticisms of universal grammar. [33]

In addition, it has been suggested that people learn probabilistic patterns of word distributions in their language, rather than hard and fast rules (see Distributional hypothesis). [34] For example, children overgeneralize the past-tense marker "-ed" and conjugate irregular verbs as if they were regular, producing forms like goed and eated, and correct these deviations over time. [35] It has also been proposed that the poverty of the stimulus problem can be largely avoided if it is assumed that children employ similarity-based generalization strategies in language learning, generalizing the usage of new words from similar words that they already know how to use. [36]

Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of expectation serves as a form of implicit negative feedback, allowing them to correct their errors over time; this is how overgeneralizations like goed eventually give way to went. [35] [37] In the late 2010s this very process was adapted for training large language models by next-token prediction. This implies that word learning is a probabilistic, error-driven process, rather than a process of fast mapping, as many nativists assume.
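The error-driven picture can be sketched as a simple update rule: the learner predicts a form, and each time the prediction fails to occur the predicted form's weight is pushed down while the observed competitor's weight is pushed up. The update rule, rates, and starting weights below are an illustrative caricature, not Ramscar's actual model:

```python
def update(weights, predicted, observed, rate=0.1):
    """Error-driven update: weaken a failed prediction and strengthen
    the form actually observed. Illustrative rule, not a fitted model."""
    if predicted != observed:
        weights[predicted] *= (1 - rate)                    # decay the error
        weights[observed] += rate * (1 - weights[observed])  # reinforce
    return weights

# A child initially overgeneralizes "goed" over irregular "went".
weights = {"goed": 0.8, "went": 0.2}
for _ in range(50):                      # fifty exposures to adult "went"
    predicted = max(weights, key=weights.get)
    update(weights, predicted, "went")

print(max(weights, key=weights.get))  # went
```

The key point is that no explicit correction is ever supplied: the mere non-occurrence of the expected form, repeated over exposures, is enough to drive the learner toward the adult grammar.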

In the domain of field research, the Pirahã language is claimed to be a counterexample to the basic tenets of universal grammar. This research has been led by Daniel Everett. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and colour terms. [38] According to the writings of Everett, the Pirahã showed these linguistic shortcomings not because they were simple-minded, but because their culture—which emphasized concrete matters in the present and also lacked creation myths and traditions of art making—did not necessitate it. [39]

Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of universal grammar. [40] Chomsky himself has called Everett a charlatan, and other experts have even accused him of purposely ignoring instances of recursion. [41] Other linguists have attempted to reassess Pirahã to see whether it does indeed use recursion. In a corpus analysis of the Pirahã language, linguists were unable to refute Everett's claim that Pirahã lacks recursion; however, they also stated that there was "no strong evidence for the lack of recursion either", and they provided "suggestive evidence that Pirahã may have sentences with recursive structures". [42]

Everett has argued that even if a universal grammar is not impossible in principle, it should not be accepted because there are equally or more plausible theories that are simpler. In his words, "universal grammar doesn't seem to work, there doesn't seem to be much evidence for [it]. And what can we put in its place? A complex interplay of factors, of which culture, the values human beings share, plays a major role in structuring the way that we talk and the things that we talk about." [43] Michael Tomasello, a developmental psychologist, also supports this claim, arguing that "although many aspects of human linguistic competence have indeed evolved biologically, specific grammatical principles and constructions have not. And universals in the grammatical structure of different languages have come from more general processes and constraints of human cognition, communication, and vocal-auditory processing, operating during the conventionalization and transmission of the particular grammatical constructions of particular linguistic communities." [44]

Notes

  1. Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar" . Retrieved 2010-10-07.
  2. Evans, Nicholas; Levinson, Stephen C. (26 October 2009). "The myth of language universals: Language diversity and its importance for cognitive science". Behavioral and Brain Sciences. 32 (5): 429–48. doi: 10.1017/S0140525X0999094X . hdl: 11858/00-001M-0000-0012-C29E-4 . PMID   19857320. S2CID   2675474. Archived (PDF) from the original on 27 July 2018.
  3. Christensen, Christian Hejlesen (March 2019). "Arguments for and against the Idea of Universal Grammar". Leviathan (4): 12–28. doi: 10.7146/lev.v0i4.112677 . S2CID   172055557.
  4. "Tool Module: Chomsky's Universal Grammar". thebrain.mcgill.ca. Retrieved 2017-08-28.
  5. Valian 1986.
  6. Chomsky, Noam (2007). "Approaching UG from Below". In Hans-Martin Gärtner; Uli Sauerland (eds.). Interfaces + Recursion = Language? Chomsky's Minimalism and the View from Syntax-Semantics. Studies in Generative Grammar. Berlin: Mouton de Gruyter. ISBN 978-3-11-018872-1.
  7. Baker, Mark C. (2003). "Syntax". In Mark Aronoff; Janie Rees-Miller (eds.). The Handbook of Linguistics. Wiley-Blackwell. ISBN   978-1-4051-0252-0.
  8. Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID   12446899, archived from the original (PDF) on 28 December 2013, retrieved 28 December 2013
  9. Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID   12446899, archived from the original (PDF) on 28 December 2013, retrieved 11 April 2024, We hypothesize that FLN only includes recursion and is the only uniquely human component of the faculty of language. [...] the core recursive aspect of FLN currently appears to lack any analog in animal communication and possibly other domains as well.
  10. Chomsky 2017, p. 3.
  11. Chomsky, Noam (2007). Language and mind (3. ed. Reprinted ed.). Cambridge: Cambridge University Press. p. 112. ISBN   978-0-521-67493-5.
  12. Lancelot, Claude (1967) [1660]. Grammaire générale et raisonnée. Scolar Press. OCLC 367432981.
  13. Jespersen 1965, p. 46-49.
  14. Jespersen 1965, p. 53.
  15. Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar" . Retrieved 2010-10-07.
  16. Ambridge & Lieven, 2011.
  17. Chomsky and Berwick (2016). Why Only Us. MIT Press. p. 94.
  18. Chomsky and Berwick (2016). Why Only Us. MIT Press. p. 98.
  19. Dąbrowska, Ewa. "What exactly is Universal Grammar, and has anyone seen it?" (PDF). Northumbria University. Archived (PDF) from the original on 2022-10-09.
  20. Hudson Kam, C. L.; Newport, E. L. (2009). "Getting it right by getting it wrong: When learners change languages". Cognitive Psychology. 59 (1): 30–66. doi:10.1016/j.cogpsych.2009.01.001. PMC   2703698 . PMID   19324332.
  21. Dye, Melody (February 9, 2010). "The Advantages of Being Helpless". Scientific American. Retrieved June 10, 2014.
  22. Fisher, Simon E.; Vernes, Sonja C. (January 2015). "Genetics and the Language Sciences". Annual Review of Linguistics. 1: 289–310. doi:10.1146/annurev-linguist-030514-125024. hdl:11858/00-001M-0000-0019-DA19-1.
  23. Wei, Xuehu; Adamson, Helyne; Schwendemann, Matthias; Goucha, Tómas; Friederici, Angela D.; Anwander, Alfred (19 February 2023). "Native language differences in the structural connectome of the human brain". NeuroImage. 270 (270): 119955. doi: 10.1016/j.neuroimage.2023.119955 . PMID   36805092.
  24. Kluender, R.; Kutas, M. (1993). "Subjacency as a processing phenomenon" (PDF). Language and Cognitive Processes. 8 (4): 573–633. doi:10.1080/01690969308407588 . Retrieved 2020-02-28.
  25. Barkley, C.; Kluender, R.; Kutas, M. (2015). "Referential processing in the human brain: An Event-Related Potential (ERP) study" (PDF). Brain Research. 1629: 143–159. doi:10.1016/j.brainres.2015.09.017. PMID   26456801. S2CID   17053154 . Retrieved 2020-02-28.
  26. Smith, Neil (2002). Chomsky: Ideas and Ideals (2nd ed.). Cambridge University Press. ISBN   0-521-47517-1.
  27. Kim, M.; Takero, H.; Fedovik, S. (2023). "Universal Syntactic Structures: Modeling Syntax for Various Natural Languages". arXiv: 2402.01641v1 [cs.CL].
  28. Rieux, Jacques; Rollin, Bernard E. (1975). "Translators' introduction". In Rieux, Jacques; Rollin, Bernard E. (eds.). General and Rational grammar: The Port-Royal Grammar by Antoine Arnauld and Claude Lancelot. Mouton. pp. 18–31. ISBN   90-279-3004-X.
  29. Sampson, Geoffrey (2005). The 'Language Instinct' Debate: Revised Edition. Bloomsbury Academic. ISBN   978-0-8264-7385-1.
  30. Cipriani, Enrico (2015). "The generative grammar between philosophy and science". European Journal of Literature and Linguistics. 4: 12–16.
  31. Hurford, James R. (1995). "Nativist and Functional Explanations in Language Acquisition" (PDF). In I. M. Roca (ed.). Logical Issues in Language Acquisition. Dordrecht, Holland and Providence, Rhode Island: Foris Publications. p. 88. Archived (PDF) from the original on 2022-10-09. Retrieved June 10, 2014.
  32. Christiansen, Morten H. and Chater, Nick (2008). "Language as Shaped by the Brain". Behavioral and Brain Sciences, 31.5: 489–509.
  33. Hinzen, Wolfram (September 2012). "The philosophical significance of Universal Grammar". Language Sciences. 34 (5): 635–649. doi:10.1016/j.langsci.2012.03.005.
  34. McDonald, Scott; Ramscar, Michael (2001). "Testing the distributional hypothesis: The influence of context on judgements of semantic similarity". Proceedings of the 23rd Annual Conference of the Cognitive Science Society: 611–616. CiteSeerX   10.1.1.104.7535 .
  35. Fernández, Eva M.; Cairns, Helen Smith (2011). Fundamentals of Psycholinguistics. Chichester, West Sussex, England: Wiley-Blackwell. ISBN 978-1-4051-9147-0.
  36. Yarlett, Daniel G.; Ramscar, Michael J. A. (2008). "Language Learning Through Similarity-Based Generalization". CiteSeerX 10.1.1.393.7298.
  37. Ramscar, Michael; Yarlett, Daniel (2007). "Linguistic self-correction in the absence of feedback: A new approach to the logical problem of language acquisition". Cognitive Science. 31 (6): 927–960. CiteSeerX   10.1.1.501.4207 . doi:10.1080/03640210701703576. PMID   21635323. S2CID   2277787.
  38. Everett, Daniel L. (August–October 2005). "Cultural Constraints on Grammar and Cognition in Pirahã: Another Look at the Design Features of Human Language" (PDF). Current Anthropology. 46 (4): 621–646. doi:10.1086/431525. hdl: 2066/41103 . S2CID   2223235. Archived (PDF) from the original on 2022-10-09.
  39. Schuessler, Jennifer (March 22, 2012). "How Do You Say 'Disagreement' in Pirahã?". The New York Times. p. C1. Retrieved June 10, 2014.
  40. Nevins et al. (2007). Pirahã Exceptionality: A Reassessment. Archived May 21, 2013, at the Wayback Machine.
  41. Folha de S.Paulo 1 February 2009.
  42. Piantadosi, Steven T.; Stearns, Laura; Everett, Daniel L.; Gibson, Edward (August 2012). "A corpus analysis of Pirahã grammar: An investigation of recursion" (PDF). Archived (PDF) from the original on 2022-10-09.
  43. McCrum, Robert (March 24, 2012). "Daniel Everett: 'There is no such thing as universal grammar'". The Observer. Retrieved June 10, 2014.
  44. Tomasello, Michael (2008). Origins of human communication. Cambridge, MA: MIT Press. ISBN   978-0-262-51520-7.

Related Research Articles

Language acquisition is the process by which humans acquire the capacity to perceive and comprehend language. In other words, it is how human beings gain the ability to be aware of language, to understand it, and to produce and use words and sentences to communicate.

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.

Cognitive linguistics is an interdisciplinary branch of linguistics that combines knowledge and research from cognitive science, cognitive psychology, neuropsychology and linguistics. Models and theoretical accounts in cognitive linguistics are considered psychologically real, and research in the field aims to help understand cognition in general; it is seen as a pathway into the human mind.

The Language Instinct (book by Steven Pinker)

The Language Instinct: How the Mind Creates Language is a 1994 book by Steven Pinker, written for a general audience. Pinker argues that humans are born with an innate capacity for language. He deals sympathetically with Noam Chomsky's claim that all human language shows evidence of a universal grammar, but dissents from Chomsky's skepticism that evolutionary theory can explain the human language instinct.

Generative grammar (theory in linguistics)

Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving from logical syntax and glossematics. Generative grammar considers grammar to be a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may be applied repeatedly to generate an unbounded number of sentences of arbitrary length. The difference from structural and functional models is that in generative grammar the object is base-generated within the verb phrase. This purportedly cognitive structure is thought of as part of a universal grammar, a syntactic structure attributed to a genetic mutation in humans.

Ray Jackendoff (American linguist and philosophy professor)

Ray Jackendoff is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed to both the existence of an innate universal grammar and to giving an account of language that is consistent with the current understanding of the human mind and cognition.

A linguistic universal is a pattern that occurs systematically across natural languages, potentially true of all of them. Examples include "All languages have nouns and verbs" and "If a language is spoken, it has consonants and vowels." Research in this area of linguistics is closely tied to the study of linguistic typology and aims to reveal generalizations across languages, likely tied to cognition, perception, or other abilities of the mind. The field originates from discussions influenced by Noam Chomsky's proposal of a universal grammar, but was largely pioneered by the linguist Joseph Greenberg, who derived a set of forty-five basic universals, mostly dealing with syntax, from a study of some thirty languages.

Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that are set one way or the other for particular languages. For example, whether a language is head-initial or head-final is regarded as a parameter whose value must be fixed for each language. The framework was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within it, and for a period of time it was considered the dominant form of mainstream generative linguistics.

Syntactic Structures (book by Noam Chomsky)

Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax from semantics.

In linguistics, linguistic competence is the system of unconscious knowledge that one has when one knows a language. It is distinguished from linguistic performance, which includes all the other factors that allow one to use one's language in practice.

Poverty of the stimulus (POS) is the controversial argument from linguistics that children are not exposed to rich enough data within their linguistic environments to acquire every feature of their language. This is considered evidence contrary to the empiricist idea that language is learned solely through experience. The claim is that the sentences children hear while learning a language do not contain the information needed to develop a thorough understanding of the grammar of the language.

In the field of psychology, nativism is the view that certain skills or abilities are "native", or hard-wired into the brain at birth. This is in contrast to the "blank slate" or tabula rasa view, which states that the brain has inborn capabilities for learning from the environment but does not contain content such as innate beliefs. This factor contributes to the ongoing nature-versus-nurture dispute, one born of the current difficulty of reverse-engineering the subconscious operations of the brain, especially the human brain.

Plato's problem

Plato's problem is the term given by Noam Chomsky to "the problem of explaining how we can know so much" given our limited experience. Chomsky believes that Plato asked how we should account for the rich, intrinsic, common structure of human cognition, when it seems underdetermined by extrinsic evidence presented to a person during human development. In linguistics this is referred to as the "argument from poverty of the stimulus" (APS). Such arguments are common in the natural sciences, where a developing theory is always "underdetermined by evidence". Chomsky's approach to Plato's problem involves treating cognition as a normal research topic in the natural sciences, so cognition can be studied to elucidate intertwined genetic, developmental, and biophysical factors. Plato's problem is most clearly illustrated in the Meno dialogue, in which Socrates demonstrates that an uneducated boy nevertheless understands geometric principles.

Daniel Everett (American linguist, born 1951)

Daniel Leonard Everett is an American linguist and author best known for his study of the Amazon basin's Pirahã people and their language.

The generative approach to second-language acquisition (SLA) is a cognitively based theory of SLA that applies theoretical insights developed within generative linguistics to investigate how second languages and dialects are acquired and lost by individuals learning naturalistically or with formal instruction in foreign, second-language and lingua franca settings. Central to generative linguistics is the concept of Universal Grammar (UG), part of an innate, biologically endowed language faculty comprising knowledge alleged to be common to all human languages. UG includes invariant principles as well as parameters that allow for variation, and together these place limitations on the form and operations of grammar. Research within the generative second-language acquisition (GenSLA) tradition accordingly describes and explains SLA by probing the interplay between Universal Grammar, knowledge of one's native language, and input from the target language. Research is conducted in syntax, phonology, morphology, phonetics and semantics, and has some relevant applications to pragmatics.

Biolinguistics (study of the biology and evolution of language)

Biolinguistics can be defined as the study of the biology and evolution of language. It is highly interdisciplinary, drawing on biology, linguistics, psychology, anthropology, mathematics, and neurolinguistics to explain the formation of language, and it seeks to yield a framework for understanding the fundamentals of the faculty of language. The field was introduced by Massimo Piattelli-Palmarini, professor of Linguistics and Cognitive Science at the University of Arizona, at a 1971 international meeting at the Massachusetts Institute of Technology (MIT).

In linguistics, the innateness hypothesis, also known as the nativist hypothesis, holds that humans are born with at least some knowledge of linguistic structure. On this hypothesis, language acquisition involves filling in the details of an innate blueprint rather than being an entirely inductive process. The hypothesis is one of the cornerstones of generative grammar and related approaches in linguistics. Arguments in favour include the poverty of the stimulus, the universality of language acquisition, as well as experimental studies on learning and learnability. However, these arguments have been criticized, and the hypothesis is widely rejected in other traditions such as usage-based linguistics. The term was coined by Hilary Putnam in reference to the views of Noam Chomsky.

Aspects of the Theory of Syntax (1965 book by Noam Chomsky)

Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.

Lectures on Government and Binding (1981 book by Noam Chomsky)

Lectures on Government and Binding: The Pisa Lectures (LGB) is a book by the linguist Noam Chomsky, published in 1981. It is based on the lectures Chomsky gave at the GLOW conference and workshop held at the Scuola Normale Superiore in Pisa, Italy, in 1979. In this book, Chomsky presented his government and binding theory of syntax. It had great influence on syntactic research in the early 1980s, especially among linguists working within the transformational grammar framework.

The basis of Noam Chomsky's linguistic theory lies in biolinguistics, the linguistic school that holds that the principles underpinning the structure of language are biologically preset in the human mind and hence genetically inherited. He argues that all humans share the same underlying linguistic structure, irrespective of sociocultural differences. In adopting this position Chomsky rejects the radical behaviorist psychology of B. F. Skinner, who viewed speech, thought, and all behavior as a completely learned product of the interactions between organisms and their environments. Accordingly, Chomsky argues that language is a unique evolutionary development of the human species, distinct from the modes of communication used by any other animal species. Chomsky's nativist, internalist view of language is consistent with the philosophical school of "rationalism" and contrasts with the anti-nativist, externalist view of language consistent with the philosophical school of "empiricism", which contends that all knowledge, including language, comes from external stimuli.

Further reading