David Lightfoot | |
---|---|
Born | David William Lightfoot, February 10, 1945 |
Nationality | United States |
Occupation(s) | Linguist, academic, educator, author |
Years active | 1971–present |
Awards | The Linguistic Society of America's Linguistic Service Award (2013), The Linguistic Society of America's Distinguished Teaching Award (2013) |
Academic background | |
Alma mater | University of Michigan |
Thesis | Natural Logic and the Moods of Classical Greek (1971) |
Doctoral advisor | Robin Lakoff |
Academic work | |
Discipline | Linguistics |
Sub-discipline | Syntactic theory, language acquisition, language change |
Institutions | |
Notable works | Principles of Diachronic Syntax (CUP 1979), The Language Lottery: Toward a Biology of Grammars (MIT Press, 1982), How to Set Parameters: Arguments from Language Change (MIT Press, 1991), and The Development of Language: Acquisition, Change, and Evolution (Blackwell, 1999). |
David William Lightfoot (born February 10, 1945) is an American linguist who served as Assistant Director [1] [2] of the National Science Foundation's Directorate for Social, Behavioral and Economic Sciences from 2005 to 2009 [3] [4] and as President of the Linguistic Society of America from 2010 to 2011. As of 2024, he is Emeritus Professor of Linguistics at Georgetown University. [5] He founded the Department of Linguistics at the University of Maryland. Lightfoot is a fellow of the American Association for the Advancement of Science (AAAS), a fellow of the Linguistic Society of America (LSA), and a fellow of the American Council of Learned Societies. [6] He has been a Guest Professor of Linguistics at Beijing Language and Culture University (BLCU) since 2016. [7] [a]
His research focuses on language acquisition, change, and evolution.
Lightfoot has published widely in generative syntax and is best known for his theoretical stance that an accurate description of the basic principles of generative grammar requires an understanding of how they could be acquired, thus linking them to human biology and development. In the 1970s, he was one of the linguists who helped renew interest in diachronic syntax, the study of syntactic change over time, and the emergence of new syntactic phenomena. [b]
More recently, Lightfoot argued that children are born to assign structures to their ambient language, yielding a view of language variation not based on parameters defined at Universal Grammar. This approach extends Minimalist thinking, by dispensing with parameters, evaluation metrics for the selection of grammars, and any independent parsing mechanism. Instead, both external and internal languages play crucial, interacting roles, allowing an “open” Universal Grammar. [8]
Lightfoot was born in Looe, Cornwall, UK, but grew up in Plymouth. After receiving his B.A. in Classical Studies from King's College London in 1966, he worked for a year as a labor relations manager at the Ford Motor Company. He earned a Ph.D. in Linguistics from the University of Michigan in 1971 with a doctoral thesis titled "Natural Logic and the Moods of Classical Greek", written under the supervision of Robin Lakoff. [9] [10]
A cognitive scientist as well as a linguist, Lightfoot is known for his contributions to the study of language as a biological faculty. He held professorial appointments at McGill University, the University of Utrecht, and the University of Maryland. At Maryland, he founded the linguistics department and chaired it for 12 years; he also served there as Associate Director of the Neuroscience and Cognitive Sciences program. [c] [11]
In 2001, Lightfoot became Dean of the Graduate School at Georgetown University. [12]
From 2005 to 2009, he served as Assistant Director of the National Science Foundation, overseeing the Directorate for Social, Behavioral, and Economic Sciences.
Returning to Georgetown in 2009, he directed the undergraduate Cognitive Science program and the graduate program in Communication, Culture & Technology.
In 2004, Lightfoot was elected a fellow of the American Association for the Advancement of Science (AAAS); two years later, he was elected a fellow of the Linguistic Society of America (LSA). He also served as President of the LSA for one year (2010–2011).
Diachronic syntax is the study of how the structure of sentences in a language changes over time. Just as words and pronunciation can evolve, so can the rules that govern how words are arranged into sentences. For example, Old English allowed words to move around relatively freely because endings on words signaled their role in the sentence, whereas word order in Modern English is much more fixed (usually subject-verb-object, as in "The cat chased the mouse"). Diachronic syntax examines these kinds of changes, helping to explain how and why languages evolve over centuries. [13]
His 1979 book, Principles of Diachronic Syntax, [14] significantly influenced the study of syntactic change. In it, Lightfoot presented a framework centered on "radical reanalysis": new generations of speakers reinterpret linguistic structures on the basis of the input they receive. A key contribution was his "Transparency Principle," which holds that syntactic change occurs when the connection between a grammar's underlying structures and its surface forms becomes too complex or opaque for new learners to process, forcing reanalysis. His ideas were grounded in generative grammar, and he emphasized the need to separate theories of grammar from theories of change, which was considered a major step forward in diachronic syntax. While reviewers such as Fischer and van der Leek [15] and Warner [16] recognized the methodological clarity of his work, they also critiqued aspects such as the Transparency Principle, suggesting that some of Lightfoot's explanations were overly simplistic. Despite these criticisms, Principles of Diachronic Syntax helped renew interest in the study of syntactic change within generative grammar, and Lightfoot's analyses, particularly his treatment of English modals, have remained influential in shaping research on how syntactic structures evolve over time. [16]
Lightfoot's methodology also made a lasting impact by arguing that simultaneous changes in a language's grammar could often be traced back to a single underlying cause. His case studies, particularly in the history of English, such as the development of modal verbs and impersonal constructions, demonstrated how his theoretical insights could be applied to real historical data. [16]
In The Language Lottery: Toward a Biology of Grammars, [17] Lightfoot presented an introduction to the biolinguistic approach of generative grammar, positing that humans are born with an innate capacity for language acquisition. Rooted in Noam Chomsky's theories, Lightfoot compared linguistic development to a lottery, in which every child draws the capacity to acquire any language as a result of a genetic "language program." Reviews were largely positive, with Yukio Otsu praising the book for effectively illustrating complex linguistic theories for a broad readership, though he noted that some specialized terms could be challenging for non-linguists. [18] Lyle Jenkins lauded the work for fostering interdisciplinary dialogue and appreciated its "explanatory clarity" on the role of innate structures in language, though he suggested that some sections might require background knowledge in linguistics to fully grasp. [19] Fred D'Agostino also praised Lightfoot's structured explanations of generative grammar's foundational principles, considering them "particularly clear and forceful". [20]
His 2006 book, How New Languages Emerge, [21] studies the processes involved in language change, particularly focusing on how new languages come into being. Central to his argument is the distinction between internal (I-language) and external (E-language) systems. Internal language refers to an individual's mental grammar shaped by biological factors, while external language encompasses the societal and environmental influences. Lightfoot stresses the role of children in language development, as they construct new grammars based on linguistic cues from their surroundings. The book combines linguistic theory, historical linguistics, and cognitive science to explain how languages evolve over time. Lightfoot argues that language change is not only a social phenomenon but also deeply connected to the cognitive processes of language acquisition. He shows that structural changes in language are contingent on shifts in grammatical cues encountered by children, which gradually spread through the community. [22] [23] [24] [25] [26] French historical linguist Chris H. Reintges described the work as "an engaged manifesto for a new historical linguistics". Reintges said that Lightfoot's work "shows that the emergence of novel grammars, while part of language change, is a phenomenon of much broader scope that may shed new light on well-studied cases of morphosyntactic change." [25]
Lightfoot's 2020 book, Born to Parse: How Children Select Their Languages, signified, as Chinese linguist Yu Fu noted, [27] his own "major shift from the parameter-based approach to the parsing-based approach to language acquisition and change." [28] The work also presented a radical shift in linguistic theory, emphasizing that children naturally parse their ambient language using an internal linguistic system. This approach challenges long-standing views of language acquisition, particularly the idea of Universal Grammar (UG) as being parameter-based. Lightfoot dispenses with the need for predefined grammatical parameters, an evaluation metric for grammar selection, and an independent parsing mechanism. Instead, he argues that language variation arises as children parse external language, making sense of their internal linguistic structures. This perspective aligns with the Minimalist Program, contributing to a simplified view of grammar acquisition while addressing historical developments in English syntax, such as modal verbs and verb movement. Lightfoot also provides several case studies on English and theoretical insights, demonstrating how children's parsing abilities lead to language change and the emergence of specific linguistic properties. [29] [30]
See also: syntax, universal grammar, transformational grammar, X-bar theory, generative grammar, the minimalist program, construction grammar, principles and parameters, Syntactic Structures, linguistic competence, linguistic performance, thematic relations, the linguistics wars, Aspects of the Theory of Syntax, autonomy of syntax, generative second-language acquisition, biolinguistics, Frederick Newmeyer, Giuseppe Longobardi.