Edward Stabler


Edward Stabler is a Professor of Linguistics at the University of California, Los Angeles. [1] His primary areas of research are natural language processing (NLP), [2] parsing and formal language theory, [3] [4] [5] and the philosophy of logic and language. He was a member of the faculty at UCLA from 1984 to 2016. His work includes the development of software for minimalist grammars (MGs) [6] [7] [8] and related systems.


Early life and education

Stabler received his Ph.D. from the Department of Linguistics and Philosophy at MIT in 1981.


Related Research Articles

<span class="mw-page-title-main">Evolutionary linguistics</span> Sociobiological approaches to linguistics

Evolutionary linguistics or Darwinian linguistics is a sociobiological approach to the study of language. Evolutionary linguists consider linguistics a subfield of sociobiology and evolutionary psychology. The approach is also closely linked with evolutionary anthropology, cognitive linguistics and biolinguistics. Studying languages as products of nature, it is interested in the biological origin and development of language. Evolutionary linguistics is contrasted with humanistic approaches, especially structural linguistics.

The following outline is provided as an overview and topical guide to linguistics:

<span class="mw-page-title-main">Recursion</span> Process of repeating items in a self-similar way

Recursion occurs when a thing is defined in terms of itself or of its type. Recursion is used in a variety of disciplines ranging from linguistics to logic. The most common application of recursion is in mathematics and computer science, where a function being defined is applied within its own definition. While this apparently defines an infinite number of instances, it is often done in such a way that no infinite loop or infinite chain of references can occur.
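
A minimal sketch in Python (the function and its name are illustrative, not taken from the article) shows a definition that applies itself, with a base case ensuring the chain of self-reference terminates:

    # Factorial defined recursively: the function is applied within its own definition.
    def factorial(n: int) -> int:
        if n == 0:
            return 1                     # base case: stops the chain of self-reference
        return n * factorial(n - 1)      # recursive case: factorial refers to itself

    print(factorial(5))  # prints 120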

In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.

Semantics is the study of reference, meaning, or truth. The term can be used to refer to subfields of several distinct disciplines, including philosophy, linguistics and computer science.

<span class="mw-page-title-main">Universal grammar</span> Theory of the genetic component of the language faculty

Universal grammar (UG), in modern linguistics, is the theory of the genetic component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued languages are so diverse that such universality is rare. It is a matter of empirical investigation to determine precisely what properties are universal and what linguistic capacities are innate.

Cognitive linguistics is an interdisciplinary branch of linguistics, combining knowledge and research from cognitive science, cognitive psychology, neuropsychology and linguistics. Models and theoretical accounts of cognitive linguistics are considered psychologically real, and research in cognitive linguistics aims to help understand cognition in general; it is seen as a road into the human mind.

<span class="mw-page-title-main">Generative grammar</span> Theory in linguistics

Generative grammar, or generativism, is a linguistic theory that regards linguistics as the study of a hypothesised innate grammatical structure. It is a biological or biologistic modification of earlier structuralist theories of linguistics, deriving ultimately from glossematics. Generative grammar considers grammar a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. It is a system of explicit rules that may apply repeatedly to generate an indefinite number of sentences, which can be as long as one wants them to be. One difference from structural and functional models is that in generative grammar the object is base-generated within the verb phrase. This purportedly cognitive structure is thought of as part of a universal grammar, a syntactic structure caused by a genetic mutation in humans.

<span class="mw-page-title-main">Image schema</span>

An image schema is a recurring structure within our cognitive processes which establishes patterns of understanding and reasoning. Within the framework of embodied cognition, image schemas are formed from our bodily interactions, from linguistic experience, and from historical context. The term was introduced in Mark Johnson's book The Body in the Mind and in case study 2 of George Lakoff's Women, Fire and Dangerous Things, and is further explained by Todd Oakley in The Oxford Handbook of Cognitive Linguistics, by Rudolf Arnheim in Visual Thinking, and in the collection From Perception to Meaning: Image Schemas in Cognitive Linguistics, edited by Beate Hampe and Joseph E. Grady.

Montague grammar is an approach to natural language semantics, named after the American logician Richard Montague. It is based on mathematical logic, especially higher-order predicate logic and lambda calculus, and makes use of the notions of intensional logic, via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.
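
A schematic example in the Montague style, using a simplified extensional fragment (the particular lexical entries are illustrative): the proper name "John" denotes a generalized quantifier, the intransitive verb "walks" denotes a one-place predicate, and the sentence meaning is obtained by function application in the lambda calculus:

$$[\![\textit{John}]\!] = \lambda P.\, P(j) \qquad [\![\textit{walks}]\!] = \lambda x.\, \mathrm{walk}(x)$$

$$[\![\textit{John walks}]\!] = (\lambda P.\, P(j))(\mathrm{walk}) = \mathrm{walk}(j)$$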

<span class="mw-page-title-main">Ray Jackendoff</span> American linguist and philosophy professor

Ray Jackendoff is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed to both the existence of an innate universal grammar and to giving an account of language that is consistent with the current understanding of the human mind and cognition.

Construction grammar is a family of theories within the field of cognitive linguistics which posit that constructions, or learned pairings of linguistic patterns with meanings, are the fundamental building blocks of human language. Constructions include words, morphemes, fixed expressions and idioms, and abstract grammatical rules such as the passive voice or the ditransitive. Any linguistic pattern is considered to be a construction as long as some aspect of its form or its meaning cannot be predicted from its component parts, or from other constructions that are recognized to exist. In construction grammar, every utterance is understood to be a combination of multiple different constructions, which together specify its precise meaning and form.

The cognitive revolution was an intellectual movement that began in the 1950s as an interdisciplinary study of the mind and its processes. It later became known collectively as cognitive science. The relevant areas of interchange were between the fields of psychology, linguistics, computer science, anthropology, neuroscience, and philosophy. The approaches used were developed within the then-nascent fields of artificial intelligence, computer science, and neuroscience. In the 1960s, the Harvard Center for Cognitive Studies and the Center for Human Information Processing at the University of California, San Diego were influential in developing the academic study of cognitive science. By the early 1970s, the cognitive movement had surpassed behaviorism as a psychological paradigm. Furthermore, by the early 1980s the cognitive approach had become the dominant line of research inquiry across most branches in the field of psychology.

Generative semantics was a research program in theoretical linguistics which held that syntactic structures are computed on the basis of meanings rather than the other way around. Generative semantics developed out of transformational generative grammar in the mid-1960s, but stood in opposition to it. The period in which the two research programs coexisted was marked by intense and often personal clashes now known as the linguistics wars. Its proponents included Haj Ross, Paul Postal, James McCawley, and George Lakoff, who dubbed themselves "The Four Horsemen of the Apocalypse".

In computer science, a grammar is informally called a recursive grammar if it contains production rules that are recursive, meaning that expanding a non-terminal according to these rules can eventually lead to a string that includes the same non-terminal again. Otherwise it is called a non-recursive grammar.
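
For illustration, consider the hypothetical grammar S → "a" S "b" | ε, whose single recursive rule generates the strings aⁿbⁿ; below is a minimal recursive-descent recognizer sketch in Python (not drawn from any cited work):

    # Recognizer for the toy recursive grammar  S -> 'a' S 'b' | ''  (strings a^n b^n).
    def derives(s: str) -> bool:
        if s == "":
            return True                     # S -> ''        (non-recursive rule)
        if len(s) >= 2 and s[0] == "a" and s[-1] == "b":
            return derives(s[1:-1])         # S -> 'a' S 'b' (recursive rule: S reappears)
        return False

    print(derives("aaabbb"))  # True
    print(derives("aab"))     # False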

Donkey sentences are sentences that contain a pronoun with clear meaning but whose syntactical role in the sentence poses challenges to grammarians. Such sentences defy straightforward attempts to generate their formal language equivalents. The difficulty is with understanding how English speakers parse such sentences.
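
The classic example is "Every farmer who owns a donkey beats it": the indefinite "a donkey" would normally translate as an existential quantifier, yet the pronoun "it" behaves as if bound by a universal one, so the sentence's truth conditions are standardly rendered in first-order logic as

$$\forall x\, \forall y\, \big( (\mathrm{farmer}(x) \wedge \mathrm{donkey}(y) \wedge \mathrm{owns}(x,y)) \rightarrow \mathrm{beats}(x,y) \big)$$

a reading that a naive compositional translation fails to deliver.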

Linguistics is the scientific study of human language. It is called a scientific study because it entails a comprehensive, systematic, objective, and precise analysis of all aspects of language, particularly its nature and structure. Linguistics is concerned with both the cognitive and social aspects of language. It is considered a scientific field as well as an academic discipline; it has been classified as a social science, natural science, cognitive science, or part of the humanities.

Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.
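
As a toy illustration of composing meanings from parts (the miniature lexicon and predicate names below are my own assumptions, not drawn from the article), denotations can be modelled as functions combined by application:

    # Toy compositional semantics: a transitive verb combines first with its object,
    # then with its subject, mirroring how a sentence meaning is built from its parts.
    loves = lambda y: lambda x: f"loves({x},{y})"   # type e -> (e -> t)
    john, mary = "john", "mary"                     # proper names, type e

    vp = loves(mary)        # denotation of "loves Mary"
    s = vp(john)            # denotation of "John loves Mary"
    print(s)                # loves(john,mary)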

<span class="mw-page-title-main">Formalism (linguistics)</span>

In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar which are especially designed to produce grammatically correct strings of words; or the likes of Functional Discourse Grammar which builds on predicate logic.

Morten H. Christiansen is a Danish cognitive scientist known for his work on the evolution of language and connectionist modeling of human language acquisition. He is Professor in the Department of Psychology and Co-Director of the Cognitive Science Program at Cornell University, as well as Senior Scientist at Haskins Laboratories and Professor in the School of Communication and Culture at Aarhus University. His research has produced evidence for considering language to be a cultural system shaped by general-purpose cognitive and learning mechanisms, rather than by innate language-specific mental structures.

References

  1. " Oak Park grad writes book on the college admissions process". The Acorn, by Stephanie Bertholdo
  2. Montserrat Sanz; Itziar Laka; Michael K. Tanenhaus (29 August 2013). Language Down the Garden Path: The Cognitive and Biological Basis for Linguistic Structures. Oxford University Press. pp. 101–. ISBN   978-0-19-967713-9.
  3. Robert Levine Associate Professor of Linguistics Ohio State University (11 February 1992). Formal Grammar : Theory and Implementation: Theory and Implementation. Oxford University Press, USA. pp. 7–. ISBN   978-0-19-534492-9.
  4. Kirk Hazen (25 August 2014). An Introduction to Language. John Wiley & Sons. pp. 286–. ISBN   978-0-470-65896-3.
  5. "Book Reviews: The MIT Encyclopedia of the Cognitive Sciences". aclweb.org.
  6. The Mathematics of Language. Walter de Gruyter. 2003. pp. 414–. ISBN   978-3-11-017620-9.
  7. Johan Van Benthem; Amitabha Gupta; Rohit Parikh (2 April 2011). Proof, Computation and Agency: Logic at the Crossroads. Springer Science & Business Media. pp. 267–. ISBN   978-94-007-0080-2.
  8. Computer Design: The Design and Application of Digital Circuits, Equipment & Systems. 1985.