Alien languages, i.e. languages of extraterrestrial beings, are a hypothetical subject, since none have been encountered so far. [1] Research into such hypothetical languages is variously called exolinguistics, xenolinguistics [2] or astrolinguistics. [3] [4] A group of prominent linguists and animal communication scientists, including Noam Chomsky, have examined such hypothetical languages in the book Xenolinguistics: Towards a Science of Extraterrestrial Language, edited by astrobiologist Douglas Vakoch and linguist Jeffrey Punske. [5] The question of what form alien languages might take, and whether humans could recognize and translate them, has been part of linguistics and language studies courses, e.g., at Bowling Green State University (2001). [6]
Noam Chomsky (1983), starting with his hypothesis of a genetically predetermined universal grammar of human languages, held that it would be impossible for a human to naturally learn an alien language, because it would most probably violate the universal grammar inborn in humans. Humans would instead have to study an alien language through a slow process of discovery, much as scientists do research in, say, physics. [7]
Linguist Keren Rice posits that basic communication between humans and aliens should be possible, unless "the things that we think are common to languages—situating in time [and] space, talking about participants, etc.—are so radically different that the human language provides no starting point for it." [8]
Jessica Coon, a professor of linguistics at McGill University, was consulted for the linguistic aspect of the 2016 film Arrival. While acknowledging that the graphical language in the film was art without linguistic meaning, she stated that the film was a fairly accurate portrayal of the approach human linguists would use in trying to understand an alien language. [9]
Laurance Doyle and others have suggested an application of Zipf's law for detection of alien language in the search for extraterrestrial intelligence. [10] [11]
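The Zipf's-law screen mentioned above can be sketched in a few lines. The idea, roughly, is that word (or token) frequencies in human-language text fall off with rank such that log-frequency against log-rank has a slope near -1; a candidate signal whose slope is far from that is less language-like. This is a minimal illustration, not Doyle's actual methodology; the function name, the toy corpus, and the plain least-squares fit are my own choices.

```python
import math
from collections import Counter

def zipf_slope(tokens):
    """Least-squares slope of log(frequency) vs. log(rank).

    For human-language text, Zipf's law predicts a slope near -1.
    A slope far from -1 suggests the token stream is less
    language-like (e.g. uniform noise gives a slope near 0).
    """
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Toy "signal": a tiny English fragment, tokenized on whitespace.
text = ("the quick fox saw the slow fox and the quick dog "
        "saw the slow dog near the den").split()
print(round(zipf_slope(text), 2))
```

On a corpus this small the fitted slope is noisy, but it is reliably negative for natural text; in practice one would need a much longer token stream (and a defensible segmentation of the signal into tokens) before reading anything into the value.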
Solomon W. Golomb posited that building radio transmitters or other devices capable of interstellar communication, or any other technology beyond the most rudimentary tools, requires knowledge accumulated over the course of many generations. Since this requires that those who have learned knowledge from others can keep passing it on even after the knowledge's originators are dead, Golomb reasoned that any beings capable of building civilizations must have an innate understanding that information retains its meaning no matter who utters it: they cannot block information out based on the generation of the messenger, or deem the same words acceptable or unacceptable depending on who utters them. Golomb held that this ability must be innate, since it is a necessary condition for accumulating information into culture in the first place, and something needed to form culture from the beginning cannot itself be an effect of culture. He argued that this shared ability would create a common linguistic ground assisting humans in learning extraterrestrial languages. [12]
Ian Roberts, a professor of linguistics at the University of Cambridge, says: "We are the only species that have language in the sense of an open-ended system which can be used to express anything you want to express". Roberts sits on the Advisory Council of Messaging Extraterrestrial Intelligence (METI), an organisation founded in 2015 to send messages from Earth to outer space in the hope of receiving a reply.
Charles Francis Hockett was an American linguist who developed many influential ideas in American structuralist linguistics. He represents the post-Bloomfieldian phase of structuralism often referred to as "distributionalism" or "taxonomic structuralism". His academic career spanned over half a century at Cornell and Rice universities. Hockett was also a firm believer of linguistics as a branch of anthropology, making contributions that were significant to the field of anthropology as well.
Language is a structured system of communication that consists of grammar and vocabulary. It is the primary means by which humans convey meaning, both in spoken and signed forms, and may also be conveyed through writing. Human language is characterized by its cultural and historical diversity, with significant variations observed between cultures and across time. Human languages possess the properties of productivity and displacement, which enable the creation of an infinite number of sentences, and the ability to refer to objects, events, and ideas that are not immediately present in the discourse. The use of human language relies on social convention and is acquired through learning.
Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued languages are so diverse that such universality is rare, and the theory of universal grammar remains controversial among linguists.
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) was the earliest model of grammar proposed within the research tradition of generative grammar. Like current generative theories, it treated grammar as a system of formal rules that generate all and only grammatical sentences of a given language. What was distinctive about transformational grammar was that it posited transformation rules which mapped a sentence's deep structure to its pronounced form. For example, in many variants of transformational grammar, the English active voice sentence "Emma saw Daisy" and its passive counterpart "Daisy was seen by Emma" would share a common deep structure generated by phrase structure rules. They would differ in that only the latter would have its structure modified by a passivization transformation rule.
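The active/passive pair above can be illustrated as a toy program. This is only a caricature of a transformation rule, not any actual transformational-grammar formalism: the deep structure is reduced to a (subject, verb, object) triple, and the function names and the hand-supplied participle form are my own simplifications.

```python
# Toy illustration: one "deep structure" triple, two surface forms.
deep_structure = ("Emma", "saw", "Daisy")  # (subject, verb, object)

def active(subject, verb_past, obj):
    # Surface form with no transformation applied.
    return f"{subject} {verb_past} {obj}"

def passivize(subject, verb_participle, obj):
    # Toy passivization transformation: promote the object to
    # subject position, demote the subject into a by-phrase.
    return f"{obj} was {verb_participle} by {subject}"

print(active(*deep_structure))              # Emma saw Daisy
print(passivize("Emma", "seen", "Daisy"))   # Daisy was seen by Emma
```

The point of the sketch is only that both surface strings are derived from one shared underlying representation, with the passive differing by a single structure-changing rule.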
Cognitive linguistics is an interdisciplinary branch of linguistics, combining knowledge and research from cognitive science, cognitive psychology, neuropsychology and linguistics. Models and theoretical accounts of cognitive linguistics are considered as psychologically real, and research in cognitive linguistics aims to help understand cognition in general and is seen as a road into the human mind.
The Language Instinct: How the Mind Creates Language is a 1994 book by Steven Pinker, written for a general audience. Pinker argues that humans are born with an innate capacity for language. He deals sympathetically with Noam Chomsky's claim that all human language shows evidence of a universal grammar, but dissents from Chomsky's skepticism that evolutionary theory can explain the human language instinct.
Deep structure and surface structure are concepts used in linguistics, specifically in the study of syntax in the Chomskyan tradition of transformational generative grammar.
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
A linguistic universal is a pattern that occurs systematically across natural languages, potentially true for all of them. For example: "All languages have nouns and verbs", or "If a language is spoken, it has consonants and vowels". Research in this area of linguistics is closely tied to the study of linguistic typology, and intends to reveal generalizations across languages, likely tied to cognition, perception, or other abilities of the mind. The field originates from discussions influenced by Noam Chomsky's proposal of a Universal Grammar, but was largely pioneered by the linguist Joseph Greenberg, who derived a set of forty-five basic universals, mostly dealing with syntax, from a study of some thirty languages.
Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that for particular languages are either turned on or off. For example, the position of heads in phrases is determined by a parameter. Whether a language is head-initial or head-final is regarded as a parameter which is either on or off for particular languages. Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
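The head-directionality parameter described above can be caricatured as a boolean switch. This is a deliberately minimal sketch, not a claim about how the framework formalizes parameters; the function name and example phrases are my own.

```python
def phrase(head, complement, head_initial=True):
    """Linearize a head and its complement under the toy
    head-directionality parameter: head-initial order (as in an
    English verb phrase, "eat apples") when True, head-final order
    (as in the corresponding Japanese verb phrase) when False."""
    if head_initial:
        return f"{head} {complement}"
    return f"{complement} {head}"

print(phrase("eat", "apples", head_initial=True))   # eat apples
print(phrase("eat", "apples", head_initial=False))  # apples eat
```

In the principles-and-parameters picture, a child acquiring a language would fix such a switch from the input data, while the principle that heads combine with complements at all is taken to be universal.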
Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax from semantics.
In linguistics and social sciences, markedness is the state of standing out as nontypical or divergent as opposed to regular or common. In a marked–unmarked relation, one term of an opposition is the broader, dominant one. The dominant default or minimum-effort form is known as unmarked; the other, secondary one is marked. In other words, markedness involves the characterization of a "normal" linguistic unit against one or more of its possible "irregular" forms.
The term Cartesian linguistics was coined by Noam Chomsky in his book Cartesian Linguistics: A Chapter in the History of Rationalist Thought (1966). The adjective "Cartesian" pertains to René Descartes, a prominent 17th-century philosopher. As well as Descartes, Chomsky surveys other examples of rationalist thought in 17th-century linguistics, in particular the Port-Royal Grammar (1660), which foreshadows some of his own ideas concerning universal grammar.
Biolinguistics can be defined as the study of the biology and evolution of language. It is highly interdisciplinary, drawing on fields such as biology, linguistics, psychology, anthropology, mathematics, and neurolinguistics to explain the formation of language. It seeks to yield a framework by which we can understand the fundamentals of the faculty of language. The field was first introduced in 1971, at an international meeting at the Massachusetts Institute of Technology (MIT), by Massimo Piattelli-Palmarini, professor of Linguistics and Cognitive Science at the University of Arizona.
Traditional transmission is one of the 13 design features of language developed by anthropologist Charles F. Hockett to distinguish the features of human language from those of animal communication. Critically, animal communication might display some of the thirteen features but never all of them. Traditional transmission is typically considered one of the crucial characteristics distinguishing human from animal communication, and it provides significant support for the argument that language is learned socially within a community rather than being inborn, i.e., acquired through genetic inheritance.
The history of linguistics in the United States began with efforts to gain a greater understanding of humans and language. By searching for a common 'parent language' through similarities among different languages, a number of connections were discovered. Many contributors and new ideas helped shape the study of linguistics in the United States into what it is today. In the 1920s, linguistics focused on grammatical analysis and grammatical structure, especially of languages indigenous to North America, such as Chippewa and Apache. In addition to the scholars who paved the way for linguistics in the United States, the Linguistic Society of America is a group that has contributed to linguistic research in America. The United States has long been known for its diverse collection of linguistic features and dialects spread across the country. In recent years, the study of linguistics in the United States has broadened to include nonstandard varieties of English, such as Chicano English and African American English, as well as the question of whether language perpetuates inequalities.
In linguistics, the innateness hypothesis, also known as the nativist hypothesis, holds that humans are born with at least some knowledge of linguistic structure. On this hypothesis, language acquisition involves filling in the details of an innate blueprint rather than being an entirely inductive process. The hypothesis is one of the cornerstones of generative grammar and related approaches in linguistics. Arguments in favour include the poverty of the stimulus, the universality of language acquisition, as well as experimental studies on learning and learnability. However, these arguments have been criticized, and the hypothesis is widely rejected in other traditions such as usage-based linguistics. The term was coined by Hilary Putnam in reference to the views of Noam Chomsky.
Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.
Lectures on Government and Binding: The Pisa Lectures (LGB) is a book by the linguist Noam Chomsky, published in 1981. It is based on the lectures Chomsky gave at the GLOW conference and workshop held at the Scuola Normale Superiore in Pisa, Italy, in 1979. In this book, Chomsky presented his government and binding theory of syntax. It had great influence on syntactic research in the early 1980s, especially among linguists working within the transformational grammar framework.
Distributionalism was a general theory of language and a discovery procedure for establishing elements and structures of language based on observed usage. The purpose of distributionalism was to provide a scientific basis for syntax as independent of meaning. Zellig Harris defined 'distribution' as follows:
“The DISTRIBUTION of an element is the total of all environments in which it occurs, i.e. the sum of all the (different) positions of an element relative to the occurrence of other elements[.]”
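Harris's definition is concrete enough to compute directly. The sketch below, a minimal illustration with names and the toy corpus of my own choosing, takes an element's distribution to be the set of its immediate environments (the preceding and following elements); Harris's actual procedure considered environments more broadly, not just adjacent positions.

```python
from collections import defaultdict

def distributions(tokens):
    """Map each element to a Harris-style distribution: the set of
    environments (preceding element, following element) in which it
    occurs. None marks an utterance boundary."""
    envs = defaultdict(set)
    for i, tok in enumerate(tokens):
        prev = tokens[i - 1] if i > 0 else None
        nxt = tokens[i + 1] if i < len(tokens) - 1 else None
        envs[tok].add((prev, nxt))
    return dict(envs)

corpus = "the cat sleeps the dog sleeps".split()
dist = distributions(corpus)

# Elements that share environments ("cat" and "dog" both occur in
# "the __ sleeps") are candidates for the same distributional class.
print(dist["cat"] & dist["dog"])  # {('the', 'sleeps')}
```

Grouping elements by overlapping distributions is exactly the meaning-independent discovery procedure distributionalism aimed at: syntactic classes emerge from where elements occur, not from what they mean.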