A linguistic universal is a pattern that occurs systematically across natural languages, potentially true for all of them. For example, "All languages have nouns and verbs", or "If a language is spoken, it has consonants and vowels". Research in this area of linguistics is closely tied to the study of linguistic typology and aims to reveal generalizations across languages, likely tied to cognition, perception, or other abilities of the mind. The field originates in discussions influenced by Noam Chomsky's proposal of a universal grammar, but was largely pioneered by the linguist Joseph Greenberg, who derived a set of forty-five basic universals, mostly dealing with syntax, from a study of some thirty languages.
Though there has been significant research into linguistic universals, some linguists, including Nicolas Evans and Stephen C. Levinson, have more recently argued against the existence of absolute linguistic universals shared across all languages. These linguists cite problems such as ethnocentrism among cognitive scientists, and thus linguists, as well as insufficient research into the world's languages in discussions of linguistic universals, and instead characterize these similarities as simply strong tendencies.
Linguists distinguish between two kinds of universals: absolute (opposite: statistical, often called tendencies) and implicational (opposite: non-implicational). Absolute universals apply to every known language and are quite few in number; an example is "All languages have pronouns". An implicational universal applies to languages with a particular feature that is always accompanied by another feature, such as "If a language has trial grammatical number, it also has dual grammatical number", while non-implicational universals just state the existence (or non-existence) of one particular feature.
Also in contrast to absolute universals are tendencies, statements that may not be true for all languages but are nevertheless far too common to be the result of chance.[1] They also have implicational and non-implicational forms. An example of the latter would be "The vast majority of languages have nasal consonants".[2] However, most tendencies, like their universal counterparts, are implicational. For example, "With overwhelmingly greater-than-chance frequency, languages with normal SOV order are postpositional". Strictly speaking, a tendency is not a kind of universal, but exceptions to most statements called universals can be found. For example, Latin is an SOV language with prepositions. Often it turns out that such exceptional languages are undergoing a shift from one type of language to another. In the case of Latin, its descendant Romance languages switched to SVO, a much more common order among prepositional languages.
Universals may also be bidirectional or unidirectional. In a bidirectional universal, two features each imply the existence of the other. For example, languages with postpositions usually have SOV order, and likewise SOV languages usually have postpositions. The implication works both ways, and thus the universal is bidirectional. By contrast, in a unidirectional universal the implication works only one way. Languages that place relative clauses before the noun they modify also usually have SOV order, so pre-nominal relative clauses imply SOV. On the other hand, SOV languages worldwide show little preference for pre-nominal relative clauses, and thus SOV implies little about the order of relative clauses. As the implication works only one way, the proposed universal is a unidirectional one.
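The logic of unidirectional and bidirectional universals can be made concrete with a small check over a feature table. This is a minimal sketch: the four-language dataset and the helper `implication_holds` are illustrative inventions, not a real typological database, though the feature values shown for these languages follow the standard descriptions.

```python
# Toy feature table (illustrative, not a real typological database).
languages = {
    "Japanese": {"order": "SOV", "adpositions": "post", "rel_clause": "pre"},
    "Turkish":  {"order": "SOV", "adpositions": "post", "rel_clause": "pre"},
    "English":  {"order": "SVO", "adpositions": "pre",  "rel_clause": "post"},
    "Persian":  {"order": "SOV", "adpositions": "pre",  "rel_clause": "post"},
}

def implication_holds(data, antecedent, consequent):
    """True if every language satisfying `antecedent` also satisfies `consequent`."""
    return all(consequent(f) for f in data.values() if antecedent(f))

# Pre-nominal relative clauses -> SOV: holds in this sample.
forward = implication_holds(languages,
                            lambda f: f["rel_clause"] == "pre",
                            lambda f: f["order"] == "SOV")

# SOV -> pre-nominal relative clauses: fails (Persian), so the
# universal is unidirectional, not bidirectional.
backward = implication_holds(languages,
                             lambda f: f["order"] == "SOV",
                             lambda f: f["rel_clause"] == "pre")

print(forward, backward)  # True False
```

A bidirectional universal is simply one for which both directions of the check come out true over the attested languages.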
Linguistic universals in syntax are sometimes held up as evidence for universal grammar (although epistemological arguments are more common). Other explanations for linguistic universals have been proposed, for example, that linguistic universals tend to be properties of language that aid communication. If a language were to lack one of these properties, it has been argued, it would probably soon evolve into a language having that property.[3]
Michael Halliday has argued for a distinction between descriptive and theoretical categories in resolving the matter of the existence of linguistic universals, a distinction he takes from J.R. Firth and Louis Hjelmslev. He argues that "theoretical categories, and their inter-relations construe an abstract model of language...; they are interlocking and mutually defining". Descriptive categories, by contrast, are those set up to describe particular languages. He argues that "When people ask about 'universals', they usually mean descriptive categories that are assumed to be found in all languages. The problem is that there is no mechanism for deciding how much alike descriptive categories from different languages have to be before they are said to be 'the same thing'".[4]
Noam Chomsky's work on the innateness hypothesis, as it pertains to our ability to rapidly learn any language without formal instruction and with limited input (what he refers to as the poverty of the stimulus), is what began research into linguistic universals. This led to his proposal of a shared underlying grammatical structure for all languages, a concept he called universal grammar (UG), which he claimed must exist somewhere in the human brain prior to language acquisition. Chomsky defines UG as "the system of principles, conditions, and rules that are elements or properties of all human languages... by necessity."[5] He states that UG expresses "the essence of human language,"[5] and believes that the structure-dependent rules of UG allow humans to interpret and create an infinite number of novel grammatical sentences. Chomsky asserts that UG is the underlying connection between all languages and that the various differences between languages are all relative with respect to UG. He claims that UG is essential to our ability to learn languages, and thus uses it as evidence in a discussion of how to form a potential 'theory of learning' for how humans acquire all or most of their cognitive processes throughout their lives. The discussion of Chomsky's UG, its innateness, and its connection to how humans learn language has been one of the most widely covered topics in linguistics to date. However, linguists remain divided between those who support Chomsky's claims of UG and those who argue against the existence of an underlying shared grammatical structure that can account for all languages.
In semantics, research into linguistic universals has taken place in a number of ways. Some linguists, starting with Gottfried Leibniz, have pursued the search for a hypothetical irreducible semantic core of all languages. A modern variant of this approach can be found in the natural semantic metalanguage of Anna Wierzbicka and associates; see, for example,[6][7] Other lines of research suggest cross-linguistic tendencies to use body part terms metaphorically as adpositions,[8] or tendencies to have morphologically simple words for cognitively salient concepts.[9] The human body, being a physiological universal, provides an ideal domain for research into semantic and lexical universals. In a seminal study, Cecil H. Brown (1976) proposed a number of universals in the semantics of body part terminology, including the following: in any language, there will be distinct terms for BODY, HEAD, ARM, EYES, NOSE, and MOUTH; if there is a distinct term for FOOT, there will be a distinct term for HAND; similarly, if there are terms for INDIVIDUAL TOES, then there are terms for INDIVIDUAL FINGERS. Subsequent research has shown that most of these features have to be considered cross-linguistic tendencies rather than true universals. Several languages, such as Tidore and Kuuk Thaayorre, lack a general term meaning 'body'. On the basis of such data it has been argued that the highest level in the partonomy of body part terms would be the word for 'person'.[10]
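Brown's body-part claims have the same implicational shape as the syntactic universals above, which makes them easy to state as testable rules. The two toy lexicons below are hypothetical entries for illustration only; `satisfies` simply encodes "a distinct term for A implies a distinct term for B" as material implication.

```python
# Hypothetical lexicons: concept -> does the language have a distinct term?
lexicons = {
    "lang_a": {"FOOT": True,  "HAND": True, "TOES": False, "FINGERS": True},
    "lang_b": {"FOOT": False, "HAND": True, "TOES": False, "FINGERS": False},
}

def satisfies(lexicon, antecedent, consequent):
    """Brown-style rule: a distinct term for `antecedent` implies one for `consequent`."""
    return (not lexicon[antecedent]) or lexicon[consequent]

# Collect languages violating either FOOT->HAND or TOES->FINGERS.
violations = [name for name, lex in lexicons.items()
              if not (satisfies(lex, "FOOT", "HAND")
                      and satisfies(lex, "TOES", "FINGERS"))]
print(violations)  # []
```

Framed this way, the later finding that languages like Tidore lack a general 'body' term amounts to adding a language whose lexicon falsifies the BODY rule, demoting it from universal to tendency.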
Some other examples of proposed linguistic universals in semantics include the idea that all languages possess words meaning '(biological) mother' and 'you (second-person singular pronoun)', as well as statistical tendencies in the meanings of basic color terms relative to the number of color terms a language uses. Some theories of color naming suggest that if a language possesses only two terms for describing color, their respective meanings will be 'black' and 'white' (or perhaps 'dark' and 'light'), and that if a language possesses more than two color terms, the additional terms will follow trends related to the focal colors, which are determined by the physiology of how color is perceived rather than by linguistics. Thus, if a language possesses three color terms, the third will mean 'red', and if a language possesses four color terms, the next will mean 'yellow' or 'green'. If there are five color terms, then both 'yellow' and 'green' are added; if six, then 'blue' is added, and so on.
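The staged ordering just described can be sketched as a simple lookup. The function name and the single "yellow or green" slot (marking that the theory does not fix which of the two is added fourth) are illustrative conventions of this sketch, not part of any formal statement of the theory.

```python
# Implicational color-term hierarchy as described above, for 2-6 terms.
HIERARCHY = ["black", "white", "red", "yellow or green",
             "the other of yellow/green", "blue"]

def predicted_color_meanings(n_terms):
    """Color meanings predicted for a language with n_terms basic color terms."""
    if not 2 <= n_terms <= 6:
        raise ValueError("this sketch covers only the 2-6 term stages")
    return HIERARCHY[:n_terms]

print(predicted_color_meanings(3))  # ['black', 'white', 'red']
```

The point of the encoding is that the hierarchy is implicational: knowing only how many basic color terms a language has predicts which meanings they cover.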
Nicolas Evans and Stephen C. Levinson are two linguists who have written against the existence of linguistic universals, with particular mention of the issues with Chomsky's proposal of a universal grammar. They argue that across the 6,000-8,000 languages spoken around the world today, there are at best strong tendencies rather than universals.[11] In their view, these arise primarily because many languages are connected to one another through shared historical backgrounds or common lineage, such as the Romance languages of Europe, all derived from Latin, and can therefore be expected to share some core similarities. Evans and Levinson believe that linguists who have previously proposed or supported concepts associated with linguistic universals have done so "under the assumption that most languages are English-like in their structure"[11] and only after analyzing a limited range of languages. They identify ethnocentrism, the fact "that most cognitive scientists, linguists included, speak only familiar European languages, all close cousins in structure,"[11] as a possible influence on the various issues they identify in assertions made about linguistic universals. With regard to Chomsky's universal grammar, these linguists claim that the explanations of the structure and rules applied to UG are either false, owing to a lack of detail about the various constructions used when creating or interpreting a grammatical sentence, or unfalsifiable, owing to the vague and oversimplified assertions made by Chomsky. Instead, Evans and Levinson highlight the vast diversity that exists among the many languages spoken around the world to advocate further investigation into the many cross-linguistic variations that do exist.
Their article promotes linguistic diversity by citing multiple examples of variation in how "languages can be structured at every level: phonetic, phonological, morphological, syntactic and semantic."[11] They claim that increased understanding and acceptance of linguistic diversity, in place of what they regard as false claims of linguistic universals (better described as strong tendencies), will lead to more enlightening discoveries in the study of human cognition.
Functional linguistics is an approach to the study of language characterized by systematically taking into account the speaker's and the hearer's side, and the communicative needs of the speaker and of the given language community. Linguistic functionalism emerged in the 1920s and 1930s from Ferdinand de Saussure's systematic structuralist approach to language (1916).
Joseph Harold Greenberg was an American linguist, known mainly for his work concerning linguistic typology and the genetic classification of languages.
Language is a structured system of communication that consists of grammar and vocabulary. It is the primary means by which humans convey meaning, both in spoken and signed forms, and may also be conveyed through writing. Human language is characterized by its cultural and historical diversity, with significant variations observed between cultures and across time. Human languages possess the properties of productivity and displacement, which enable the creation of an infinite number of sentences, and the ability to refer to objects, events, and ideas that are not immediately present in the discourse. The use of human language relies on social convention and is acquired through learning.
The following outline is provided as an overview and topical guide to linguistics:
The Proto-Human language, also known as Proto-Sapiens or Proto-World, is the hypothetical direct genetic predecessor of all human languages.
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
Linguistic relativity asserts that language influences worldview or cognition. One form of linguistic relativity, linguistic determinism, regards peoples' languages as determining and influencing the scope of cultural perceptions of their surrounding world.
Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG. The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established, as some linguists have argued languages are so diverse that such universality is rare, and the theory of universal grammar remains controversial among linguists.
Linguistic typology is a field of linguistics that studies and classifies languages according to their structural features to allow their comparison. Its aim is to describe and explain the structural diversity and the common properties of the world's languages. Its subdisciplines include, but are not limited to, phonological typology, which deals with sound features; syntactic typology, which deals with word order and form; lexical typology, which deals with language vocabulary; and theoretical typology, which aims to explain the universal tendencies.
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.
Anna Wierzbicka is a Polish linguist who is Emeritus Professor at the Australian National University, Canberra. Brought up in Poland, she graduated from Warsaw University and emigrated to Australia in 1972, where she has lived since. With over twenty published books, many of which have been translated into other languages, she is a prolific writer.
Natural semantic metalanguage (NSM) is a linguistic theory that reduces lexicons down to a set of semantic primitives. It is based on the conception of Polish professor Andrzej Bogusławski. The theory was formally developed by Anna Wierzbicka at Warsaw University and later at the Australian National University in the early 1970s, and Cliff Goddard at Australia's Griffith University.
Principles and parameters is a framework within generative linguistics in which the syntax of a natural language is described in accordance with general principles and specific parameters that for particular languages are either turned on or off. For example, the position of heads in phrases is determined by a parameter. Whether a language is head-initial or head-final is regarded as a parameter which is either on or off for particular languages. Principles and parameters was largely formulated by the linguists Noam Chomsky and Howard Lasnik. Many linguists have worked within this framework, and for a period of time it was considered the dominant form of mainstream generative linguistics.
In linguistics and social sciences, markedness is the state of standing out as nontypical or divergent as opposed to regular or common. In a marked–unmarked relation, one term of an opposition is the broader, dominant one. The dominant default or minimum-effort form is known as unmarked; the other, secondary one is marked. In other words, markedness involves the characterization of a "normal" linguistic unit against one or more of its possible "irregular" forms.
Two types of language change can be characterized as linguistic drift: a unidirectional short-term and cyclic long-term drift.
In the field of psychology, nativism is the view that certain skills or abilities are "native" or hard-wired into the brain at birth. This is in contrast to the "blank slate" or tabula rasa view, which states that the brain has inborn capabilities for learning from the environment but does not contain content such as innate beliefs. This factor contributes to the ongoing nature versus nurture dispute, one born of the current difficulty of reverse engineering the subconscious operations of the brain, especially the human brain.
The linguistics wars were extended disputes among American theoretical linguists that occurred mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates started in 1967 when linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, self-dubbed the "Four Horsemen of the Apocalypse", proposed an alternative approach in which the relation between semantics and syntax is viewed differently, treating deep structures as meaning rather than syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks, generative semantics and interpretive semantics.
The concept of linguistic relativity concerns the relationship between language and thought, specifically whether language influences thought, and, if so, how. This question has led to research in multiple disciplines—including anthropology, cognitive science, linguistics, and philosophy. Among the most debated theories in this area of work is the Sapir–Whorf hypothesis. This theory states that the language a person speaks will affect the way that this person thinks. The theory varies between two main proposals: that language structure determines how individuals perceive the world and that language structure influences the world view of speakers of a given language but does not determine it.
Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.
In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar which are especially designed to produce grammatically correct strings of words; or the likes of Functional Discourse Grammar which builds on predicate logic.