In lexical semantics, opposites are words lying in an inherently incompatible binary relationship. For example, something that is even entails that it is not odd. It is referred to as a 'binary' relationship because there are two members in a set of opposites. The relationship between opposites is known as opposition. A member of a pair of opposites can generally be determined by the question What is the opposite of X?
The term antonym (and the related antonymy) is commonly taken to be synonymous with opposite, but antonym also has other, more restricted meanings. Graded (or gradable) antonyms are word pairs whose meanings are opposite and lie on a continuous spectrum (hot, cold). Complementary antonyms are word pairs whose meanings are opposite but do not lie on a continuous spectrum (push, pull). Relational antonyms are word pairs where opposite makes sense only in the context of the relationship between the two meanings (teacher, pupil). These more restricted meanings may not apply in all scholarly contexts: Lyons (1968, 1977) defines antonym to mean gradable antonyms, and Crystal (2003) warns that antonymy and antonym should be regarded with care.
Opposition is a semantic relation in which one word has a sense or meaning that negates or, in terms of a scale, is distant from a related word. Some words lack a lexical opposite due to an accidental gap in the language's lexicon. For instance, while the word "devout" has no direct opposite, it is easy to conceptualize a scale of devoutness, where "devout" lies at the positive end with a missing counterpart at the negative end. In certain cases, opposites can be formed with prefixes like "un-" or "non-," with varying levels of naturalness. For example, "undevout" is found in Webster's 1828 dictionary, while the prefix pattern of "non-person" could theoretically extend to "non-platypus."
Conversely, some words appear to be derived from a prefix suggesting opposition, yet the root term does not exist. An example is "inept," which seems to be "in-" + *"ept," although the word "ept" itself does not exist. Such words are known as unpaired words. Opposites may be viewed as a special type of incompatibility. [1] Words that are incompatible create the following type of entailment (where X is a given word and Y is a different word incompatible with word X): [2]
The sentence A is X entails the sentence A is not Y
An example of an incompatible pair of words is cat : dog:
It's a cat entails It's not a dog
This incompatibility is also found in the opposite pairs fast : slow and stationary : moving, as can be seen below:
It's fast entails It's not slow [5]
It's stationary entails It's not moving
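The pattern behind these entailments can be made concrete with a short, purely illustrative Python sketch; the pair list and the function name below are invented for this example.

# Illustrative sketch: the pattern "A is X entails A is not Y"
# for incompatible word pairs. The pair list is toy data.
INCOMPATIBLE_PAIRS = {
    ("cat", "dog"),
    ("fast", "slow"),
    ("stationary", "moving"),
}

def entails_not(x, y):
    """Return True if asserting x of something entails denying y of it."""
    return (x, y) in INCOMPATIBLE_PAIRS or (y, x) in INCOMPATIBLE_PAIRS

print(entails_not("fast", "slow"))   # True: "It's fast" entails "It's not slow"
print(entails_not("fast", "heavy"))  # False: fast and heavy are not incompatible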
Cruse (2004) identifies some basic characteristics of opposites.
Some planned languages make abundant use of such derivational prefixes to avoid multiplying their vocabulary. Esperanto has mal- (compare bona = "good" and malbona = "bad"), Damin has kuri- (tjitjuu "small", kuritjitjuu "large"), and Newspeak has un- (as in ungood, "bad").
Some classes of opposites include:
An antonym is one of a pair of words with opposite meanings. Each word in the pair is the antithesis of the other. A word may have more than one antonym. There are three categories of antonyms identified by the nature of the relationship between the opposed meanings.
A gradable antonym is one of a pair of words with opposite meanings where the two meanings lie on a continuous spectrum. Temperature is such a continuous spectrum so hot and cold, two meanings on opposite ends of the spectrum, are gradable antonyms. Other examples include: heavy : light, fat : skinny, dark : light, young : old, early : late, empty : full, dull : interesting.
A complementary antonym, sometimes called a binary or contradictory antonym (Aarts, Chalker & Weiner 2014), is one of a pair of words with opposite meanings, where the two meanings do not lie on a continuous spectrum. There is no continuous spectrum between odd and even but they are opposite in meaning and are therefore complementary antonyms. Other examples include: mortal : immortal, exit : entrance, exhale : inhale, occupied : vacant.
A relational antonym is one of a pair of words that refer to a relationship from opposite points of view. There is no lexical opposite of teacher, but teacher and pupil are opposite within the context of their relationship. This makes them relational antonyms. Other examples include: husband : wife, doctor : patient, predator : prey, teach : learn, servant : master, come : go, parent : child.
An auto-antonym is a word that can have opposite meanings in different contexts or under separate definitions: for example, cleave (to split apart, or to cling together), sanction (to permit, or to penalize), and dust (to remove fine particles, or to sprinkle fine particles over something).
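The three main antonym categories described above can be summarized in a minimal, purely illustrative Python sketch; the data and the function name are invented for this example.

# Toy data: antonym pairs labelled with the category they belong to.
ANTONYMS = {
    ("hot", "cold"): "gradable",        # opposite ends of a continuous scale
    ("odd", "even"): "complementary",   # no middle ground between the two
    ("teacher", "pupil"): "relational", # opposite only within a relationship
}

def antonym_type(a, b):
    """Look up the category of an antonym pair, in either order."""
    return ANTONYMS.get((a, b)) or ANTONYMS.get((b, a))

print(antonym_type("cold", "hot"))  # gradable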
Lexicology is the branch of linguistics that analyzes the lexicon of a specific language. A word is the smallest meaningful unit of a language that can stand on its own, and is made up of small components called morphemes and even smaller elements known as phonemes, or distinguishing sounds. Lexicology examines every feature of a word – including formation, spelling, origin, usage, and definition.
Semantics is the study of linguistic meaning. It examines what meaning is, how words get their meaning, and how the meaning of a complex expression depends on its parts. Part of this process involves the distinction between sense and reference. Sense is given by the ideas and concepts associated with an expression while reference is the object to which an expression points. Semantics contrasts with syntax, which studies the rules that dictate how to create grammatically correct sentences, and pragmatics, which investigates how people use language in communication.
A semantic network, or frame network, is a knowledge base that represents semantic relations between concepts in a network. This is often used as a form of knowledge representation. It is a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent semantic relations between concepts, mapping or connecting semantic fields. A semantic network may be instantiated as, for example, a graph database or a concept map. Typical standardized semantic networks are expressed as semantic triples.
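As an illustration of the idea, a semantic network can be sketched as a set of subject-relation-object triples, i.e. a small labelled directed graph; the following minimal Python example uses invented data and names.

# Toy semantic network stored as (subject, relation, object) triples.
triples = [
    ("pigeon", "is_a", "bird"),
    ("bird", "is_a", "animal"),
    ("hot", "antonym_of", "cold"),
]

def neighbours(concept):
    """Yield the labelled edges leaving a concept vertex."""
    for subject, relation, obj in triples:
        if subject == concept:
            yield relation, obj

print(list(neighbours("pigeon")))  # [('is_a', 'bird')]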
Semantic properties or meaning properties are those aspects of a linguistic unit, such as a morpheme, word, or sentence, that contribute to the meaning of that unit. Basic semantic properties include being meaningful or meaningless – for example, whether a given word is part of a language's lexicon with a generally understood meaning; polysemy, having multiple, typically related, meanings; ambiguity, having meanings which aren't necessarily related; and anomaly, where the elements of a unit are semantically incompatible with each other, although possibly grammatically sound. Beyond the expression itself, there are higher-level semantic relations that describe the relationship between units: these include synonymy, antonymy, and hyponymy.
A semantic feature is a component of the concept associated with a lexical item. More generally, it can also be a component of the concept associated with any grammatical unit, whether composed or not. An individual semantic feature constitutes one component of a word's intension, which is the inherent sense or concept evoked. The linguistic meaning of a word is proposed to arise from contrasts and significant differences with other words. Semantic features enable linguists to explain how words that share certain features may be members of the same semantic domain. Correspondingly, the contrast in meanings of words is explained by diverging semantic features. For example, father and son share the common components of "human", "kinship", "male" and are thus part of a semantic domain of male family relations. They differ in terms of "generation" and "adulthood", which is what gives each its individual meaning.
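The father/son example can be made concrete with a small, purely illustrative Python sketch in which each word is represented as a set of semantic features; the feature names are invented for this example.

# Componential analysis of "father" and "son" as feature sets.
features = {
    "father": {"human", "kinship", "male", "ascending generation", "adult"},
    "son":    {"human", "kinship", "male", "descending generation"},
}

shared = features["father"] & features["son"]    # the common semantic domain
contrast = features["father"] ^ features["son"]  # the diverging features

print(sorted(shared))    # ['human', 'kinship', 'male']
print(sorted(contrast))  # generation and adulthood distinguish the two words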
Hypernymy and hyponymy are the semantic relations between a generic term (hypernym) and a more specific term (hyponym). The hypernym is also called a supertype, umbrella term, or blanket term. The hyponym names a subtype of the hypernym. The semantic field of the hyponym is included within that of the hypernym. For example, pigeon, crow, and hen are all hyponyms of bird and animal; bird and animal are both hypernyms of pigeon, crow, and hen.
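Hyponymy can be sketched as a transitive "is a kind of" relation, as in the bird example above; the following minimal Python illustration uses invented data and names.

# Immediate hypernym of each term in a toy taxonomy.
HYPERNYM_OF = {
    "pigeon": "bird",
    "crow": "bird",
    "hen": "bird",
    "bird": "animal",
}

def hypernyms(word):
    """Walk up the taxonomy, collecting every hypernym of the word."""
    chain = []
    while word in HYPERNYM_OF:
        word = HYPERNYM_OF[word]
        chain.append(word)
    return chain

print(hypernyms("pigeon"))  # ['bird', 'animal']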
Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.
In linguistics, focus is a grammatical category that conveys which part of the sentence contributes new, non-derivable, or contrastive information. In the English sentence "Mary only insulted BILL", focus is expressed prosodically by a pitch accent on "Bill" which identifies him as the only person whom Mary insulted. By contrast, in the sentence "Mary only INSULTED Bill", the verb "insult" is focused and thus expresses that Mary performed no other actions towards Bill. Focus is a cross-linguistic phenomenon and a major topic in linguistics. Research on focus spans numerous subfields including phonetics, syntax, semantics, pragmatics, and sociolinguistics.
Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implication is that different linguistic communities conceive of simple things and processes in the world differently, not necessarily that a person's conceptual world differs from the real world.
Semantic change is a form of language change regarding the evolution of word usage—usually to the point that the modern meaning is radically different from the original usage. In diachronic linguistics, semantic change is a change in one of the meanings of a word. Every word has a variety of senses and connotations, which can be added, removed, or altered over time, often to the extent that cognates across space and time have very different meanings. The study of semantic change can be seen as part of etymology, onomasiology, semasiology, and semantics.
In linguistics, a word sense is one of the meanings of a word. For example, a dictionary may have over 50 different senses of the word "play", each of these having a different meaning based on the context of the word's usage in a sentence, as follows:
We went to see the play Romeo and Juliet at the theater.
The coach devised a great play that put the visiting team on the defensive.
The children went out to play in the park.
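Word senses can also be inspected programmatically; for example, the WordNet interface of the NLTK library lists the senses recorded for "play", each with its own gloss. The snippet below assumes NLTK is installed and the WordNet corpus has been downloaded beforehand with nltk.download('wordnet').

from nltk.corpus import wordnet as wn

# Each synset is one recorded sense of "play"; print the first five glosses.
for synset in wn.synsets("play")[:5]:
    print(synset.name(), "-", synset.definition())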
In linguistics, semantic analysis is the process of relating syntactic structures, from the levels of words, phrases, clauses, sentences and paragraphs to the level of the writing as a whole, to their language-independent meanings. It also involves removing features specific to particular linguistic and cultural contexts, to the extent that such a project is possible. The elements of idiom and figurative speech, being cultural, are often also converted into relatively invariant meanings in semantic analysis. Semantics, although related to pragmatics, is distinct in that the former deals with word or sentence choice in any given context, while pragmatics considers the unique or particular meaning derived from context or tone. Put differently, semantics is about universally coded meaning, while pragmatics is about the meaning an audience derives from those words in a particular context.
A semantic lexicon is a digital dictionary of words labeled with semantic classes so associations can be drawn between words that have not previously been encountered. Semantic lexicons are built upon semantic networks, which represent the semantic relations between words. The difference between a semantic lexicon and a semantic network is that a semantic lexicon has definitions for each word, or a "gloss".
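A minimal, purely illustrative Python sketch of a semantic lexicon follows, in which each word carries a semantic class and a gloss; the data, class labels, and function name are invented for this example.

# Toy semantic lexicon: each entry has a semantic class and a short gloss.
LEXICON = {
    "pigeon": {"class": "BIRD", "gloss": "a stout-bodied bird with a small head"},
    "crow":   {"class": "BIRD", "gloss": "a large black bird with a harsh call"},
    "hot":    {"class": "TEMPERATURE", "gloss": "having a high temperature"},
}

def associated(a, b):
    """Two words can be associated through a shared semantic class,
    even if the pair has never been encountered together before."""
    return LEXICON[a]["class"] == LEXICON[b]["class"]

print(associated("pigeon", "crow"))  # True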
Structural semantics is a linguistic school and paradigm that emerged in Europe from the 1930s, inspired by the structuralist linguistic movement started by Ferdinand de Saussure's 1916 work "Cours de linguistique générale" ("Course in General Linguistics").
Computational lexicology is a branch of computational linguistics concerned with the use of computers in the study of the lexicon. It has been more narrowly described by some scholars as the use of computers in the study of machine-readable dictionaries. It is distinguished from computational lexicography, which more properly is the use of computers in the construction of dictionaries, though some researchers have used the two terms as synonyms.
In linguistics, converses or relational antonyms are pairs of words that refer to a relationship from opposite points of view, such as parent/child or borrow/lend. The relationship between such words is called a converse relation. Converses can be understood as a pair of words where one word implies a relationship between two objects, while the other implies the existence of the same relationship when the objects are reversed. Converses are sometimes referred to as complementary antonyms because an "either/or" relationship is present between them. One exists only because the other exists.
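The reversal at the heart of a converse relation can be shown with a small, purely illustrative Python sketch; the pairs and the function name are invented for this example.

# Each converse pair names the same relationship seen from opposite sides.
CONVERSES = {"parent": "child", "teacher": "pupil", "doctor": "patient"}

def converse_statement(relation, a, b):
    """Restate 'a is the RELATION of b' from the other point of view."""
    return f"{b} is the {CONVERSES[relation]} of {a}"

print(converse_statement("parent", "Ann", "Ben"))  # Ben is the child of Ann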
The linguistics wars were extended disputes among American theoretical linguists that occurred mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates started in 1967 when the linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, self-dubbed the "Four Horsemen of the Apocalypse", proposed an alternative approach in which deep structures were treated as meanings rather than as syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks, generative semantics and interpretive semantics.
Lexical field theory, or word-field theory, was introduced on March 12, 1931, by the German linguist Jost Trier. He argued that words acquired their meaning through their relationships to other words within the same word-field. An extension of the sense of one word narrows the meaning of neighboring words, with the words in a field fitting neatly together like a mosaic. If a single word undergoes a semantic change, then the whole structure of the lexical field changes. In English, the term lexical field is often used more loosely for a set of words grouped by a shared area of meaning.
The Integrational theory of language is the general theory of language that has been developed within the general linguistic approach of integrational linguistics.
M. Lynne Murphy is a professor of linguistics at the University of Sussex, England. She runs the blog Separated by a Common Language under the username Lynneguist and has written five books.