Opposite (semantics)

In lexical semantics, opposites are words lying in an inherently incompatible binary relationship. For example, the fact that something is male entails that it is not female. The relationship is called 'binary' because a set of opposites has exactly two members. The relationship between opposites is known as opposition. A member of a pair of opposites can generally be determined by the question What is the opposite of X?

The term antonym (and the related antonymy) is commonly taken to be synonymous with opposite, but antonym also has other, more restricted meanings. Graded (or gradable) antonyms are word pairs whose meanings are opposite and lie on a continuous spectrum (hot, cold). Complementary antonyms are word pairs whose meanings are opposite but do not lie on a continuous spectrum (push, pull). Relational antonyms are word pairs where the opposition makes sense only in the context of the relationship between the two meanings (teacher, pupil). These more restricted meanings may not apply in all scholarly contexts: Lyons (1968, 1977) defines antonym to mean gradable antonyms, and Crystal (2003) warns that antonymy and antonym should be regarded with care.

General discussion

Opposition is a semantic relation in which one word has a sense or meaning that negates or is, in the sense of scale, distant from a related word. Some words are opposable in principle, but the language in question has an accidental gap in its lexicon where the lexical opposite would be expected. For example, the word devout lacks a lexical opposite, but it is fairly easy to conceptualize a parameter of devoutness on which devout lies at the positive pole with a missing member at the negative pole. Opposites of such words can nevertheless sometimes be formed with the prefixes un- or non-, with varying degrees of naturalness. For example, the word undevout appears in Webster's dictionary of 1828, while the pattern of non-person could conceivably be extended to non-platypus. Conversely, some words appear to be a prefixed form of an opposite, but the opposite term does not exist; for example, inept appears to be in- + *ept. Such a word is known as an unpaired word.

Opposites may be viewed as a special type of incompatibility.[1] Words that are incompatible create the following type of entailment (where X is a given word and Y is a different word incompatible with word X):[2]

sentence A is X entails sentence A is not Y [3]

An example of an incompatible pair of words is cat : dog:

It's a cat entails It's not a dog [4]

This incompatibility is also found in the opposite pairs fast : slow and stationary : moving, as can be seen below:

It's fast entails It's not slow [5]

It's stationary entails It's not moving
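
The entailment pattern above can be sketched in code. The following is a minimal illustration, not drawn from the article or any standard library: incompatible terms are grouped into sets, and asserting one member of a set about an entity licenses the negation of every other member. The set groupings and the entails_not helper are invented for this sketch.

```python
# Minimal sketch of incompatibility entailment: within each set, asserting one
# term of an entity entails the negation of every other term in the same set.
# The groupings and the helper name are illustrative, not a standard API.

INCOMPATIBLE_SETS = [
    {"cat", "dog"},             # incompatible noun senses
    {"fast", "slow"},           # opposite adjectives (relative to a fixed standard)
    {"stationary", "moving"},   # complementary adjectives
]

def entails_not(x: str, y: str) -> bool:
    """True if 'A is X' entails 'A is not Y' (same referent, same context)."""
    return any(x in s and y in s and x != y for s in INCOMPATIBLE_SETS)

assert entails_not("cat", "dog")          # "It's a cat" entails "It's not a dog"
assert entails_not("fast", "slow")        # "It's fast" entails "It's not slow"
assert not entails_not("fast", "moving")  # different sets: no such entailment
```

The sketch deliberately ignores the caveats spelled out in the notes below: it presupposes a fixed referent and a fixed standard of comparison for gradable terms such as fast and slow.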

Cruse (2004) identifies some basic characteristics of opposites:

Some planned languages make abundant use of such opposite-forming prefixes to avoid multiplying vocabulary items. Esperanto has mal- (compare bona = "good" and malbona = "bad"), Damin has kuri- (tjitjuu "small", kuritjitjuu "large"), and Newspeak has un- (as in ungood, "bad").

Some classes of opposites include:

Types of antonyms

An antonym is one of a pair of words with opposite meanings. Each word in the pair is the antithesis of the other. A word may have more than one antonym. There are three categories of antonyms identified by the nature of the relationship between the opposed meanings.

Gradable antonyms

A gradable antonym is one of a pair of words with opposite meanings where the two meanings lie on a continuous spectrum. Temperature is such a continuous spectrum, so hot and cold, two meanings at opposite ends of the spectrum, are gradable antonyms. Other examples include: heavy : light, fat : skinny, dark : light, young : old, early : late, empty : full, dull : interesting.

Complementary antonyms

A complementary antonym, sometimes called a binary or contradictory antonym (Aarts, Chalker & Weiner 2014), is one of a pair of words with opposite meanings, where the two meanings do not lie on a continuous spectrum. There is no continuous spectrum between odd and even, but they are opposite in meaning and are therefore complementary antonyms. Other examples include: mortal : immortal, exit : entrance, exhale : inhale, occupied : vacant.

Relational antonyms

A relational antonym is one of a pair of words that refer to a relationship from opposite points of view. There is no lexical opposite of teacher, but teacher and pupil are opposite within the context of their relationship. This makes them relational antonyms. Other examples include: husband : wife, doctor : patient, predator : prey, teach : learn, servant : master, come : go, parent : child.
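
As a rough summary of the three categories, the sketch below stores antonym pairs together with their type. The AntonymType and AntonymPair names are invented for this illustration and are not part of any established library.

```python
from dataclasses import dataclass
from enum import Enum, auto

class AntonymType(Enum):
    GRADABLE = auto()       # opposite ends of a continuous scale (hot : cold)
    COMPLEMENTARY = auto()  # binary opposition, no middle ground (odd : even)
    RELATIONAL = auto()     # opposite roles in one relationship (teacher : pupil)

@dataclass(frozen=True)
class AntonymPair:
    first: str
    second: str
    kind: AntonymType

PAIRS = [
    AntonymPair("hot", "cold", AntonymType.GRADABLE),
    AntonymPair("empty", "full", AntonymType.GRADABLE),
    AntonymPair("odd", "even", AntonymType.COMPLEMENTARY),
    AntonymPair("exhale", "inhale", AntonymType.COMPLEMENTARY),
    AntonymPair("teacher", "pupil", AntonymType.RELATIONAL),
    AntonymPair("predator", "prey", AntonymType.RELATIONAL),
]

def antonyms_of(word: str) -> list[str]:
    """Look up the opposite(s) of a word recorded in the toy pair list."""
    return [p.second if p.first == word else p.first
            for p in PAIRS if word in (p.first, p.second)]

print(antonyms_of("teacher"))  # ['pupil']
```

Recording the category explicitly matters because the licensed inferences differ: complementary pairs exhaust the options (not odd entails even), whereas gradable pairs leave a middle ground (not hot does not entail cold).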

Auto-antonyms

An auto-antonym is a word that can have opposite meanings in different contexts or under separate definitions. For example, the verb sanction can mean either 'to permit' or 'to penalize', and cleave can mean either 'to split apart' or 'to cling together'.

Notes

  1. Incompatibility can be compared to exclusive disjunction in logic.
  2. There are four types of entailment useful to lexical semantics:
    • unilateral entailment: It's a fish unilaterally entails It's an animal. (It is unilateral, i.e. one-directional, because It's an animal does not entail It's a fish, since it could be a dog, a cat, or some other animal.)
    • logical equivalence (or mutual entailment): The party commenced at midnight entails The party began at midnight, AND The party began at midnight entails The party commenced at midnight.
    • contrariety: two propositions are contraries if they cannot be simultaneously true, although both can be false. On the Aristotelian square of opposition, the A and E type propositions ('All As are Bs' and 'No As are Bs', respectively) are contraries of each other. Propositions that cannot be simultaneously false (e.g. 'Something is red' and 'Something is not red') are said to be subcontraries.
    • contradiction: It's dead entails It's not alive, AND It's not alive entails It's dead, AND It's alive entails It's not dead, AND It's not dead entails It's alive. It's dead and It's alive are said to be in a contradictory relation.
  3. Stated differently, if the proposition expressed by the sentence A is X is TRUE, then the proposition expressed by the sentence A is not Y is also TRUE.
  4. It is assumed here that it has the same referent.
  5. It is also assumed here that the reference point of comparison for these adjectives remains the same in both sentences. For example, a rabbit might be fast compared to a turtle but slow compared to a sports car. When determining the relationships between the lexical meanings of words, it is essential to keep the situational context identical.

Related Research Articles

Lexicology is the branch of linguistics that analyzes the lexicon of a specific language. A word is the smallest meaningful unit of a language that can stand on its own, and is made up of small components called morphemes and even smaller elements known as phonemes, or distinguishing sounds. Lexicology examines every feature of a word – including formation, spelling, origin, usage, and definition.

Semantics is the study of linguistic meaning. It examines what meaning is, how words get their meaning, and how the meaning of a complex expression depends on its parts. Part of this process involves the distinction between sense and reference. Sense is given by the ideas and concepts associated with an expression while reference is the object to which an expression points. Semantics contrasts with syntax, which studies the rules that dictate how to create grammatically correct sentences, and pragmatics, which investigates how people use language in communication.

A semantic network, or frame network is a knowledge base that represents semantic relations between concepts in a network. This is often used as a form of knowledge representation. It is a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent semantic relations between concepts, mapping or connecting semantic fields. A semantic network may be instantiated as, for example, a graph database or a concept map. Typical standardized semantic networks are expressed as semantic triples.
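
As a minimal illustration of the idea (not a particular graph database's API), semantic relations can be stored as subject–relation–object triples and queried as a small graph; the relation labels below are invented for the sketch.

```python
# Toy semantic network stored as (subject, relation, object) triples.
TRIPLES = [
    ("hot", "antonym_of", "cold"),
    ("cold", "antonym_of", "hot"),
    ("hot", "related_to", "temperature"),
    ("cold", "related_to", "temperature"),
]

def neighbours(term: str, relation: str) -> list[str]:
    """Follow edges labelled `relation` out of `term`."""
    return [obj for subj, rel, obj in TRIPLES if subj == term and rel == relation]

print(neighbours("hot", "antonym_of"))  # ['cold']
print(neighbours("hot", "related_to"))  # ['temperature']
```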

Semantic properties or meaning properties are those aspects of a linguistic unit, such as a morpheme, word, or sentence, that contribute to the meaning of that unit. Basic semantic properties include being meaningful or meaningless – for example, whether a given word is part of a language's lexicon with a generally understood meaning; polysemy, having multiple, typically related, meanings; ambiguity, having meanings which aren't necessarily related; and anomaly, where the elements of a unit are semantically incompatible with each other, although possibly grammatically sound. Beyond the expression itself, there are higher-level semantic relations that describe the relationship between units: these include synonymy, antonymy, and hyponymy.

A semantic feature is a component of the concept associated with a lexical item. More generally, it can also be a component of the concept associated with any grammatical unit, whether composed or not. An individual semantic feature constitutes one component of a word's intension, which is the inherent sense or concept evoked. The linguistic meaning of a word is proposed to arise from contrasts and significant differences with other words. Semantic features enable linguists to explain how words that share certain features may be members of the same semantic domain. Correspondingly, the contrast in meanings of words is explained by diverging semantic features. For example, father and son share the common components of "human", "kinship", and "male", and are thus part of a semantic domain of male family relations. They differ in terms of "generation" and "adulthood", which is what gives each its individual meaning.
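
As a rough illustration (not taken from the article's sources), the kinship example can be encoded as a small feature matrix in which shared features place words in the same semantic domain and differing features distinguish them; the feature names are simplified here, with "generation" reduced to a boolean.

```python
# Toy feature matrix: shared features group words into a semantic domain,
# while differing features account for their distinct meanings.
FEATURES = {
    "father": {"human": True, "kinship": True, "male": True, "older_generation": True},
    "son":    {"human": True, "kinship": True, "male": True, "older_generation": False},
}

def shared(a: str, b: str) -> set[str]:
    """Features on which the two words agree."""
    return {f for f in FEATURES[a] if FEATURES[a][f] == FEATURES[b][f]}

def contrasting(a: str, b: str) -> set[str]:
    """Features on which the two words differ."""
    return set(FEATURES[a]) - shared(a, b)

print(sorted(shared("father", "son")))       # ['human', 'kinship', 'male']
print(sorted(contrasting("father", "son")))  # ['older_generation']
```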

Hypernymy and hyponymy are the semantic relations between a generic term (hypernym) and a specific instance of it (hyponym). The hypernym is also called a supertype, umbrella term, or blanket term. The hyponym is a subtype of the hypernym. The semantic field of the hyponym is included within that of the hypernym. For example, pigeon, crow, and hen are all hyponyms of bird and animal; bird and animal are both hypernyms of pigeon, crow, and hen.

Lexical semantics, as a subfield of linguistic semantics, is the study of word meanings. It includes the study of how words structure their meaning, how they act in grammar and compositionality, and the relationships between the distinct senses and uses of a word.

In linguistics, focus is a grammatical category that conveys which part of the sentence contributes new, non-derivable, or contrastive information. In the English sentence "Mary only insulted BILL", focus is expressed prosodically by a pitch accent on "Bill" which identifies him as the only person Mary insulted. By contrast, in the sentence "Mary only INSULTED Bill", the verb "insult" is focused and thus expresses that Mary performed no other actions towards Bill. Focus is a cross-linguistic phenomenon and a major topic in linguistics. Research on focus spans numerous subfields including phonetics, syntax, semantics, pragmatics, and sociolinguistics.

Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore describe the world only as people conceive of it. This implies that different linguistic communities conceive of simple things and processes in the world differently, not necessarily that there is some difference between a person's conceptual world and the real world.

Frame semantics is a theory of linguistic meaning developed by Charles J. Fillmore that extends his earlier case grammar. It relates linguistic semantics to encyclopedic knowledge. The basic idea is that one cannot understand the meaning of a single word without access to all the essential knowledge that relates to that word. For example, one would not be able to understand the word "sell" without knowing anything about the situation of commercial transfer, which also involves, among other things, a seller, a buyer, goods, money, the relation between the money and the goods, the relations between the seller and the goods and the money, the relation between the buyer and the goods and the money and so on. Thus, a word activates, or evokes, a frame of semantic knowledge relating to the specific concept to which it refers.

Semantic change is a form of language change regarding the evolution of word usage—usually to the point that the modern meaning is radically different from the original usage. In diachronic linguistics, semantic change is a change in one of the meanings of a word. Every word has a variety of senses and connotations, which can be added, removed, or altered over time, often to the extent that cognates across space and time have very different meanings. The study of semantic change can be seen as part of etymology, onomasiology, semasiology, and semantics.

In linguistics, a word sense is one of the meanings of a word. For example, a dictionary may have over 50 different senses of the word "play", each of these having a different meaning based on the context of the word's usage in a sentence, as follows:

We went to see the play Romeo and Juliet at the theater.

The coach devised a great play that put the visiting team on the defensive.

The children went out to play in the park.

In linguistics, semantic analysis is the process of relating syntactic structures, from the levels of words, phrases, clauses, sentences and paragraphs to the level of the writing as a whole, to their language-independent meanings. It also involves removing features specific to particular linguistic and cultural contexts, to the extent that such a project is possible. The elements of idiom and figurative speech, being cultural, are often also converted into relatively invariant meanings in semantic analysis. Semantics, although related to pragmatics, is distinct in that the former deals with word or sentence choice in any given context, while pragmatics considers the unique or particular meaning derived from context or tone. To reiterate in different terms, semantics is about universally coded meaning, and pragmatics, the meaning encoded in words that is then interpreted by an audience.

A semantic lexicon is a digital dictionary of words labeled with semantic classes so associations can be drawn between words that have not previously been encountered. Semantic lexicons are built upon semantic networks, which represent the semantic relations between words. The difference between a semantic lexicon and a semantic network is that a semantic lexicon has definitions for each word, or a "gloss".
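
WordNet, accessed here through NLTK, is a commonly cited example of such a resource: each sense carries a gloss, and lemmas carry lexical relations, including antonymy. The snippet below is a sketch that assumes NLTK is installed and its WordNet data has been downloaded (nltk.download('wordnet')); the exact output depends on the WordNet version.

```python
# Sketch: querying antonym relations in a semantic lexicon (WordNet via NLTK).
from nltk.corpus import wordnet as wn

def wordnet_antonyms(word: str) -> set[str]:
    """Collect the antonym lemmas recorded for any sense of the given word."""
    opposites = set()
    for synset in wn.synsets(word):
        for lemma in synset.lemmas():
            for antonym in lemma.antonyms():
                opposites.add(antonym.name())
    return opposites

print(wordnet_antonyms("hot"))            # typically includes 'cold'
print(wn.synsets("hot")[0].definition())  # the gloss attached to the first sense
```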

In linguistics, a semantic field is a lexical set of words grouped semantically that refers to a specific subject. The term is also used in anthropology, computational semiotics, and technical exegesis.

Structural semantics is a linguistic school and paradigm that emerged in Europe from the 1930s, inspired by the structuralist linguistic movement started by Ferdinand de Saussure's 1916 work "Cours de linguistique générale".

Distributional semantics is a research area that develops and studies theories and methods for quantifying and categorizing semantic similarities between linguistic items based on their distributional properties in large samples of language data. The basic idea of distributional semantics can be summed up in the so-called distributional hypothesis: linguistic items with similar distributions have similar meanings.
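
One standard way to make "similar distributions" concrete is cosine similarity over co-occurrence vectors. The sketch below uses tiny invented count vectors purely for illustration; real systems derive such vectors from large corpora or learn embeddings. Notably, on this measure antonyms such as hot and cold often come out as highly similar, because they occur in very similar contexts.

```python
import math

# Toy co-occurrence counts (invented): each word is represented by how often it
# appears near the context words "temperature", "weather", and "bark".
VECTORS = {
    "hot":  [8.0, 7.0, 0.0],
    "cold": [7.0, 8.0, 1.0],
    "dog":  [0.0, 1.0, 9.0],
}

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(round(cosine(VECTORS["hot"], VECTORS["cold"]), 2))  # about 0.99: very similar contexts
print(round(cosine(VECTORS["hot"], VECTORS["dog"]), 2))   # about 0.07: very different contexts
```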

Computational lexicology is a branch of computational linguistics concerned with the use of computers in the study of the lexicon. It has been more narrowly described by some scholars as the use of computers in the study of machine-readable dictionaries. It is distinguished from computational lexicography, which more properly is the use of computers in the construction of dictionaries, though some researchers have used the two terms synonymously.

The Integrational theory of language is the general theory of language that has been developed within the general linguistic approach of integrational linguistics.

M. Lynne Murphy is a professor of linguistics at the University of Sussex. She runs the blog Separated by a Common Language under the username Lynneguist and has written five books.