Conceptual semantics

Conceptual semantics is a framework for semantic analysis developed mainly by Ray Jackendoff, beginning in 1976. Its aim is to characterize the conceptual elements by which a person understands words and sentences, and thus to provide an explanatory semantic representation (the title of a 1976 Jackendoff paper). "Explanatory" here refers to the ability of a linguistic theory to describe how a component of language is acquired by a child (as proposed by Noam Chomsky; see levels of adequacy).

Recently, conceptual semantics in particular, and lexical semantics in general, have taken on increasing importance in linguistics and psycholinguistics. Many contemporary theories of syntax (how sentences are constructed from individual words) rely on elements that are idiosyncratic to words themselves. As a result, a sound theory accounting for the properties of the meanings of words is required.

Meaning and decomposition

Jackendoff has claimed that the goal of conceptual semantics is to investigate:

"...how linguistic utterances are related to human cognition, where cognition is a human capacity that is to a considerable degree independent of language, interacting with the perceptual and action systems as well as language."

(Jackendoff 2006:355)

Conceptual semantics assigns each word a single, universal meaning. Instead of positing a lexical semantic meaning in addition to the conceptual representation of the actual referent, the two are combined into what Jackendoff calls "lexical concepts" (Murphy 2010:59). Conceptual semantics is therefore considered not just a linguistic theory but a theory of human cognition. Like many semanticists, Jackendoff claims that a decompositional method is necessary to explore conceptualization. Just as a physical scientist tries to understand matter by breaking it down into progressively smaller parts, a scientific study of conceptualization proceeds by breaking down, or decomposing, meanings into smaller parts. This decomposition cannot go on forever, however: at some point, meanings can no longer be broken down.

This is the level of conceptual structure, the level of mental representations which encode the human understanding of the world, containing the primitive conceptual elements out of which meanings are built, plus their rules of combination. Conceptual semantics does not work with a mental dictionary, in the classical sense. There are no definitions attached to concepts and reference, only the idea of the concept or reference itself. Just as generative syntax posits a finite set of syntactic categories and rules for combining them, so, too, does Conceptual Semantics posit 'a finite set of mental primitives and a finite set of principles of mental combination' governing their interaction (Jackendoff 1990: 9). Jackendoff refers to this set of primitives and the rules governing them as the 'grammar of sentential concepts' (Jackendoff 1990: 9).

His starting point is a close analysis of the meanings of lexemes, designed to bring out parallelisms and contrasts that reveal the nature of the conceptual structures underlying them. Jackendoff considers a lexical entry to have three parts: phonological, syntactic, and conceptual. These three aspects of a concept give a "full picture of a word" (Murphy 2010:60). What his method shows, he says, is that the psychological organization on which meaning rests 'lies a very short distance below the surface of everyday lexical items – and that progress can be made in exploring it' (1991:44). In decomposition, the concepts underlying word meanings are broken down into their smallest elements: conceptual primitives, envisaged as the semantic equivalents of phonological features. The conceptual structure of a lexical item is an element with zero or more open argument slots, which are filled by the syntactic complements of the lexical item.
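The idea of a conceptual structure with open argument slots can be sketched in code. The following is a toy illustration, not Jackendoff's actual formalism: the class name, field names, and the particular primitives (GO, TO, IN) are assumptions chosen to mimic the bracketed notation commonly used in his work, here applied to a sentence like "John entered the room", where the verb's two open slots are filled by the subject and object noun phrases.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A node in a conceptual structure (illustrative, not canonical)."""
    category: str                # ontological category, e.g. "Event", "Thing"
    function: str                # conceptual function or primitive, e.g. "GO"
    args: list = field(default_factory=list)   # open argument slots

    def __str__(self):
        inner = self.function
        if self.args:
            inner += " (" + ", ".join(str(a) for a in self.args) + ")"
        return f"[{self.category} {inner}]"

# "John entered the room": the verb 'enter' contributes GO plus a
# TO-IN path; its open slots are filled by the syntactic complements.
john = Concept("Thing", "JOHN")
room = Concept("Thing", "ROOM")
structure = Concept("Event", "GO",
                    [john, Concept("Path", "TO",
                                   [Concept("Place", "IN", [room])])])

print(structure)
# [Event GO ([Thing JOHN], [Path TO ([Place IN ([Thing ROOM])])])]
```

A sentence with no further decomposable parts bottoms out in primitives like JOHN and ROOM, mirroring the claim that decomposition terminates at conceptual structure.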

Semantic structures

Conceptual semantics breaks lexical concepts up into ontological categories: events, states, places, amounts, things, and properties, among others. These ontological categories are called semantic primes, or semantic primitives. Jackendoff posits that any concept in the human brain can be expressed using these semantic primes. Conceptual semantics is compositional: the meanings of phrases, clauses, and sentences can be determined from the lexical concepts that make them up (Murphy 2010:66).
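Compositionality can be illustrated with a minimal sketch, under the assumption (not drawn from the source) of a toy lexicon pairing each word with an ontological category and a primitive, plus one composition rule that assembles a sentence-level concept from its parts:

```python
# Toy lexicon: each word maps to (ontological category, semantic prime).
# The categories and primitives here are illustrative assumptions.
LEXICON = {
    "dog":  ("Thing", "DOG"),
    "park": ("Thing", "PARK"),
    "in":   ("Place", "IN"),     # takes one Thing argument
    "ran":  ("Event", "GO"),     # takes a Thing and a Place
}

def compose(verb, subject, prep, obj):
    """Determine the sentence concept from the lexical concepts of its parts."""
    v_cat, v_prim = LEXICON[verb]
    s_cat, s_prim = LEXICON[subject]
    p_cat, p_prim = LEXICON[prep]
    o_cat, o_prim = LEXICON[obj]
    place = f"[{p_cat} {p_prim} ([{o_cat} {o_prim}])]"
    return f"[{v_cat} {v_prim} ([{s_cat} {s_prim}], {place})]"

print(compose("ran", "dog", "in", "park"))
# [Event GO ([Thing DOG], [Place IN ([Thing PARK])])]
```

The point of the sketch is only that the sentence meaning is a function of the lexical concepts and a finite set of combination rules, in the spirit of the "grammar of sentential concepts".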

Problems

Jackendoff's system has been criticised for its highly abstract primitives, which linguists such as Wierzbicka (2007a, 2007b) and Goddard (1998, 2001) have called "obscure": one requires special training to understand them, and they often must be translated into plain English to be communicated. Another criticism is that conceptual semantics is arbitrary; in its current state, there are no clear procedures for determining when a primitive is justified. Wierzbicka and Goddard have also objected that the theory was formulated around, and applied only to, English, even though it claims to be universal.

Jackendoff responds to these criticisms by saying:

In fact, an isolated primitive can never be justified: a primitive makes sense only in the context of the overall system of primitives in which it is embedded. With this proviso, however, I think a particular choice of primitives should be justified on the grounds of its capacity for expressing generalizations and explaining the distribution of the data. That is, a proposed system of primitives is subject to the usual scientific standards of evaluation.

(Jackendoff 1990)

See also

Semantics
Transformational grammar
Anna Wierzbicka
Natural semantic metalanguage
Lexical semantics
Ray Jackendoff
Linguistic universal
Theta role
Force dynamics
Construction grammar
Cognitive semantics
Linguistic competence
Bootstrapping (linguistics)
Thematic relation
Linguistics wars
Cliff Goddard
Modular Online Growth and Use of Language (MOGUL)
Integrational theory of language
Modular Cognition Framework
Syntax–semantics interface

References