Mentalist postulate

The mentalist postulate is the thesis that meaning in natural language is an information structure that is mentally encoded by human beings. It is a basic premise of some branches of cognitive semantics. Semantic theories implicitly or explicitly incorporating the mentalist postulate include force dynamics and conceptual semantics.

Two implications of the mentalist postulate are: first, that research on the nature of mental representations can serve to constrain or enrich semantic theories; and second, that the results of semantic theories bear directly on the nature of human conceptualization.[1]

Related Research Articles

Concept

Concepts are defined as abstract ideas or general notions that occur in the mind, in speech, or in thought. They are understood to be the fundamental building blocks of thoughts and beliefs. They play an important role in all aspects of cognition. As such, concepts are studied by several disciplines, such as linguistics, psychology, and philosophy, and these disciplines are interested in the logical and psychological structure of concepts, and how they are put together to form thoughts and sentences. The study of concepts has served as an important flagship of an emerging interdisciplinary approach called cognitive science.

Semantics is the study of meaning, reference, or truth. The term can be used to refer to subfields of several distinct disciplines including linguistics, philosophy, and computer science.

Discourse

The term discourse identifies and describes written and spoken communications. In semantics and discourse analysis, a discourse is a conceptual generalization of conversation. In a field of enquiry and social practice, the discourse is the vocabulary for investigation of the subject, e.g. legal discourse, medical discourse, religious discourse, et cetera. In the works of the philosopher Michel Foucault, a discourse is “an entity of sequences, of signs, in that they are enouncements (énoncés).”

Jerry Fodor

Jerry Alan Fodor was an American philosopher and the author of many crucial works in the fields of philosophy of mind and cognitive science. His writings in these fields laid the groundwork for the modularity of mind and the language of thought hypotheses, and he is recognized as having had "an enormous influence on virtually every portion of the philosophy of mind literature since 1960." Until his death in 2017 he held the position of State of New Jersey Professor of Philosophy, Emeritus, at Rutgers University.

The notion that language influences thought has a long history in a variety of fields. Two bodies of thought have formed around this debate. One stems from linguistics and is known as the Sapir-Whorf hypothesis, which comes in a strong and a weak version arguing for more or less influence of language on thought. The strong version, linguistic determinism, holds that without language there is and can be no thought, while the weak version, linguistic relativity, holds only that language exerts some influence on thought. On the opposing side are 'language of thought' theories (LOTH), according to which public language is inessential to private thought. LOTH addresses the question of whether thought is possible without language, which is related to the question of whether language evolved for thought. These ideas are difficult to study because it is challenging to disentangle the effects of culture, thought, and language.

Ray Jackendoff

Ray Jackendoff is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed to both the existence of an innate universal grammar and to giving an account of language that is consistent with the current understanding of the human mind and cognition.

Inferential role semantics is an approach to the theory of meaning that identifies the meaning of an expression with its relationship to other expressions, in contradistinction to denotationalism, according to which denotations are the primary sort of meaning.

Conceptual semantics is a framework for semantic analysis developed mainly by Ray Jackendoff in 1976. Its aim is to provide a characterization of the conceptual elements by which a person understands words and sentences, and thus to provide an explanatory semantic representation. Explanatory in this sense refers to the ability of a given linguistic theory to describe how a component of language is acquired by a child.

Prototype theory is a mode of graded categorization in cognitive science, where some members of a conceptual category are more central than others. In this theory, any given concept in any given language has a real-world example that best represents this concept. For example, when asked to give an example of the concept furniture, a couch is more frequently cited than, say, a wardrobe. Prototype theory has also been applied in linguistics, as part of the mapping from phonological structure to semantics.

Cognitive semantics is part of the cognitive linguistics movement. Semantics is the study of linguistic meaning. Cognitive semantics holds that language is part of a more general human cognitive ability, and can therefore only describe the world as people conceive of it. The implication is that different linguistic communities conceive of simple things and processes in the world differently, not that there is necessarily some difference between a person's conceptual world and the real world.

Frame semantics is a theory of linguistic meaning developed by Charles J. Fillmore that extends his earlier case grammar. It relates linguistic semantics to encyclopedic knowledge. The basic idea is that one cannot understand the meaning of a single word without access to all the essential knowledge that relates to that word. For example, one would not be able to understand the word "sell" without knowing anything about the situation of commercial transfer, which also involves, among other things, a seller, a buyer, goods, money, the relation between the money and the goods, the relations between the seller and the goods and the money, the relation between the buyer and the goods and the money and so on. Thus, a word activates, or evokes, a frame of semantic knowledge relating to the specific concept to which it refers.

In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views that hold that the human mind is an information processing system and that cognition and consciousness together are a form of computation. Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition. The theory was proposed in its modern form by Hilary Putnam in 1967, and developed by his PhD student, philosopher and cognitive scientist Jerry Fodor in the 1960s, 1970s and 1980s. Despite being vigorously disputed in analytic philosophy in the 1990s due to work by Putnam himself, John Searle, and others, the view is common in modern cognitive psychology and is presumed by many theorists of evolutionary psychology. In the 2000s and 2010s the view has resurfaced in analytic philosophy.

In the philosophy of language, metaphysics, and metasemantics, meaning "is a relationship between two sorts of things: signs and the kinds of things they intend, express, or signify".

A mental representation, in philosophy of mind, cognitive psychology, neuroscience, and cognitive science, is a hypothetical internal cognitive symbol that represents external reality, or else a mental process that makes use of such a symbol: "a formal system for making explicit certain entities or types of information, together with a specification of how the system does this".

Computational semantics is the study of how to automate the process of constructing and reasoning with meaning representations of natural language expressions. It consequently plays an important role in natural language processing and computational linguistics.
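One core task in computational semantics is assembling a logical form from the meanings of individual words. The sketch below is a deliberately toy illustration, not any standard library's API: each word in a hypothetical two-word fragment contributes a constant or a function, and composing them yields a predicate-logic-style representation as a string.

```python
# Toy compositional semantics: words denote constants or functions,
# and function application builds the meaning representation.

def intransitive_verb(pred):
    # An intransitive verb maps a subject term to a predication.
    return lambda subject: f"{pred}({subject})"

def proper_noun(name):
    # A proper noun denotes an individual constant.
    return name

# Lexicon for the hypothetical fragment.
sleeps = intransitive_verb("sleep")
john = proper_noun("john")

# Composing "John sleeps" yields the representation sleep(john).
logical_form = sleeps(john)
print(logical_form)  # sleep(john)
```

Real systems replace the string-building with typed lambda calculus, model-theoretic interpretation, or neural semantic parsers, but the compositional principle is the same.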

Aspects of the Theory of Syntax

Aspects of the Theory of Syntax is a book on linguistics written by American linguist Noam Chomsky, first published in 1965. In Aspects, Chomsky presented a deeper, more extensive reformulation of transformational generative grammar (TGG), a new kind of syntactic theory that he had introduced in the 1950s with the publication of his first book, Syntactic Structures. Aspects is widely considered to be the foundational document and a proper book-length articulation of the Chomskyan theoretical framework of linguistics. It presented Chomsky's epistemological assumptions with a view to establishing linguistic theory-making as a formal discipline comparable to the physical sciences, i.e. a domain of inquiry well-defined in its nature and scope. From a philosophical perspective, it directed mainstream linguistic research away from behaviorism, constructivism, empiricism and structuralism and towards mentalism, nativism, rationalism and generativism, respectively, taking as its main object of study the abstract, inner workings of the human mind related to language acquisition and production.

This is an index of articles in the philosophy of language.

Ideasthesia

Ideasthesia is a neuroscientific phenomenon in which activations of concepts (inducers) evoke perception-like sensory experiences (concurrents). The name comes from the Ancient Greek ἰδέα (idéa) and αἴσθησις (aísthēsis), meaning "sensing concepts" or "sensing ideas". The notion was introduced by neuroscientist Danko Nikolić as an alternative explanation for a set of phenomena traditionally covered by synesthesia.

The Integrational theory of language is the general theory of language that has been developed within the general linguistic approach of integrational linguistics.

Semantic folding theory describes a procedure for encoding the semantics of natural language text in a semantically grounded binary representation. This approach provides a framework for modelling how language data is processed by the neocortex.
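The central representational idea can be sketched in a few lines: each word is encoded as a sparse set of active bit positions in a large binary vector, and semantic relatedness is measured by how many bits two words share. The vectors below are made-up toy data for illustration, not the output of any real semantic folding encoder.

```python
# Sparse binary representations modeled as sets of active bit indices
# (out of a hypothetical 16,384-bit vector). Overlap of active bits
# serves as a crude similarity measure.

def overlap(a, b):
    # Number of bit positions active in both representations.
    return len(a & b)

# Hypothetical sparse representations.
dog  = {3, 17, 256, 1024, 4096}
wolf = {3, 17, 256, 2048, 8192}
car  = {5, 99, 512, 2048, 3000}

print(overlap(dog, wolf))  # 3 shared bits: semantically closer
print(overlap(dog, car))   # 0 shared bits: unrelated
```

In a genuine system the bit positions are semantically grounded, so that nearby bits correspond to related contexts; the toy example only shows why sparse overlap is a cheap and robust comparison operation.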

References

  1. Jackendoff, Ray (1988). "Conceptual Semantics". In Umberto Eco; Marco Santambrogio; Patrizia Violi (eds.). Meaning and Mental Representations. Indiana University Press. pp. 81–97. ISBN 978-0-253-33724-5.