Truth condition

In semantics and pragmatics, a truth condition is the condition under which a sentence is true. For example, "It is snowing in Nebraska" is true precisely when it is snowing in Nebraska. Truth conditions of a sentence do not necessarily reflect current reality. They are merely the conditions under which the statement would be true. [1]

More formally, a truth condition is what makes for the truth of a sentence in an inductive definition of truth (for details, see the semantic theory of truth). Understood this way, truth conditions are theoretical entities. To illustrate with an example: suppose that, in a particular truth theory [2] (one that defines truth while relying on semantic terms as little as possible), the word "Nixon" refers to Richard M. Nixon, and "is alive" is associated with the set of currently living things. Then one way of representing the truth condition of "Nixon is alive" is as the ordered pair <Nixon, {x: x is alive}>. And we say that "Nixon is alive" is true if and only if the referent of "Nixon" belongs to the set associated with "is alive", that is, if and only if Nixon is alive.
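The ordered-pair representation above can be sketched in a few lines of Python. The domain facts and names here are invented for illustration, not drawn from any particular truth theory:

```python
# A minimal sketch of a truth condition as an ordered pair
# <referent, extension>, following the article's example.
# The set below is a made-up stand-in for the set of currently living things.
alive = {"Mick Jagger", "Paul McCartney"}  # Nixon died in 1994, so he is absent

def truth_condition(referent, extension):
    """Represent the truth condition of a name-predicate sentence as a pair."""
    return (referent, extension)

def is_true(condition):
    """The sentence is true iff the name's referent belongs to the predicate's extension."""
    referent, extension = condition
    return referent in extension

nixon_is_alive = truth_condition("Nixon", alive)
print(is_true(nixon_is_alive))  # False: the referent of "Nixon" is not in the set
```

On this picture, evaluating a sentence reduces to a set-membership check, which is exactly the biconditional stated above: "Nixon is alive" is true if and only if Nixon belongs to {x: x is alive}.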

In semantics, the truth condition of a sentence is almost universally considered distinct from its meaning. The meaning of a sentence is conveyed if the truth conditions for the sentence are understood. Additionally, there are many sentences that are understood although their truth condition is uncertain. One popular argument for this view is that some sentences are necessarily true—that is, they are true whatever happens to obtain. All such sentences have the same truth conditions, but arguably do not thereby have the same meaning. Likewise, the sets {x: x is alive} and {x: x is alive and x is not a rock} are identical—since no living thing is a rock, they have precisely the same members—but presumably the sentences "Nixon is alive" and "Nixon is alive and is not a rock" have different meanings.
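The co-extension point can be checked mechanically. A toy sketch, with an invented three-object domain standing in for the world:

```python
# Toy domain (invented for illustration): since no living thing is a rock,
# "is alive" and "is alive and is not a rock" pick out exactly the same set,
# and so yield the same truth conditions despite differing in meaning.
domain = ["Mick Jagger", "Everest", "Nixon"]
alive = {"Mick Jagger"}
rocks = {"Everest"}

ext_alive = {x for x in domain if x in alive}
ext_alive_and_not_rock = {x for x in domain if x in alive and x not in rocks}

print(ext_alive == ext_alive_and_not_rock)  # True: identical extensions
```

The two set comprehensions are different descriptions, but they determine one and the same set, which is the sense in which the corresponding sentences share a truth condition.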

Notes and references

  1. Birner, Betty J. (2013). Introduction to Pragmatics. Wiley-Blackwell.
  2. Field, Hartry (1972). "Tarski's Theory of Truth". The Journal of Philosophy, 69(13), 347–375. doi:10.2307/2024879.
