Intensional logic

Intensional logic is an approach to predicate logic that extends first-order logic, whose quantifiers range over the individuals of a universe (extensions), with additional quantifiers that range over terms that may have such individuals as their values (intensions). The distinction between intensional and extensional entities parallels the distinction between sense and reference.

Overview

Logic is the study of proof and deduction as manifested in language, abstracting from any underlying psychological or biological processes. [1] Logic is not a closed, completed science, and presumably it will never stop developing: logical analysis can penetrate language to varying depths [2] (treating sentences as atomic, or splitting them into predicates applied to individual terms, or even revealing fine logical structures such as modal, temporal, dynamic, or epistemic ones).

To achieve its special goal, logic had to develop its own formal tools, most notably its own grammar, detached from direct use of the underlying natural language. [3] Functors (also known as function words) belong to the most important categories of logical grammar (along with basic categories like sentence and individual name): [4] a functor can be regarded as an "incomplete" expression with argument places to fill in. If these are filled with appropriate subexpressions, the resulting completed expression can be regarded as the output. [5] Thus, a functor acts like a function sign, [6] taking input expressions and yielding a new, output expression. [5]

Semantics links the expressions of a language to the outside world, and logical semantics, too, has developed a structure of its own. Semantic values can be attributed to expressions of the basic categories: the reference of an individual name (the object it designates) is called its extension; for a sentence, its truth value is its extension. [7]

As for functors, some are simpler than others: an extension can be attributed to them in a straightforward way. In the case of a so-called extensional functor we can, in a sense, abstract from the "material" part of its inputs and output and regard the functor as a function that turns the extension of its input(s) directly into the extension of its output. This presupposes that we can do so at all: that the extension of the input expression(s) determines the extension of the resulting expression. Functors for which this assumption fails are called intensional. [8]
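
The contrast can be sketched schematically in possible-worlds notation (a standard device used here only as an illustration, not the notation of the cited sources). Writing [[A]]_w for the extension of an expression A at a possible world w, the intension of A is the function assigning to each world the corresponding extension; a one-place functor F is extensional precisely when the extension of its output is determined by the extension of its input:

    \[ \mathrm{int}(A) \;=\; \lambda w.\,[\![A]\!]_w \]
    \[ F\ \text{extensional:}\qquad [\![F(A)]\!]_w \;=\; f\bigl([\![A]\!]_w\bigr)\ \text{for some fixed } f \text{ and every world } w \]
    \[ F\ \text{intensional:}\qquad [\![F(A)]\!]_w\ \text{may depend on the whole function } \lambda w'.\,[\![A]\!]_{w'} \]

Negation, for instance, is extensional: the truth value of "not A" at a world is fixed by the truth value of A at that world. By contrast, "necessarily A" is intensional, since its truth value at a world depends on the truth values of A at other worlds as well.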

Natural languages abound with intensional functors; [9] this can be illustrated by intensional statements. Extensional logic cannot reach inside such fine logical structures of the language; it stops at a coarser level. Attempts at such deeper logical analysis have a long history: authors as early as Aristotle had already studied modal syllogisms. [10] Gottlob Frege developed a kind of two-dimensional semantics: to resolve questions like those raised by intensional statements, he distinguished two semantic values: sentences (and individual terms) have both an extension and an intension. [6] These semantic values can also be carried over to functors, except that intensional functors have only an intension.

As mentioned, the motivations for tackling problems that today belong to intensional logic have a long history. As for attempts at formalization, the development of calculi often preceded the discovery of their corresponding formal semantics. Intensional logic is not alone in this: Gottlob Frege, too, accompanied his (extensional) calculus with detailed explanations of its semantic motivations, but the formal foundation of its semantics appeared only in the 20th century. Thus the history of intensional logic sometimes repeated patterns seen earlier in the development of extensional logic. [11]

There are several systems of intensional logic that claim to analyze ordinary language fully:

Modal logic

Modal logic is historically the earliest area in the study of intensional logic, originally motivated by the formalization of "necessity" and "possibility" (nowadays this original motivation belongs to alethic logic, just one of the many branches of modal logic). [12]

Modal logic can also be regarded as the simplest form of such study: it extends extensional logic with just a few sentential functors. [13] These are intensional, and they are interpreted (in the metarules of semantics) as quantifying over possible worlds. For example, the necessity operator (the 'box') applied to a sentence A says that "(box)A" is true in world i if and only if A is true in all worlds accessible from world i. The corresponding possibility operator (the 'diamond') applied to A asserts that "(diamond)A" is true in world i if and only if A is true in some world (at least one) accessible from world i.

The exact semantic content of these assertions therefore depends crucially on the nature of the accessibility relation. For example, is world i accessible from itself? The answer to this question characterizes the precise nature of the system, and many such systems exist, answering moral and temporal questions (in a temporal system, the accessibility relation relates states or 'instants', and only the future is accessible from a given moment; the necessity operator then corresponds to 'for all future moments').

The operators are related to one another by dualities analogous to those relating the existential and universal quantifiers [14] (for example, by analogues of De Morgan's laws): something is necessary if and only if its negation is not possible. Syntactically, the operators are not quantifiers; they do not bind variables [15] but govern whole sentences. This gives rise to the problem of referential opacity, i.e. the problem of quantifying over or 'into' modal contexts. In the grammar the operators appear as sentential functors; [14] they are called modal operators. [15]
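
The possible-worlds reading of the box and the diamond can be made concrete with a small sketch. The following Python fragment is an illustration only: the names (worlds, access, valuation, successors, true_at) and the tuple encoding of formulas are invented for this example and are not part of any standard library or of the cited sources. It evaluates the two operators over an explicitly given accessibility relation and checks the De Morgan-style duality mentioned above.

    # A toy Kripke frame: worlds, an accessibility relation, and a valuation
    # assigning to each world the set of atomic sentences true there.
    worlds = {"w1", "w2", "w3"}
    access = {("w1", "w2"), ("w1", "w3"), ("w2", "w2")}   # pairs (w, w') with w R w'
    valuation = {"w1": {"p"}, "w2": {"p", "q"}, "w3": {"q"}}

    def successors(w):
        """Worlds accessible from w."""
        return {v for (u, v) in access if u == w}

    def true_at(w, formula):
        """Evaluate a formula at world w.
        Formulas are nested tuples: ("atom", "p"), ("not", f),
        ("box", f), ("diamond", f)."""
        op = formula[0]
        if op == "atom":
            return formula[1] in valuation[w]
        if op == "not":
            return not true_at(w, formula[1])
        if op == "box":        # true at w iff true at every world accessible from w
            return all(true_at(v, formula[1]) for v in successors(w))
        if op == "diamond":    # true at w iff true at some world accessible from w
            return any(true_at(v, formula[1]) for v in successors(w))
        raise ValueError(f"unknown operator: {op}")

    # Duality: "box A" is equivalent to "not diamond not A" at every world.
    p = ("atom", "p")
    assert all(
        true_at(w, ("box", p)) == true_at(w, ("not", ("diamond", ("not", p))))
        for w in worlds
    )

Changing the accessibility relation (for example, making it reflexive or transitive) changes which formulas come out valid, which is exactly the sense in which the choice of accessibility relation characterizes the system.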

As mentioned, precursors of modal logic include Aristotle. Medieval scholarly discussions accompanied its development, for example about de re versus de dicto modalities: put in modern terms, in the de re modality the modal functor is applied to an open sentence, and the variable is bound by a quantifier whose scope includes the whole intensional subterm. [10]
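
Expressed in modern quantified modal notation (an illustration; the specific formulas are not taken from the cited source), the contrast is one of scope between the quantifier and the modal operator:

    \[ \textit{de dicto:}\quad \Box\,\exists x\,P(x) \qquad\qquad \textit{de re:}\quad \exists x\,\Box P(x) \]

In the de re form the necessity operator governs the open sentence P(x), while the binding quantifier stands outside it, as described above.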

Modern modal logic began with Clarence Irving Lewis, whose work was motivated by establishing the notion of strict implication. [16] The possible-worlds approach enabled a more exact study of semantic questions; its exact formalization resulted in Kripke semantics, developed by Saul Kripke, Jaakko Hintikka, and Stig Kanger. [13]

Type-theoretical intensional logic

As early as 1951, Alonzo Church had developed an intensional calculus. Its semantic motivations were explained expressively but, of course, without the tools we now use for establishing the semantics of modal logic in a formal way, because these had not yet been invented: [17] Church did not provide formal semantic definitions. [18]

Later, the possible-worlds approach to semantics provided the tools for a comprehensive study of intensional semantics. Richard Montague was able to preserve the most important advantages of Church's intensional calculus in his own system. Unlike its forerunner, Montague grammar was built in a purely semantic way: a simpler treatment became possible thanks to the new formal tools invented since Church's work. [17]
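
A rough sketch of the type structure used in Montague-style intensional logic (a standard textbook presentation; the notation is not drawn from the sources cited here) indicates how intensions are handled type-theoretically. Alongside the basic types of entities and truth values, each type has an intensional counterpart built with a type s of possible-world indices:

    \[ e\ \text{(entities)}, \qquad t\ \text{(truth values)} \]
    \[ \langle a, b\rangle\ \text{-- functions from objects of type } a \text{ to objects of type } b \]
    \[ \langle s, a\rangle\ \text{-- intensions: functions from world indices to objects of type } a \]

On this scheme a sentence has an extension of type t and an intension of type ⟨s, t⟩ (a proposition), and an intensional functor can be treated as an ordinary function taking such intensions as its arguments.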

See also

Notes

  1. Ruzsa 2000, p. 10
  2. Ruzsa 2000, p. 13
  3. Ruzsa 2000, p. 12
  4. Ruzsa 2000, p. 21
  5. Ruzsa 2000, p. 22
  6. Ruzsa 2000, p. 24
  7. Ruzsa 2000, pp. 22–23
  8. Ruzsa 2000, pp. 25–26
  9. Ruzsa 1987, p. 724
  10. Ruzsa 2000, pp. 246–247
  11. Ruzsa 2000, p. 128
  12. Ruzsa 2000, p. 252
  13. Ruzsa 2000, p. 247
  14. Ruzsa 2000, p. 245
  15. Ruzsa 2000, p. 269
  16. Ruzsa 2000, p. 256
  17. Ruzsa 2000, p. 297
  18. Ruzsa 1989, p. 492

References