Algorithmic logic is a calculus of programs that allows the expression of semantic properties of programs by appropriate logical formulas. It provides a framework for proving these formulas from the axioms of program constructs, such as assignment, iteration and composition instructions, and from the axioms of the data structures in question; see Mirkowska & Salwicki (1987) and Banachowski et al. (1977).
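For example, if K is a program and α is a formula, algorithmic logic admits formulas of the form Kα, asserting that K terminates and that α holds in the resulting state. An illustrative instance over the integers (the concrete syntax varies between presentations) is x ≥ 0 → (while x ≠ 0 do x := x − 1 od)(x = 0), which states that whenever x is non-negative the loop terminates with x equal to 0; such formulas are proved from the axioms of iteration and of the underlying data structure.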
The following overview of related logics and concepts helps to locate algorithmic logic among other logics.
The formalized language of algorithmic logic (and of algorithmic theories of various data structures) contains three types of well-formed expressions: terms, i.e. expressions denoting operations on elements of data structures; formulas, i.e. expressions denoting relations among elements of data structures; and programs, i.e. algorithms, expressions that describe computations. For the semantics of terms and formulas, consult the pages on first-order logic and Tarski's semantics. The meaning of a program is the set of possible computations of the program.
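For instance, over the natural numbers (an illustrative example, not taken from the cited references), x + 1 is a term, x + 1 ≤ y is a formula, and while x ≠ y do x := x + 1 od is a program.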
Algorithmic logic is one of many logics of programs. Another logic of programs is dynamic logic; see Harel, Kozen & Tiuryn (2000).
First-order logic—also called predicate logic, predicate calculus, quantificational logic—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.
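One standard symbolic rendering of the example is ∃x (x = Socrates ∧ Man(x)), where ∃ is the existential quantifier, ∧ is conjunction, and Man is a unary predicate.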
In artificial intelligence, with implications for cognitive science, the frame problem describes an issue with using first-order logic to express facts about a robot in the world. Representing the state of a robot with traditional first-order logic requires the use of many axioms that simply imply that things in the environment do not change arbitrarily. For example, Hayes describes a "block world" with rules about stacking blocks together. In a first-order logic system, additional axioms are required to make inferences about the environment. The frame problem is the problem of finding adequate collections of axioms for a viable description of a robot environment.
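An illustrative frame axiom in situation-calculus style (a sketch, not Hayes's original formulation) is On(x, y, s) ∧ x ≠ u → On(x, y, do(move(u, v), s)): a block that is on another block remains there when a different block u is moved. One such axiom is needed for every pair of fluent and action that do not interact, which is what makes adequate collections of axioms hard to assemble.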
In logic, a logical connective is a logical constant. Connectives can be used to connect logical formulas. For instance, in the syntax of propositional logic, the binary connective ∨ can be used to join the two atomic formulas P and Q, rendering the complex formula P ∨ Q.
In mathematics and theoretical computer science, a type theory is the formal presentation of a specific type system. Type theory is the academic study of type systems.
In programming language theory and proof theory, the Curry–Howard correspondence is the direct relationship between computer programs and mathematical proofs.
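A minimal sketch of the idea in Python (a hypothetical example; the correspondence is usually stated for typed lambda calculi and natural deduction): a program inhabiting a function type can be read as a proof of the corresponding implication.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    # A term of type (A -> B) -> ((B -> C) -> (A -> C)); under the
    # Curry-Howard reading it witnesses the corresponding propositional
    # tautology about implications.
    return lambda a: g(f(a))
```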
Modal logic is a kind of logic used to represent statements about necessity and possibility. It plays a major role in philosophy and related fields as a tool for understanding concepts such as knowledge, obligation, and causation. For instance, in epistemic modal logic, the formula ◻P can be used to represent the statement that P is known. In deontic modal logic, that same formula can represent that P is a moral obligation. Modal logic considers the inferences that modal statements give rise to. For instance, most epistemic modal logics treat the formula ◻P → P as a tautology, representing the principle that only true statements can count as knowledge. However, this formula is not a tautology in deontic modal logic, since what ought to be true can be false.
In logic and mathematics, second-order logic is an extension of first-order logic, which itself is an extension of propositional logic. Second-order logic is in turn extended by higher-order logic and type theory.
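For example, the induction principle for the natural numbers can be expressed as a single second-order sentence, quantifying over all properties P: ∀P [ (P(0) ∧ ∀n (P(n) → P(n+1))) → ∀n P(n) ].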
In computer science, a loop invariant is a property of a program loop that is true before (and after) each iteration. It is a logical assertion, sometimes checked with a code assertion. Knowing a loop's invariant(s) is essential in understanding the effect of the loop.
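A minimal Python sketch (a hypothetical example): the comment states the invariant, and the assertion checks it at the start of each iteration.

```python
def sum_below(n: int) -> int:
    """Return 0 + 1 + ... + (n - 1)."""
    total, i = 0, 0
    while i < n:
        # Invariant: total == 0 + 1 + ... + (i - 1) == i * (i - 1) // 2
        assert total == i * (i - 1) // 2
        total += i
        i += 1
    return total
```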
Tarski's undefinability theorem, stated and proved by Alfred Tarski in 1933, is an important limitative result in mathematical logic, the foundations of mathematics, and in formal semantics. Informally, the theorem states that "arithmetical truth cannot be defined in arithmetic".
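In one common precise formulation: for a formal language of arithmetic strong enough to encode its own syntax via a Gödel numbering ⌜·⌝, there is no formula True(n) of the language such that True(⌜A⌝) ↔ A holds for every sentence A of the language.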
Default logic is a non-monotonic logic proposed by Raymond Reiter to formalize reasoning with default assumptions.
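The standard example of a default rule is bird(X) : flies(X) / flies(X), read: if X is a bird, and it is consistent to assume that X flies, then conclude that X flies.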
Kripke semantics is a formal semantics for non-classical logic systems created in the late 1950s and early 1960s by Saul Kripke and André Joyal. It was first conceived for modal logics, and later adapted to intuitionistic logic and other non-classical systems. The development of Kripke semantics was a breakthrough in the theory of non-classical logics, because the model theory of such logics was almost non-existent before Kripke.
The simply typed lambda calculus, a form of type theory, is a typed interpretation of the lambda calculus with only one type constructor that builds function types. It is the canonical and simplest example of a typed lambda calculus. The simply typed lambda calculus was originally introduced by Alonzo Church in 1940 as an attempt to avoid paradoxical use of the untyped lambda calculus.
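For example, the term λx:A. λy:B. x is simply typed with type A → (B → A): it takes an argument of type A, then an argument of type B, and returns the first argument.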
The situation calculus is a logic formalism designed for representing and reasoning about dynamical domains. It was first introduced by John McCarthy in 1963. The main version of the situation calculus presented in this article is based on the one introduced by Ray Reiter in 1991. It is followed by sections about McCarthy's 1986 version and a logic programming formulation.
Negation as failure is a non-monotonic inference rule in logic programming, used to derive not p (i.e. that p is assumed not to hold) from failure to derive p. Note that not p can be different from the statement ¬p of the logical negation of p, depending on the completeness of the inference algorithm and thus also on the formal logic system.
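For example, if a logic program's only clause is p ← q and there is no clause with head q, then every attempt to derive p fails, so not p is derived by negation as failure, even though ¬p is not a classical consequence of the program.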
The event calculus is a logical theory for representing and reasoning about events and about the way in which they change the state of some real or artificial world. It deals both with action events, which are performed by agents, and with external events, which are outside the control of any agent.
In computational complexity theory, the potential method is a technique used to analyze the amortized time and space complexity of a data structure, a measure of its performance over sequences of operations that smooths out the cost of infrequent but expensive operations.
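Concretely, one chooses a potential function Φ that assigns a non-negative number to each state of the data structure and defines the amortized cost of an operation as its actual cost plus the change in potential, Φ(after) − Φ(before); over any sequence of operations, the total actual cost then differs from the total amortized cost only by Φ(initial) − Φ(final). In the standard binary-counter example, taking Φ to be the number of 1 bits makes the amortized cost of every increment at most 2.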
In theoretical computer science, the modal μ-calculus is an extension of propositional modal logic by adding the least fixed point operator μ and the greatest fixed point operator ν, thus a fixed-point logic.
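For example (writing ◇ and ◻ for the modal operators), the formula μX.(p ∨ ◇X) asserts that a state satisfying p is reachable, while νX.(p ∧ ◻X) asserts that p holds invariantly along every path.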
Dexter Campbell Kozen is an American theoretical computer scientist. He is Professor Emeritus and Joseph Newton Pew, Jr. Professor in Engineering at Cornell University.
In logic, philosophy, and theoretical computer science, dynamic logic is an extension of modal logic capable of encoding properties of computer programs.
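Each program α gives rise to the modalities [α] and ⟨α⟩: the formula [α]φ states that φ holds after every terminating execution of α, while ⟨α⟩φ states that some execution of α terminates in a state satisfying φ. For instance, x ≥ 0 → [x := x + 1](x > 0) is valid over the integers.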