Calculus of structures

In mathematical logic, the calculus of structures is a proof calculus with deep inference for studying the structural proof theory of noncommutative logic. The calculus has since been applied to study linear logic, classical logic, modal logic, and process calculi, and many benefits are claimed to follow in these investigations from the way in which deep inference is made available in the calculus.
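For illustration (this specific rule is not given in the passage above, but it is standard in calculus-of-structures systems for classical logic such as SKS), a typical rule is the switch rule, which may be applied inside an arbitrary context $S\{\ \}$:

\[
  \frac{S\{(A \vee B) \wedge C\}}{S\{A \vee (B \wedge C)\}}\ \mathsf{s}
\]

Because the context $S\{\ \}$ may be arbitrarily deep, the rule can rewrite structures far below the root, which is the sense in which inference in this calculus is "deep".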

Related Research Articles

Discrete mathematics: Study of discrete mathematical structures

Discrete mathematics is the study of mathematical structures that can be considered "discrete" rather than "continuous". Objects studied in discrete mathematics include integers, graphs, and statements in logic. By contrast, discrete mathematics excludes topics in "continuous mathematics" such as real numbers, calculus or Euclidean geometry. Discrete objects can often be enumerated by integers; more formally, discrete mathematics has been characterized as the branch of mathematics dealing with countable sets. However, there is no exact definition of the term "discrete mathematics".

Propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes zeroth-order logic. It deals with propositions and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives. Propositions that contain no logical connectives are called atomic propositions.
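A minimal illustration (the formulas are ours, chosen only as examples): with atomic propositions $p$, $q$ and $r$, logical connectives build compound propositions such as

\[
  (p \wedge q) \to \neg r
  \qquad\text{and}\qquad
  p \vee (q \wedge \neg r).
\]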

Theorem: In mathematics, a statement that has been proved

In mathematics, a theorem is a statement that has been proved, or can be proved. The proof of a theorem is a logical argument that uses the inference rules of a deductive system to establish that the theorem is a logical consequence of the axioms and previously proved theorems.

In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning. This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.
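For example (standard rules, given here only as a sketch), natural deduction works with introduction and elimination rules such as conjunction introduction and implication introduction, the latter discharging a temporary assumption $[A]$:

\[
  \frac{A \qquad B}{A \wedge B}\ \wedge\mathrm{I}
  \qquad\qquad
  \frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \to B}\ {\to}\mathrm{I}
\]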

Proof theory is a major branch of mathematical logic and theoretical computer science within which proofs are treated as formal mathematical objects, facilitating their analysis by mathematical techniques. Proofs are typically presented as inductively-defined data structures such as lists, boxed lists, or trees, which are constructed according to the axioms and rules of inference of a given logical system. Consequently, proof theory is syntactic in nature, in contrast to model theory, which is semantic in nature.
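As a hedged sketch of the idea of proofs as inductively-defined data structures (the toy language and rule names below are our own, not a system discussed here), a proof tree can be written as an algebraic data type:

```haskell
-- Toy illustration: formulas and proof trees as inductively defined data.
data Formula = Atom String
             | And Formula Formula
             | Imp Formula Formula
  deriving (Eq, Show)

data Proof = Assume Formula        -- leaf: an assumed formula
           | AndIntro Proof Proof  -- from proofs of A and B, a proof of A `And` B
           | ImpElim Proof Proof   -- modus ponens: from A `Imp` B and A, conclude B
  deriving Show

-- Read off the formula that a proof tree establishes; this is partial and
-- fails if ImpElim is applied to mismatched subproofs.
conclusion :: Proof -> Formula
conclusion (Assume a)     = a
conclusion (AndIntro p q) = And (conclusion p) (conclusion q)
conclusion (ImpElim p q)  = case conclusion p of
  Imp a b | a == conclusion q -> b
  _                           -> error "ill-formed proof"
```

Analysing proofs then becomes ordinary structural recursion over such trees, which is what makes them amenable to mathematical (and mechanised) treatment.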

In philosophy of logic and logic, a rule of inference, inference rule or transformation rule is a logical form consisting of a function which takes premises, analyzes their syntax, and returns a conclusion.
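For example, modus ponens is the rule of inference that takes the premises $A \to B$ and $A$ and returns the conclusion $B$:

\[
  \frac{A \to B \qquad A}{B}
\]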

In mathematical logic, sequent calculus is a style of formal logical argumentation in which every line of a proof is a conditional tautology instead of an unconditional tautology. Each conditional tautology is inferred from other conditional tautologies on earlier lines in a formal argument according to rules and procedures of inference, giving a better approximation to the natural style of deduction used by mathematicians than David Hilbert's earlier style of formal logic, in which every line was an unconditional tautology. More subtle distinctions may exist; for example, propositions may implicitly depend upon non-logical axioms. In that case, sequents signify conditional theorems in a first-order language rather than conditional tautologies.
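To make the conditional reading concrete (a standard example, not specific to the passage above): a sequent $\Gamma \vdash \Delta$ asserts that if every formula in $\Gamma$ holds, then at least one formula in $\Delta$ holds, and inference rules combine whole sequents, as in the right conjunction rule of Gentzen-style systems:

\[
  \frac{\Gamma \vdash \Delta, A \qquad \Gamma \vdash \Delta, B}
       {\Gamma \vdash \Delta, A \wedge B}
\]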

In mathematical logic, a sequent is a very general kind of conditional assertion.

In programming language theory and proof theory, the Curry–Howard correspondence is the direct relationship between computer programs and mathematical proofs.
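A small hedged sketch in Haskell (standard textbook examples of the correspondence, not drawn from the passage above): types read as propositions, and well-typed programs read as proofs of those propositions.

```haskell
-- A proof of A -> A: the identity function.
identity :: a -> a
identity x = x

-- A proof of (A -> B) -> (B -> C) -> (A -> C): composition witnesses
-- the transitivity of implication.
trans :: (a -> b) -> (b -> c) -> (a -> c)
trans f g = g . f

-- A proof of (A and B) -> A: a pair models a conjunction, and projection
-- models conjunction elimination.
projL :: (a, b) -> a
projL = fst
```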

Noncommutative logic is an extension of linear logic that combines the commutative connectives of linear logic with the noncommutative multiplicative connectives of the Lambek calculus. Its sequent calculus relies on the structure of order varieties, and the correctness criterion for its proof nets is given in terms of partial permutations. It also has a denotational semantics in which formulas are interpreted by modules over some specific Hopf algebras.
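As a hedged note on what noncommutativity changes (a standard fact about the Lambek calculus rather than something stated in the passage): because the product $\bullet$ is not commutative, implication splits into a left and a right division, related to the product by residuation:

\[
  A \bullet B \vdash C
  \quad\Longleftrightarrow\quad
  B \vdash A \backslash C
  \quad\Longleftrightarrow\quad
  A \vdash C / B
\]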

In mathematical logic, structural proof theory is the subdiscipline of proof theory that studies proof calculi that support a notion of analytic proof, a kind of proof whose semantic properties are exposed. When all the theorems of a logic formalised in a structural proof theory have analytic proofs, then the proof theory can be used to demonstrate such things as consistency, provide decision procedures, and allow mathematical or computational witnesses to be extracted as counterparts to theorems, the kind of task that is more often given to model theory.

Calculus in its most general sense is any method or system of calculation.

In mathematical logic, deep inference names a general idea in structural proof theory that breaks with the classical sequent calculus by generalising the notion of structure to permit inference to occur in contexts of high structural complexity. The term deep inference is generally reserved for proof calculi where the structural complexity is unbounded; in this article, non-shallow inference refers to calculi whose structural complexity is greater than that of the sequent calculus but not unbounded, although this is not at present established terminology.
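As a concrete (purely illustrative) instance, take the switch rule shown earlier with the context $S\{\ \} = a \wedge (\{\ \} \vee e)$. The rule then rewrites a subformula two connectives below the root, something a shallow sequent-style rule, which only decomposes the outermost connective, cannot do directly:

\[
  \frac{a \wedge \bigl(((b \vee c) \wedge d) \vee e\bigr)}
       {a \wedge \bigl((b \vee (c \wedge d)) \vee e\bigr)}
\]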

In mathematical logic, a proof calculus or proof system is a framework, typically given by a formal language together with axioms and rules of inference, used to construct proofs of statements.

In mathematical logic, a judgment or assertion is a statement or enunciation in a metalanguage. For example, typical judgments in first-order logic would be that a string is a well-formed formula, or that a proposition is true. Similarly, a judgment may assert the occurrence of a free variable in an expression of the object language, or the provability of a proposition. In general, a judgment may be any inductively definable assertion in the metatheory.

Mathematics is a field of study that investigates topics such as number, space, structure, and change.

Logic: Study of correct reasoning

Logic is the study of correct reasoning. It includes both formal and informal logic. Formal logic is the science of deductively valid inferences or logical truths. It studies how conclusions follow from premises due to the structure of arguments alone, independent of their topic and content. Informal logic is associated with informal fallacies, critical thinking, and argumentation theory. It examines arguments expressed in natural language while formal logic uses formal language. When used as a countable noun, the term "a logic" refers to a logical formal system that articulates a proof system. Logic plays a central role in many fields, such as philosophy, mathematics, computer science, and linguistics.

Cirquent calculus

Cirquent calculus is a proof calculus that manipulates graph-style constructs termed cirquents, as opposed to the traditional tree-style objects such as formulas or sequents. Cirquents come in a variety of forms, but they all share one main characteristic feature, making them different from the more traditional objects of syntactic manipulation. This feature is the ability to explicitly account for possible sharing of subcomponents between different components. For instance, it is possible to write an expression where two subexpressions F and E, while neither one is a subexpression of the other, still have a common occurrence of a subexpression G.
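A hedged, generic sketch of the sharing idea in Haskell (ordinary graph representation, not cirquent-calculus syntax): storing nodes in a table and referring to them by identifier lets two components point at one and the same occurrence of a subcomponent.

```haskell
import qualified Data.Map as Map

-- Nodes are stored once and referenced by identifier, so the components
-- F and E below share a single occurrence of the subcomponent G.
type NodeId = Int

data Node = Atom String
          | Conj NodeId NodeId
          | Disj NodeId NodeId
  deriving Show

sharedExpr :: Map.Map NodeId Node
sharedExpr = Map.fromList
  [ (0, Atom "G")
  , (1, Atom "A")
  , (2, Atom "B")
  , (3, Conj 0 1)   -- F = G /\ A
  , (4, Conj 0 2)   -- E = G /\ B, sharing node 0 with F
  , (5, Disj 3 4)   -- the whole expression, with G shared between F and E
  ]
```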
