Predicate transformer semantics


Predicate transformer semantics were introduced by Edsger Dijkstra in his seminal paper "Guarded commands, nondeterminacy and formal derivation of programs". They define the semantics of an imperative programming language by assigning to each statement of the language a corresponding predicate transformer: a total function mapping predicates on the state space of the statement to predicates on the same state space. In this sense, predicate transformer semantics are a kind of denotational semantics. Actually, in guarded commands, Dijkstra uses only one kind of predicate transformer: the well-known weakest preconditions (see below).


Moreover, predicate transformer semantics are a reformulation of Floyd–Hoare logic. Whereas Hoare logic is presented as a deductive system, predicate transformer semantics (either by weakest preconditions or by strongest postconditions, see below) are complete strategies for building valid deductions of Hoare logic. In other words, they provide an effective algorithm to reduce the problem of verifying a Hoare triple to the problem of proving a first-order formula. Technically, predicate transformer semantics perform a kind of symbolic execution of statements into predicates: execution runs backward in the case of weakest preconditions, and forward in the case of strongest postconditions.

Weakest preconditions

Definition

For a statement S and a postcondition R, a weakest precondition is a predicate Q such that for any precondition P, {P} S {R} if and only if P ⇒ Q. In other words, it is the "loosest" or least restrictive requirement needed to guarantee that R holds after S. Uniqueness follows easily from the definition: if both Q and Q' are weakest preconditions, then by the definition {Q'} S {R}, so Q' ⇒ Q, and {Q} S {R}, so Q ⇒ Q'; thus Q ≡ Q'. We often write wp(S, R) to denote the weakest precondition of statement S with respect to a postcondition R.
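
As a semantic illustration (not part of the original article), predicates can be modelled as Boolean functions on states and statements as state-to-state functions; the defining property of wp can then be spot-checked by brute force over a small finite state space. In the Python sketch below all names (states, triple_holds, wp_candidate, ...) are assumptions made for illustration; wp_candidate plays the role of wp(x := x − 5, x > 10).

    # Model: a state maps variable names to integers; predicates are Boolean
    # functions on states; a statement is a state-to-state function.
    states = [{"x": v} for v in range(-20, 21)]        # a small finite state space

    def stmt(s):                  # the statement S:  x := x - 5
        return {"x": s["x"] - 5}

    def post(s):                  # the postcondition R:  x > 10
        return s["x"] > 10

    def wp_candidate(s):          # candidate weakest precondition Q:  x > 15
        return s["x"] > 15

    def triple_holds(pre, body, post_):
        # Hoare triple {pre} body {post_}: every state satisfying pre
        # ends, after body, in a state satisfying post_.
        return all(post_(body(s)) for s in states if pre(s))

    def implies(p, q):
        return all(q(s) for s in states if p(s))

    # Defining property: for every precondition P, {P} S {R}  iff  P => Q.
    # Spot-check it for a few candidate preconditions P.
    for pre in (lambda s: s["x"] > 15, lambda s: s["x"] > 20, lambda s: s["x"] > 12):
        assert triple_holds(pre, stmt, post) == implies(pre, wp_candidate)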

Conventions

We use T to denote the predicate that is everywhere true and F to denote the one that is everywhere false. At least conceptually, these should not be confused with Boolean expressions defined by some language syntax, which may also contain true and false as Boolean scalars. For such scalars a type coercion is needed, so that T = predicate(true) and F = predicate(false). This promotion is carried out so often and so casually that people tend to use T and true (and likewise F and false) interchangeably.

Skip

wp(skip, R)  =  R

Abort

wp(abort, R)  =  F

Assignment

We give below two equivalent weakest-preconditions for the assignment statement. In these formulas, R[x ← E] is a copy of R in which free occurrences of x are replaced by E. Hence, here, expression E is implicitly coerced into a valid term of the underlying logic: it is thus a pure expression, totally defined, terminating and without side effect.

Version 1:

wp(x := E, R)  =  ∀y. ((y = E) ⇒ R[x ← y])

where y is a fresh variable, not free in E or R (representing the final value of variable x).

Provided that E is well defined, we just apply the so-called one-point rule to version 1. Then version 2 is

wp(x := E, R)  =  R[x ← E]

The first version avoids a potential duplication of E in R (when x occurs several times in R), whereas the second version is simpler when there is at most a single occurrence of x in R. The first version also reveals a deep duality between weakest-precondition and strongest-postcondition (see below).

An example of a valid calculation of wp (using version 2) for an assignment with an integer-valued variable x is:

wp(x := x − 5, x > 10)  =  (x − 5 > 10)  =  (x > 15)

This means that in order for the postcondition x > 10 to be true after the assignment, the precondition x > 15 must be true before the assignment. This is also the "weakest precondition", in that it is the "weakest" restriction on the value of x which makes x > 10 true after the assignment.
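
Version 2 admits a direct semantic reading: R must hold of the state obtained by updating x with the value of E. The Python sketch below (illustrative only; wp_assign is an assumed helper, not an established API) replays the example above.

    # Semantic reading of the assignment rule wp(x := E, R) = R[x <- E]:
    # R evaluated in the state where x has been updated with E's value.
    def wp_assign(var, expr, post):
        return lambda s: post({**s, var: expr(s)})

    # wp(x := x - 5, x > 10) should be exactly x > 15.
    pre = wp_assign("x", lambda s: s["x"] - 5, lambda s: s["x"] > 10)
    assert all(pre({"x": v}) == (v > 15) for v in range(-100, 100))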

Sequence

The weakest precondition of a sequence of statements is obtained by composing the weakest preconditions:

wp(S1 ; S2, R)  =  wp(S1, wp(S2, R))

For example,

wp(x := x − 5 ; x := x · 2, x > 20)  =  wp(x := x − 5, 2·x > 20)  =  (2·(x − 5) > 20)  =  (x > 15)
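
A minimal sketch of the sequence rule, reusing the same semantic reading of assignments; wp_assign and wp_seq are illustrative names, not part of the article.

    def wp_assign(var, expr, post):
        return lambda s: post({**s, var: expr(s)})

    def wp_seq(wp1, wp2, post):
        # wp(S1; S2, R) = wp(S1, wp(S2, R)), where wp1 and wp2 are the
        # transformers of S1 and S2 respectively.
        return wp1(wp2(post))

    # wp(x := x - 5; x := x * 2, x > 20) should be exactly x > 15.
    wp1 = lambda post: wp_assign("x", lambda s: s["x"] - 5, post)
    wp2 = lambda post: wp_assign("x", lambda s: s["x"] * 2, post)
    pre = wp_seq(wp1, wp2, lambda s: s["x"] > 20)
    assert all(pre({"x": v}) == (v > 15) for v in range(-100, 100))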

Conditional

wp(if E then S1 else S2 end, R)  =  (E ⇒ wp(S1, R)) ∧ (¬E ⇒ wp(S2, R))

As an example:

wp(if x < 0 then x := −x else skip end, x ≥ 0)  =  (x < 0 ⇒ −x ≥ 0) ∧ (x ≥ 0 ⇒ x ≥ 0)  =  T
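
A sketch of the conditional rule under the same semantic reading; wp_if, wp_negate and wp_skip are illustrative names only.

    # wp(if B then S1 else S2, R) = (B => wp(S1, R)) and (not B => wp(S2, R))
    def wp_if(guard, wp_then, wp_else, post):
        return lambda s: ((not guard(s)) or wp_then(post)(s)) and \
                         (guard(s) or wp_else(post)(s))

    wp_negate = lambda post: (lambda s: post({**s, "x": -s["x"]}))   # x := -x
    wp_skip = lambda post: post                                      # skip

    # wp(if x < 0 then x := -x else skip, x >= 0) should be everywhere true.
    pre = wp_if(lambda s: s["x"] < 0, wp_negate, wp_skip, lambda s: s["x"] >= 0)
    assert all(pre({"x": v}) for v in range(-50, 50))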

While loop

Partial Correctness

Ignoring termination for a moment, we can define the rule for the weakest liberal precondition, denoted wlp, using a predicate INV, called the loop invariant, typically supplied by the programmer:

wlp(while E do S done, R)  =  INV

provided that both (INV ∧ E) ⇒ wlp(S, INV) and (INV ∧ ¬E) ⇒ R hold.

Total Correctness

To show total correctness, we also have to show that the loop terminates. For this we define a well-founded relation on the state space, denoted (wfs, <), and define a variant function vf, such that we have:

wp(while E do S done, R)  =  INV

provided that the following conjunction holds:

    ((INV ∧ E) ⇒ vf ∈ wfs)
  ∧ ((INV ∧ E ∧ v = vf) ⇒ wp(S, INV ∧ vf < v))
  ∧ ((INV ∧ ¬E) ⇒ R)

where v is a fresh tuple of variables (representing the value of the variant before execution of the loop body)

Informally, in the above conjunction of three formulas:

  • the first one means that the variant must be part of the well-founded relation before entering the loop;
  • the second one means that the body of the loop (i.e. statement S) must preserve the invariant and reduce the variant;
  • the last one means that the loop postcondition R must be established when the loop finishes.

However, the conjunction of those three formulas is only a sufficient condition, not a necessary one. More precisely, we have

INV ∧ ((INV ∧ E) ⇒ vf ∈ wfs) ∧ ((INV ∧ E ∧ v = vf) ⇒ wp(S, INV ∧ vf < v)) ∧ ((INV ∧ ¬E) ⇒ R)   ⇒   wp(while E do S done, R)

whereas the converse implication does not hold in general.
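
As an illustration (not taken from the article), the three proof obligations can be checked mechanically for the simple loop while x > 0 do x := x − 1 done with invariant x ≥ 0, variant x (well-founded on the natural numbers), and postcondition x = 0. All names in the Python sketch below are assumptions made for illustration.

    guard = lambda s: s["x"] > 0          # E
    inv   = lambda s: s["x"] >= 0         # INV
    vf    = lambda s: s["x"]              # variant function
    post  = lambda s: s["x"] == 0         # R
    body_wp = lambda q: (lambda s: q({**s, "x": s["x"] - 1}))   # wp of  x := x - 1

    states = [{"x": v} for v in range(-10, 11)]

    # 1. on entry to the body, the variant lies in the well-founded set (here: N)
    ok1 = all(vf(s) >= 0 for s in states if guard(s) and inv(s))
    # 2. the body preserves the invariant and strictly decreases the variant
    ok2 = all(body_wp(lambda t, v=vf(s): inv(t) and vf(t) < v)(s)
              for s in states if guard(s) and inv(s))
    # 3. on exit, the invariant and the negated guard establish the postcondition
    ok3 = all(post(s) for s in states if inv(s) and not guard(s))

    assert ok1 and ok2 and ok3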

Non-deterministic guarded commands

Actually, Dijkstra's Guarded Command Language (GCL) is an extension of the simple imperative language presented so far with non-deterministic statements. Indeed, GCL aims to be a formal notation for defining algorithms. Non-deterministic statements represent choices left to the actual implementation (in an effective programming language): properties proved of non-deterministic statements are ensured for all possible choices of implementation. In other words, the weakest precondition of a non-deterministic statement ensures that every possible execution terminates in a state satisfying the postcondition.

Notice that the definitions of weakest-precondition given above (in particular for while-loop) preserve this property.

Selection

Selection is a generalization of the if statement:

if B1 → S1 ⫿ … ⫿ Bn → Sn fi

Its weakest precondition is

wp(if B1 → S1 ⫿ … ⫿ Bn → Sn fi, R)  =  (B1 ∨ … ∨ Bn) ∧ (B1 ⇒ wp(S1, R)) ∧ … ∧ (Bn ⇒ wp(Sn, R))

Here, when two guards Bi and Bj are simultaneously true, execution of this statement can run any of the associated statements Si or Sj.
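
A sketch of the selection rule applied to Dijkstra's classic maximum computation; wp_select and the branch transformers are illustrative names only.

    # wp(if B1 -> S1 | ... | Bn -> Sn fi, R)
    #   = (B1 or ... or Bn) and (B1 => wp(S1, R)) and ... and (Bn => wp(Sn, R))
    def wp_select(branches, post):
        # branches is a list of (guard, wp) pairs, one per guarded command
        def pre(s):
            some_guard = any(g(s) for g, _ in branches)   # no guard true: abort
            every_enabled_branch_ok = all((not g(s)) or w(post)(s) for g, w in branches)
            return some_guard and every_enabled_branch_ok
        return pre

    # if x >= y -> m := x  |  y >= x -> m := y  fi,  postcondition m = max(x, y)
    wp_m_x = lambda post: (lambda s: post({**s, "m": s["x"]}))
    wp_m_y = lambda post: (lambda s: post({**s, "m": s["y"]}))
    pre = wp_select([(lambda s: s["x"] >= s["y"], wp_m_x),
                     (lambda s: s["y"] >= s["x"], wp_m_y)],
                    lambda s: s["m"] == max(s["x"], s["y"]))

    # the weakest precondition is everywhere true: the selection always computes the maximum
    assert all(pre({"x": a, "y": b, "m": 0}) for a in range(-5, 6) for b in range(-5, 6))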

Repetition

Repetition is a generalization of the while statement in a similar way.

Specification statement

Refinement calculus extends GCL with the notion of specification statement. Syntactically, we prefer to write a specification statement as

x:[pre, post]

which specifies a computation that starts in a state satisfying pre and is guaranteed to end in a state satisfying post by changing only x. We call x₀ a logical constant employed to aid in a specification; it names the value of x in the initial state. For example, we can specify a computation that increments x by 1 as

x:[x = x₀, x = x₀ + 1]

Another example is a computation of a square root of an integer.

The specification statement appears like a primitive in the sense that it does not contain other statements. However, it is very expressive, as pre and post are arbitrary predicates. Its weakest precondition is as follows.

where s is fresh.

It combines Morgan's syntactic idea with the sharpness idea of Bijlsma, Matthews and Wiltink. [1] The main advantage of this is its capability of defining the wp of goto L and other jump statements. [2]

Goto statement

The formalization of jump statements like goto L has had a long and bumpy history. A common belief seems to be that the goto statement can only be argued about operationally. This is probably due to a failure to recognize that goto L, taken in itself, is actually miraculous (i.e. non-strict) and does not obey Dijkstra's Law of the Excluded Miracle. Yet it enjoys an extremely simple operational view from the weakest-precondition perspective, which was unexpected. We define

wp(goto L, R)  =  wpL

where wpL is the weakest precondition at label L.

For goto L, execution transfers control to label L, at which point the weakest precondition wpL has to hold. The way wpL is referred to in the rule should not come as a surprise: it is simply the weakest precondition computed at that label for whatever postcondition Q has been established up to that point. This is like any other wp rule, which uses constituent statements to give wp definitions, even though goto L appears to be a primitive. The rule does not require uniqueness of the locations where wpL holds within a program, so in theory the same label may appear in multiple locations as long as the weakest precondition at each location is the same wpL; the goto statement can then jump to any such location. This in fact justifies placing the same label several times at the same location, as L: L: S, which is the same as L: S. Also, the rule does not imply any scoping restriction, thus allowing a jump into a loop body, for example. Let us calculate wp of the following program S, which has a jump into the loop body.

     wp(do x > 0 → L: x := x-1 od;  if x < 0 → x := -x; goto L ⫿ x ≥ 0 → skip fi,  post)
   =    { sequential composition and alternation rules }
     wp(do x > 0 → L: x := x-1 od,  (x < 0 ∧ wp(x := -x; goto L, post)) ∨ (x ≥ 0 ∧ post))
   =    { sequential composition, goto, assignment rules }
     wp(do x > 0 → L: x := x-1 od,  x < 0 ∧ wpL(x ← -x)  ∨  x ≥ 0 ∧ post)
   =    { repetition rule }
     the strongest solution of
          Z: [ Z ≡ x > 0 ∧ wp(L: x := x-1, Z)  ∨  x < 0 ∧ wpL(x ← -x)  ∨  x = 0 ∧ post ]
   =    { assignment rule, found wpL = Z(x ← x-1) }
     the strongest solution of
          Z: [ Z ≡ x > 0 ∧ Z(x ← x-1)  ∨  x < 0 ∧ Z(x ← x-1)(x ← -x)  ∨  x = 0 ∧ post ]
   =    { substitution }
     the strongest solution of
          Z: [ Z ≡ x > 0 ∧ Z(x ← x-1)  ∨  x < 0 ∧ Z(x ← -x-1)  ∨  x = 0 ∧ post ]
   =    { solve the equation by approximation }
     post(x ← 0)

Therefore,

wp(S, post) = post(x ← 0).

Other predicate transformers

Weakest liberal precondition

An important variant of the weakest precondition is the weakest liberal precondition wlp(S, R), which yields the weakest condition under which S either does not terminate or establishes R. It therefore differs from wp in not guaranteeing termination, and corresponds to Hoare logic in partial correctness: for the statement language given above, wlp differs from wp only on the while loop, in not requiring a variant (see above).
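
The difference can be observed concretely. For the loop S = while x ≠ 0 do x := x − 1 done, every terminating run ends with x = 0, but the loop diverges when x is initially negative; hence wlp(S, x = 0) is everywhere true, whereas wp(S, x = 0) is x ≥ 0. The Python sketch below is illustrative only; it uses an explicit step budget that is conclusive for this particular program on the sampled states.

    def run(x, budget=50):
        # Execute  while x != 0: x := x - 1  and return the final value of x,
        # or None if the loop does not terminate. A terminating run from
        # |x| <= 10 takes at most 10 steps, so the budget is conclusive here.
        while x != 0:
            if budget == 0:
                return None
            x, budget = x - 1, budget - 1
        return x

    for x0 in range(-10, 11):
        final = run(x0)
        wlp_holds = (final is None) or (final == 0)      # diverges, or establishes R
        wp_holds = (final is not None) and (final == 0)  # terminates and establishes R
        assert wlp_holds                                 # wlp(S, x = 0) = T
        assert wp_holds == (x0 >= 0)                     # wp(S, x = 0) = (x >= 0)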

Strongest postcondition

Given a statement S and a precondition R (a predicate on the initial state), sp(S, R) is their strongest postcondition: it implies any postcondition satisfied by the final state of any execution of S, for any initial state satisfying R. In other words, a Hoare triple {P} S {Q} is provable in Hoare logic if and only if the predicate below holds:

sp(S, P) ⇒ Q

Usually, strongest postconditions are used in partial correctness. Hence, we have the following relation between weakest liberal preconditions and strongest postconditions:

(P ⇒ wlp(S, Q))  if and only if  (sp(S, P) ⇒ Q)

For example, on assignment we have:

sp(x := E, R)  =  ∃y. (x = E[x ← y]) ∧ R[x ← y]

where y is fresh

Above, the logical variable y represents the initial value of variable x. Hence, for example,

sp(x := x − 5, x > 15)  =  ∃y. (x = y − 5) ∧ (y > 15)

which simplifies to x > 10.

On sequence, it appears that sp runs forward (whereas wp runs backward):

sp(S1 ; S2, R)  =  sp(S2, sp(S1, R))
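
Semantically, sp maps the set of initial states satisfying the precondition forward through the statement, and composing such forward images yields the sequence rule above. The small Python sketch below is illustrative only (the helper sp is an assumed name, and the check is restricted to a finite sample of states).

    states = [{"x": v} for v in range(-20, 21)]

    def sp(stmt, pre):
        # Strongest postcondition of stmt (a state-to-state function) with
        # respect to pre, as a predicate on final states: the forward image.
        reachable = [stmt(s) for s in states if pre(s)]
        return lambda t: t in reachable

    # sp(x := x + 1, x >= 0) should be x >= 1 on the sampled states.
    post = sp(lambda s: {"x": s["x"] + 1}, lambda s: s["x"] >= 0)
    assert all(post({"x": v}) == (v >= 1) for v in range(-19, 21))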

Win and sin predicate transformers

Leslie Lamport has suggested win and sin as predicate transformers for concurrent programming. [3]

Predicate transformers properties

This section presents some characteristic properties of predicate transformers. [4] Below, S denotes a predicate transformer (a function mapping predicates on the state space to predicates on the state space) and P a predicate. For instance, S(P) may denote wp(S, P) or sp(S, P). We keep x as the variable of the state space.

Monotonic

Predicate transformers of interest (wp, wlp, and sp) are monotonic. A predicate transformer S is monotonic if and only if: whenever P ⇒ Q holds, S(P) ⇒ S(Q) holds as well.

This property is related to the consequence rule of Hoare logic.

Strict

A predicate transformer S is strict iff: S(F) ≡ F.

For instance, wp is artificially made strict, whereas wlp is generally not. In particular, if statement S may not terminate then wlp(S, F) is satisfiable. We have

wlp(while true do skip done, F)  =  T

Indeed, T is a valid invariant of that loop.

The non-strict but monotonic or conjunctive predicate transformers are called miraculous and can also be used to define a class of programming constructs, in particular jump statements, which Dijkstra cared less about. Those jump statements include plain goto L, break and continue in a loop, return statements in a procedure body, exception handling, etc. It turns out that all jump statements are executable miracles, [5] i.e. they can be implemented even though they are not strict.

Terminating

A predicate transformer S is terminating iff: S(T) ≡ T.

Actually, this terminology makes sense only for strict predicate transformers: indeed, wp(S, T) is the weakest precondition ensuring termination of S.

It seems that naming this property non-aborting would be more appropriate: in total correctness, non-termination is abortion, whereas in partial correctness, it is not.

Conjunctive

A predicate transformer S is conjunctive iff: S(P ∧ Q) ≡ S(P) ∧ S(Q).

This is the case for wp(S, ·), even if statement S is non-deterministic, such as a selection statement or a specification statement.

Disjunctive

A predicate transformer S is disjunctive iff: S(P ∨ Q) ≡ S(P) ∨ S(Q).

This is generally not the case for wp(S, ·) when S is non-deterministic. Indeed, consider a non-deterministic statement S choosing an arbitrary boolean. This statement is given here as the following selection statement:

S  =  if true → x := 0 ⫿ true → x := 1 fi

Then, wp(S, R) reduces to the formula R[x ← 0] ∧ R[x ← 1].

Hence, wp(S, x = 0 ∨ x = 1) reduces to the tautology (0 = 0 ∨ 0 = 1) ∧ (1 = 0 ∨ 1 = 1).

Whereas the formula wp(S, x = 0) ∨ wp(S, x = 1) reduces to (0 = 0 ∧ 1 = 0) ∨ (0 = 1 ∧ 1 = 1), i.e. to F.
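
The counterexample can be replayed mechanically; in the Python sketch below (illustrative names only), the wp of the nondeterministic choice is encoded directly as the conjunction R[x ← 0] ∧ R[x ← 1].

    # S = (x := 0  []  x := 1):  wp(S, R) = R[x <- 0] and R[x <- 1]
    def wp_choice(post):
        return lambda s: post({**s, "x": 0}) and post({**s, "x": 1})

    r1 = lambda s: s["x"] == 0
    r2 = lambda s: s["x"] == 1
    r_or = lambda s: r1(s) or r2(s)

    s0 = {"x": 7}                                         # any initial state
    assert wp_choice(r_or)(s0)                            # wp(S, R1 or R2) = T
    assert not (wp_choice(r1)(s0) or wp_choice(r2)(s0))   # wp(S, R1) or wp(S, R2) = F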

Applications

Beyond predicate transformers

Weakest-preconditions and strongest-postconditions of imperative expressions

In predicate transformer semantics, expressions are restricted to terms of the logic (see above). However, this restriction seems too strong for most existing programming languages, where expressions may have side effects (a call to a function having a side effect), may not terminate, or may abort (like a division by zero). There are many proposals to extend weakest preconditions or strongest postconditions to imperative expression languages, and in particular to monads.

Among them, Hoare Type Theory combines Hoare logic for a Haskell-like language, separation logic and type theory. [9] This system is currently implemented as a Coq library called Ynot. [10] In this language, evaluation of expressions corresponds to computations of strongest-postconditions.

Probabilistic Predicate Transformers

Probabilistic predicate transformers are an extension of predicate transformers for probabilistic programs. Indeed, such programs have many applications in cryptography (e.g. hiding information using randomized noise) and in distributed systems (e.g. symmetry breaking). [11]


Notes

  1. Chen, Wei; Udding, Jan Tijmen (1989). "The Specification Statement Refined". WUCS-89-37. https://openscholarship.wustl.edu/cse_research/749
  2. Chen, Wei (2021). "A wp Characterization of Jump Statements". 2021 International Symposium on Theoretical Aspects of Software Engineering (TASE), pp. 15–22. doi:10.1109/TASE52547.2021.00019.
  3. Lamport, Leslie (July 1990). "win and sin: Predicate Transformers for Concurrency". ACM Trans. Program. Lang. Syst. 12 (3): 396–428. CiteSeerX 10.1.1.33.90. doi:10.1145/78969.78970. S2CID 209901.
  4. Back, Ralph-Johan; Wright, Joakim (2012) [1978]. Refinement Calculus: A Systematic Introduction. Texts in Computer Science. Springer. ISBN 978-1-4612-1674-2.
  5. Chen, Wei (1991). "Exit Statements are Executable Miracles". WUCS-91-53. https://openscholarship.wustl.edu/cse_research/671
  6. Dijkstra, Edsger W. (1968). "A Constructive Approach to the Problem of Program Correctness". BIT Numerical Mathematics. 8 (3): 174–186. doi:10.1007/bf01933419. S2CID 62224342.
  7. Wirth, N. (April 1971). "Program development by stepwise refinement" (PDF). Comm. ACM. 14 (4): 221–227. doi:10.1145/362575.362577. hdl:20.500.11850/80846. S2CID 13214445.
  8. Tutorial on Hoare Logic: a Coq library, giving a simple but formal proof that Hoare logic is sound and complete with respect to an operational semantics.
  9. Nanevski, Aleksandar; Morrisett, Greg; Birkedal, Lars (September 2008). "Hoare Type Theory, Polymorphism and Separation" (PDF). Journal of Functional Programming. 18 (5–6): 865–911. doi:10.1017/S0956796808006953. S2CID 6956622.
  10. Ynot: a Coq library implementing Hoare Type Theory.
  11. Morgan, Carroll; McIver, Annabelle; Seidel, Karen (May 1996). "Probabilistic Predicate Transformers" (PDF). ACM Trans. Program. Lang. Syst. 18 (3): 325–353. CiteSeerX 10.1.1.41.9219. doi:10.1145/229542.229547. S2CID 5812195.
