Normal form (abstract rewriting)

In abstract rewriting, [1] an object is in normal form if it cannot be rewritten any further, i.e. it is irreducible. Depending on the rewriting system, an object may rewrite to several normal forms or none at all. Many properties of rewriting systems relate to normal forms.

Definitions

Stated formally, if (A, →) is an abstract rewriting system, then x ∈ A is in normal form if no y ∈ A exists such that x → y; that is, x is an irreducible term.

An object a is weakly normalizing if there exists at least one sequence of rewrites starting from a that eventually yields a normal form. A rewriting system has the weak normalization property, or is (weakly) normalizing (WN), if every object is weakly normalizing. An object a is strongly normalizing if every sequence of rewrites starting from a eventually terminates in a normal form. An abstract rewriting system is strongly normalizing, terminating, noetherian, or has the (strong) normalization property (SN), if each of its objects is strongly normalizing. [2]

A rewriting system has the normal form property (NF) if for all objects a and normal forms b, b can be reached from a by a series of rewrites and inverse rewrites only if a reduces to b. A rewriting system has the unique normal form property (UN) if for all normal forms a, b ∈ S, a can be reached from b by a series of rewrites and inverse rewrites only if a is equal to b. A rewriting system has the unique normal form property with respect to reduction (UN→) if for every term reducing to normal forms a and b, a is equal to b. [3]
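
For a finite system, these definitions can be checked mechanically. The following Python sketch (the representation and function names are illustrative, not standard) models an ARS as a dictionary mapping every object to its set of one-step successors, with normal forms mapping to the empty set:

# Every object appears as a key; normal forms map to the empty set.
def is_normal_form(ars, x):
    """x is in normal form iff there is no y with x -> y."""
    return not ars[x]

def reachable_normal_forms(ars, x):
    """All normal forms reachable from x by zero or more rewrite steps."""
    seen, stack, nfs = set(), [x], set()
    while stack:
        t = stack.pop()
        if t in seen:
            continue
        seen.add(t)
        if is_normal_form(ars, t):
            nfs.add(t)
        stack.extend(ars[t])
    return nfs

def is_weakly_normalizing(ars):
    """WN: every object reaches at least one normal form."""
    return all(reachable_normal_forms(ars, x) for x in ars)

def is_strongly_normalizing(ars):
    """SN: no infinite rewrite sequence exists.  For a finite ARS this
    is equivalent to the rewrite graph containing no reachable cycle."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {x: WHITE for x in ars}
    def has_cycle(x):
        color[x] = GREY
        for y in ars[x]:
            if color[y] == GREY or (color[y] == WHITE and has_cycle(y)):
                return True
        color[x] = BLACK
        return False
    return not any(color[x] == WHITE and has_cycle(x) for x in ars)

For example, is_strongly_normalizing({"a": {"b"}, "b": {"b"}}) is False because of the loop b → b; is_weakly_normalizing on the same system is also False, since b never reaches a normal form.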

Results

This section presents some well-known results. First, SN implies WN: if every rewrite sequence from an object terminates in a normal form, then in particular some sequence does. [4]

Confluence (abbreviated CR) implies NF, which implies UN, which in turn implies UN→. [3] The reverse implications do not hold in general. The system {a→b, a→c, c→c, d→c, d→e} is UN→ but not UN, as b = e (they are convertible) and b, e are distinct normal forms. The system {a→b, a→c, b→b} is UN but not NF, as b = c, c is a normal form, and b does not reduce to c. The system {a→b, a→c, b→b, c→c} is NF, since there are no normal forms, but not CR, as a reduces to both b and c, and b, c have no common reduct.
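
These counterexamples can be verified directly. The following self-contained sketch (illustrative names) tests the first system: convertibility classes are the connected components of the undirected rewrite graph, so UN fails when two distinct normal forms share a component, while UN→ holds because every object reduces to at most one normal form:

# The counterexample system {a→b, a→c, c→c, d→c, d→e}.
edges = [("a", "b"), ("a", "c"), ("c", "c"), ("d", "c"), ("d", "e")]
objects = {x for edge in edges for x in edge}
succ = {x: {b for a, b in edges if a == x} for x in objects}
normal_forms = {x for x in objects if not succ[x]}        # {'b', 'e'}

# Union-find over the undirected edges yields the convertibility classes.
parent = {x: x for x in objects}
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x
for a, b in edges:
    parent[find(a)] = find(b)

def reachable_nfs(x, seen=frozenset()):
    """Normal forms reachable from x by zero or more rewrites."""
    if x in seen:
        return set()
    nfs = {x} if x in normal_forms else set()
    for y in succ[x]:
        nfs |= reachable_nfs(y, seen | {x})
    return nfs

# UN: no two distinct normal forms are convertible.
un = all(a == b or find(a) != find(b)
         for a in normal_forms for b in normal_forms)
# UN→: every object reduces to at most one normal form.
un_arrow = all(len(reachable_nfs(x)) <= 1 for x in objects)
print(un, un_arrow)    # False True: the system is UN→ but not UN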

Conversely, WN and UN→ together imply confluence. Hence CR, NF, UN, and UN→ all coincide if WN holds. [5]

Examples

One example is that simplifying arithmetic expressions produces a number: in arithmetic, all numbers are normal forms. A remarkable fact is that every arithmetic expression has a unique value, so this rewriting system is strongly normalizing and confluent: [6]

(3 + 5) * (1 + 2) ⇒ 8 * (1 + 2) ⇒ 8 * 3 ⇒ 24
(3 + 5) * (1 + 2) ⇒ (3 + 5) * 3 ⇒ 3 * 3 + 5 * 3 ⇒ 9 + 5 * 3 ⇒ 9 + 15 ⇒ 24
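
The order-independence can be illustrated mechanically. The following sketch (an illustration, not a full term rewriting implementation) reduces an expression tree by contracting one innermost redex per step, chosen either from the left or from the right; both strategies reach the same normal form:

import operator

OPS = {"+": operator.add, "*": operator.mul}

def step(expr, leftmost=True):
    """Contract one innermost redex; plain numbers are normal forms."""
    if isinstance(expr, int):
        return expr
    op, l, r = expr
    if isinstance(l, int) and isinstance(r, int):
        return OPS[op](l, r)            # both arguments are numbers: a redex
    kids = [l, r]
    for i in ((0, 1) if leftmost else (1, 0)):
        new = step(kids[i], leftmost)
        if new != kids[i]:
            kids[i] = new
            return (op, kids[0], kids[1])
    return expr

def normalize(expr, leftmost=True):
    while not isinstance(expr, int):
        expr = step(expr, leftmost)
    return expr

e = ("*", ("+", 3, 5), ("+", 1, 2))
print(normalize(e, leftmost=True), normalize(e, leftmost=False))   # 24 24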

Examples of non-normalizing systems (neither weakly nor strongly normalizing) include counting to infinity (1 ⇒ 2 ⇒ 3 ⇒ ...) and loops such as the transformation function of the Collatz conjecture (1 ⇒ 4 ⇒ 2 ⇒ 1 ⇒ ...; it is an open problem whether there are any other loops of the Collatz transformation). [7] Another example is the single-rule system { r(x,y) → r(y,x) }, which has no normalizing properties, since from any term, e.g. r(4,2), a single rewrite sequence starts, viz. r(4,2) → r(2,4) → r(4,2) → r(2,4) → ..., which is infinitely long. This leads to the idea of rewriting "modulo commutativity", where a term is in normal form if no rules but commutativity apply. [8]
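
The Collatz loop can be observed directly (a minimal sketch; the function name is illustrative):

def collatz(n):
    """One step of the Collatz transformation."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

# Starting from 1, rewriting never stops: 1 ⇒ 4 ⇒ 2 ⇒ 1 ⇒ ...
seq, n = [1], 1
for _ in range(6):
    n = collatz(n)
    seq.append(n)
print(" ⇒ ".join(map(str, seq)))    # 1 ⇒ 4 ⇒ 2 ⇒ 1 ⇒ 4 ⇒ 2 ⇒ 1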

[Figure: weakly but not strongly normalizing rewrite system]

The system {b→a, b→c, c→b, c→d} (pictured) is an example of a weakly normalizing but not strongly normalizing system. a and d are normal forms, and b and c can each be reduced to a or d, but the infinite reduction b → c → b → c → ... means that neither b nor c is strongly normalizing.
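
A self-contained check of this system (names illustrative):

# The pictured system: b → a, b → c, c → b, c → d.
succ = {"a": set(), "b": {"a", "c"}, "c": {"b", "d"}, "d": set()}

def reaches_normal_form(x, seen=frozenset()):
    """True iff some rewrite sequence from x ends in a normal form."""
    if not succ[x]:
        return True                     # x itself is a normal form
    return any(reaches_normal_form(y, seen | {x})
               for y in succ[x] if y not in seen)

print(all(reaches_normal_form(x) for x in succ))   # True: WN holds
# But SN fails: b → c → b is a cycle, giving the infinite sequence
# b → c → b → c → ...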

Untyped lambda calculus

The pure untyped lambda calculus does not satisfy the strong normalization property, and not even the weak normalization property. Consider the term λx.xxx (application is left-associative). It has the following rewrite rule: for any term t,

(λx.xxx) t → t t t

But consider what happens when we apply λx.xxx to itself:

(λx.xxx)(λx.xxx) → (λx.xxx)(λx.xxx)(λx.xxx) → (λx.xxx)(λx.xxx)(λx.xxx)(λx.xxx) → ...

The term grows by one copy of λx.xxx at each step, so the reduction never terminates: (λx.xxx)(λx.xxx) is not strongly normalizing. Since each term in the sequence contains exactly one redex, this is the only reduction sequence, hence the term is not weakly normalizing either.
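
Because each term in this sequence is just a left-associated stack of copies of λx.xxx, the divergence can be illustrated with a toy sketch that tracks only the number of copies (the representation and names here are purely illustrative):

def show(n):
    """Render n left-associated copies of the term λx.xxx."""
    return " ".join(["(λx.xxx)"] * n)

# Each beta step rewrites the head redex (λx.xxx)(λx.xxx) to
# (λx.xxx)(λx.xxx)(λx.xxx), so a stack of n copies becomes n + 1 copies.
n = 2
for _ in range(3):
    print(show(n), "→")
    n += 1
print(show(n), "→ ...")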

Typed lambda calculus

Various systems of typed lambda calculus, including the simply typed lambda calculus, Jean-Yves Girard's System F, and Thierry Coquand's calculus of constructions, are strongly normalizing.

A lambda calculus system with the normalization property can be viewed as a programming language in which every program terminates. Although this is a very useful property, it has a drawback: a programming language with the normalization property cannot be Turing complete, since otherwise one could solve the halting problem by checking whether a program type-checks (every well-typed program terminates). This means that there are computable functions that cannot be defined in the simply typed lambda calculus, and similarly for the calculus of constructions and System F. A typical example is a self-interpreter in a total programming language. [10]

References

1. Franz Baader; Tobias Nipkow (1998). Term Rewriting and All That. Cambridge University Press. ISBN 9780521779203.
2. Ohlebusch, Enno (1998). "Church-Rosser theorems for abstract reduction modulo an equivalence relation". Rewriting Techniques and Applications. Lecture Notes in Computer Science. Vol. 1379. p. 18. doi:10.1007/BFb0052358. ISBN 978-3-540-64301-2.
3. Klop, J.W.; de Vrijer, R.C. (February 1989). "Unique normal forms for lambda calculus with surjective pairing". Information and Computation. 80 (2): 97–113. doi:10.1016/0890-5401(89)90014-X.
4. "What is the difference between strong normalization and weak normalization in the context of rewrite systems?". Computer Science Stack Exchange. Retrieved 12 September 2021.
5. Ohlebusch, Enno (17 April 2013). Advanced Topics in Term Rewriting. Springer Science & Business Media. pp. 13–14. ISBN 978-1-4757-3661-8.
6. Terese (2003). Term Rewriting Systems. Cambridge, UK: Cambridge University Press. p. 1. ISBN 0-521-39115-6.
7. Terese (2003). Term Rewriting Systems. Cambridge, UK: Cambridge University Press. p. 2. ISBN 0-521-39115-6.
8. Dershowitz, Nachum; Jouannaud, Jean-Pierre (1990). "6. Rewrite Systems". In Jan van Leeuwen (ed.). Handbook of Theoretical Computer Science. Vol. B. Elsevier. pp. 9–10. CiteSeerX 10.1.1.64.3114. ISBN 0-444-88074-7.
9. Dershowitz, N.; Jouannaud, J.-P. (1990). "Rewrite Systems". In Jan van Leeuwen (ed.). Formal Models and Semantics. Handbook of Theoretical Computer Science. Vol. B. Elsevier. p. 268. ISBN 0-444-88074-7.
10. Riolo, Rick; Worzel, William P.; Kotanchek, Mark (4 June 2015). Genetic Programming Theory and Practice XII. Springer. p. 59. ISBN 978-3-319-16030-6. Retrieved 8 September 2021.