Graph reduction

In computer science, graph reduction implements an efficient version of non-strict evaluation, an evaluation strategy where the arguments to a function are not immediately evaluated. This form of non-strict evaluation is also known as lazy evaluation and used in functional programming languages. The technique was first developed by Chris Wadsworth in 1971.
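
For example, in a non-strict language an argument that is never used is never evaluated. A minimal Haskell illustration (the function name constFive is invented for this example):

    -- The argument is never inspected, so under non-strict evaluation
    -- it is never reduced; the call below therefore does not fail.
    constFive :: Int -> Int
    constFive _ = 5

    main :: IO ()
    main = print (constFive (error "never evaluated"))   -- prints 5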

Motivation

A simple example of evaluating an arithmetic expression follows:

    ((2 + 2) + (2 + 2)) + (3 + 3)
    = (4 + (2 + 2)) + (3 + 3)
    = (4 + 4) + (3 + 3)
    = 8 + (3 + 3)
    = 8 + 6
    = 14

The above reduction sequence employs a strategy known as outermost tree reduction. The same expression can be evaluated using innermost tree reduction, yielding the reduction sequence:

    ((2 + 2) + (2 + 2)) + (3 + 3)
    = ((2 + 2) + (2 + 2)) + 6
    = ((2 + 2) + 4) + 6
    = (4 + 4) + 6
    = 8 + 6
    = 14

Notice that the reduction order is made explicit by the addition of parentheses. This expression could also have been simply evaluated right to left, because addition is an associative operation.

Represented as a tree, the expression above looks like this:

[Figure: the expression represented as a tree]

This is where the term tree reduction comes from. When represented as a tree, we can think of innermost reduction as working from the bottom up, while outermost works from the top down.
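
As a sketch in Haskell, innermost tree reduction corresponds to an evaluator that reduces a node only after both of its children have been reduced; the Expr type and evalInnermost function below are invented for this illustration, not taken from any particular implementation:

    -- An expression tree for the arithmetic example above.
    data Expr = Lit Int
              | Add Expr Expr

    -- Innermost (bottom-up) tree reduction: both children are reduced
    -- to numbers before the parent node is reduced.
    evalInnermost :: Expr -> Int
    evalInnermost (Lit n)   = n
    evalInnermost (Add l r) = evalInnermost l + evalInnermost r

    -- ((2 + 2) + (2 + 2)) + (3 + 3) as a tree with no sharing, so the
    -- sub-tree for 2 + 2 appears twice.
    example :: Expr
    example = Add (Add (Add (Lit 2) (Lit 2)) (Add (Lit 2) (Lit 2)))
                  (Add (Lit 3) (Lit 3))

    main :: IO ()
    main = print (evalInnermost example)   -- prints 14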

The expression can also be represented as a directed acyclic graph, allowing sub-expressions to be shared:

[Figure: the expression represented as a directed acyclic graph, with the shared sub-expression appearing only once]

As with trees, outermost and innermost reduction also apply to graphs; hence the term graph reduction.
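
In a lazy language such as Haskell this kind of sharing arises naturally from let-bindings: the bound expression becomes a single graph node (a thunk) that every use points to. A small sketch, using Debug.Trace only to make the single reduction of the shared node visible:

    import Debug.Trace (trace)

    -- The let-bound expression is a single shared node.  Although it is
    -- used twice below, the trace message is printed only once, because
    -- the first use overwrites the node with its value.
    main :: IO ()
    main = do
      let twoPlusTwo = trace "reducing 2 + 2" (2 + 2) :: Int
      print ((twoPlusTwo + twoPlusTwo) + (3 + 3))   -- prints 14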

Now evaluation with outermost graph reduction can proceed as follows:

[Figure: outermost graph reduction of the expression]

Notice that evaluation now only requires four steps. Outermost graph reduction is referred to as lazy evaluation and innermost graph reduction is referred to as eager evaluation.
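
The effect of sharing on the number of reduction steps can be sketched with an explicitly mutable graph in which reducing a node overwrites it in place, so every parent that shares the node sees the already-reduced result. The Node type, reduce function and step counter below are illustrative only, not a description of any particular implementation:

    import Data.IORef

    -- A graph node: either an already-reduced literal or the addition
    -- of two (possibly shared) sub-nodes.
    data Node = Lit Int | Add (IORef Node) (IORef Node)

    -- Reduce a node to a number on demand, counting reductions, and
    -- overwrite the node in place so that sharing parents reuse it.
    reduce :: IORef Int -> IORef Node -> IO Int
    reduce steps ref = do
      node <- readIORef ref
      case node of
        Lit n   -> return n
        Add l r -> do
          x <- reduce steps l
          y <- reduce steps r
          modifyIORef' steps (+ 1)        -- one reduction of this node
          writeIORef ref (Lit (x + y))    -- update the graph in place
          return (x + y)

    main :: IO ()
    main = do
      -- Build ((2 + 2) + (2 + 2)) + (3 + 3) with the node for 2 + 2 shared.
      two        <- newIORef (Lit 2)
      three      <- newIORef (Lit 3)
      twoPlusTwo <- newIORef (Add two two)               -- shared node
      left       <- newIORef (Add twoPlusTwo twoPlusTwo)
      right      <- newIORef (Add three three)
      root       <- newIORef (Add left right)
      steps      <- newIORef (0 :: Int)
      result     <- reduce steps root
      count      <- readIORef steps
      putStrLn ("result = " ++ show result ++ ", reductions = " ++ show count)
      -- prints: result = 14, reductions = 4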

Combinator graph reduction

Combinator graph reduction is a fundamental implementation technique for functional programming languages, in which a program is converted into a combinator representation that is mapped to a directed graph data structure in computer memory. Program execution then consists of rewriting parts of this graph ("reducing" it) so as to move towards useful results.
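
The following Haskell sketch shows the idea for the combinators S, K and I, repeatedly rewriting the outermost redex of a term; the Term type and whnf function are invented for this illustration, and a real combinator-graph reducer performs the same rewrites on a shared graph in memory, overwriting nodes with their results:

    -- Terms of a tiny combinator language: the combinators S, K and I,
    -- integer literals, and application.
    data Term = S | K | I | Lit Int | App Term Term
      deriving Show

    -- Rewrite the outermost redex until the term is in weak head
    -- normal form.
    whnf :: Term -> Term
    whnf (App fun arg) =
      case whnf fun of
        I               -> whnf arg                            -- I x     ->  x
        App K a         -> whnf a                              -- K a b   ->  a
        App (App S f) g -> whnf (App (App f arg) (App g arg))  -- S f g x ->  f x (g x)
        other           -> App other arg
    whnf t = t

    main :: IO ()
    main = print (whnf (App (App (App S K) K) (Lit 42)))   -- S K K x = x, so: Lit 42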

History

The concept of graph reduction, which allows evaluated values to be shared, was first developed by Chris Wadsworth in his 1971 Ph.D. dissertation.[1] The dissertation was cited by Peter Henderson and James H. Morris Jr. in their 1976 paper "A Lazy Evaluator",[2] which introduced the notion of lazy evaluation. In 1976 David Turner incorporated lazy evaluation into SASL using combinators.[3] SASL was an early functional programming language first developed by Turner in 1972.

Notes

  1. Hudak, Paul (September 1989). "Conception, evolution, and application of functional programming languages". ACM Computing Surveys. 21 (3): 359–411. CiteSeerX 10.1.1.83.6505. doi:10.1145/72551.72554.
  2. Henderson, Peter; Morris, James H., Jr. (1976). "A Lazy Evaluator". Proceedings of the 3rd ACM Symposium on Principles of Programming Languages (POPL '76).
  3. Hudak, Paul; Hughes, John; Peyton Jones, Simon; Wadler, Philip (2007). "A History of Haskell: Being Lazy with Class". Proceedings of the Third ACM SIGPLAN History of Programming Languages Conference (HOPL III).

