In computer science, graph reduction implements an efficient version of non-strict evaluation, an evaluation strategy where the arguments to a function are not immediately evaluated. This form of non-strict evaluation is also known as lazy evaluation and is used in functional programming languages. The technique was first developed by Chris Wadsworth in 1971.
A simple example of evaluating an arithmetic expression follows:
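As a concrete illustration (the expression below is chosen here for that purpose; any arithmetic expression with a repeated sub-expression would do), consider an expression built only from additions in which the sub-expression (2 + 2) occurs twice. One possible outermost reduction sequence evaluates the operands demanded by the top-level addition, working from the root of the expression downwards, and takes five steps:

      ((2 + 2) + (2 + 2)) + (3 + 3)
    → (4 + (2 + 2)) + (3 + 3)
    → (4 + 4) + (3 + 3)
    → 8 + (3 + 3)
    → 8 + 6
    → 14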
The above reduction sequence employs a strategy known as outermost tree reduction. The same expression can be evaluated using innermost tree reduction, yielding the reduction sequence:
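With the same illustrative expression, one possible innermost reduction sequence, here taking the innermost redexes from right to left, also takes five steps:

      ((2 + 2) + (2 + 2)) + (3 + 3)
    → ((2 + 2) + (2 + 2)) + 6
    → ((2 + 2) + 4) + 6
    → (4 + 4) + 6
    → 8 + 6
    → 14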
Notice that the reduction order is made explicit by the addition of parentheses. This expression could also have been simply evaluated right to left, because addition is an associative operation.
Represented as a tree, the expression above looks like this:
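For the illustrative expression above, the tree can be drawn as follows; note that the sub-tree for (2 + 2) appears twice:

              +
            /   \
           +     +
          / \   / \
         +   + 3   3
        / \ / \
       2  2 2  2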
This is where the term tree reduction comes from. When represented as a tree, we can think of innermost reduction as working from the bottom up, while outermost works from the top down.
The expression can also be represented as a directed acyclic graph, allowing sub-expressions to be shared:
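For the same illustrative expression, the directed acyclic graph can be described by listing each node with its two operands (drawing shared edges in plain text is awkward); the node labelled c, which represents the repeated sub-expression (2 + 2), is referenced twice:

    a: +  (b, d)    the root node
    b: +  (c, c)    both operands refer to the same node c
    c: +  (2, 2)    the shared sub-expression (2 + 2)
    d: +  (3, 3)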
As with trees, outermost and innermost reduction also apply to graphs; hence the term graph reduction.
Now evaluation with outermost graph reduction can proceed as follows:
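Continuing the illustration and writing the graph as (c + c) + (3 + 3), where c names the shared node 2 + 2, one possible sequence is:

      (c + c) + (3 + 3)    where c = 2 + 2
    → (4 + 4) + (3 + 3)    the shared node c is reduced once; both references see the result
    → 8 + (3 + 3)
    → 8 + 6
    → 14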
Notice that evaluation now only requires four steps. Outermost graph reduction is referred to as lazy evaluation and innermost graph reduction is referred to as eager evaluation.
Combinator graph reduction is a fundamental implementation technique for functional programming languages, in which a program is converted into a combinator representation that is mapped to a directed graph data structure in computer memory. Program execution then consists of rewriting parts of this graph ("reducing" it) so as to move towards useful results.
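As a rough sketch of the idea, and not a description of any particular implementation, the Haskell fragment below represents a combinator graph with mutable application nodes and rewrites S-, K- and I-redexes in place, so that a node shared by several parts of the graph is reduced at most once; all of the names in it are illustrative.

import Data.IORef

-- A combinator graph: leaves are combinators, interior nodes are function
-- applications whose contents are mutable so that a redex can be overwritten
-- ("reduced") in place, which is what makes sharing pay off.
data Node = S | K | I | App (IORef Node) (IORef Node)

leaf :: Node -> IO (IORef Node)
leaf = newIORef

app :: IORef Node -> IORef Node -> IO (IORef Node)
app f x = newIORef (App f x)

-- Reduce a node to weak head normal form: unwind the spine of applications,
-- rewrite the topmost redex in place, and start again from the root.
reduce :: IORef Node -> IO ()
reduce root = spine root []
  where
    spine r stack = do
      n <- readIORef r
      case n of
        App f _ -> spine f (r : stack)            -- unwind the spine
        I | (a1 : _) <- stack -> do               -- I x      =>  x
              App _ x <- readIORef a1
              readIORef x >>= writeIORef a1       -- a real reducer would install an indirection node here
              reduce root
        K | (a1 : a2 : _) <- stack -> do          -- K x y    =>  x
              App _ x <- readIORef a1
              readIORef x >>= writeIORef a2
              reduce root
        S | (a1 : a2 : a3 : _) <- stack -> do     -- S f g x  =>  (f x) (g x)
              App _ f <- readIORef a1
              App _ g <- readIORef a2
              App _ x <- readIORef a3
              fx <- newIORef (App f x)
              gx <- newIORef (App g x)
              writeIORef a3 (App fx gx)
              reduce root
        _ -> return ()                            -- already in weak head normal form

-- Render a graph back into a string (ignoring sharing, for readability).
render :: IORef Node -> IO String
render r = do
  n <- readIORef r
  case n of
    S -> return "S"
    K -> return "K"
    I -> return "I"
    App f x -> do
      fs <- render f
      xs <- render x
      return ("(" ++ fs ++ " " ++ xs ++ ")")

main :: IO ()
main = do
  s <- leaf S; k1 <- leaf K; k2 <- leaf K; i <- leaf I
  e <- app s k1 >>= \sk -> app sk k2 >>= \skk -> app skk i   -- ((S K) K) I
  reduce e
  putStrLn =<< render e                                      -- prints "I"

Running main builds the graph for ((S K) K) I, reduces it, and prints I, since S K K behaves as the identity combinator.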
The concept of graph reduction that allows evaluated values to be shared was first developed by Chris Wadsworth in his 1971 Ph.D. dissertation. [1] The dissertation was cited by Peter Henderson and James H. Morris Jr. in their 1976 paper “A lazy evaluator”, [2] which introduced the notion of lazy evaluation. In 1976 David Turner incorporated lazy evaluation into SASL using combinators. [3] SASL was an early functional programming language first developed by Turner in 1972.
In computer science, functional programming is a programming paradigm where programs are constructed by applying and composing functions. It is a declarative programming paradigm in which function definitions are trees of expressions that map values to other values, rather than a sequence of imperative statements which update the running state of the program.
In programming language theory, lazy evaluation, or call-by-need, is an evaluation strategy which delays the evaluation of an expression until its value is needed and which also avoids repeated evaluations.
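For instance, in Haskell, whose evaluation strategy is call-by-need, the binding below is not evaluated until its value is first demanded, and then only once; the trace message is merely a device for making this visible:

import Debug.Trace (trace)

main :: IO ()
main = do
  let x = trace "evaluating x" (2 + 2 :: Int)  -- a thunk; nothing is computed yet
  print (x + x)  -- "evaluating x" is printed once, then 8: the result of x is shared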
Lambda calculus is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. Untyped lambda calculus is a universal model of computation that can be used to simulate any Turing machine. It was introduced by the mathematician Alonzo Church in the 1930s as part of his research into the foundations of mathematics; in 1936 he published the untyped portion relevant to computation, and in 1940 he introduced a logically consistent, simply typed version.
Combinatory logic is a notation to eliminate the need for quantified variables in mathematical logic. It was introduced by Moses Schönfinkel and Haskell Curry, and has more recently been used in computer science as a theoretical model of computation and also as a basis for the design of functional programming languages. It is based on combinators, which were introduced by Schönfinkel in 1920 with the idea of providing an analogous way to build up functions—and to remove any mention of variables—particularly in predicate logic. A combinator is a higher-order function that uses only function application and earlier defined combinators to define a result from its arguments.
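For example, the standard S and K combinators can be written as ordinary Haskell functions, after which a one-variable function such as λx. f x x can be expressed without mentioning its variable:

-- The S and K combinators as plain functions.
s :: (a -> b -> c) -> (a -> b) -> a -> c
s f g x = f x (g x)

k :: a -> b -> a
k x _ = x

-- The identity combinator I is definable as S K K.
i :: a -> a
i = s k k

-- λx. x + x, written without its bound variable as S (+) I.
double :: Int -> Int
double = s (+) i   -- double 5 == 10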
A graph reduction machine is a special-purpose computer built to perform combinator calculations by graph reduction.
In combinatory logic for computer science, a fixed-point combinator is a higher-order function that returns some fixed point of its argument function, if one exists.
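In Haskell, for example, a fixed-point combinator can be defined directly (the standard library provides one as Data.Function.fix), and it allows a recursive function to be obtained from a non-recursive one:

-- fix f is a fixed point of f, i.e. fix f = f (fix f).
fix :: (a -> a) -> a
fix f = let x = f x in x

-- Factorial obtained as the fixed point of a non-recursive functional.
factorial :: Integer -> Integer
factorial = fix (\rec n -> if n <= 1 then 1 else n * rec (n - 1))
-- factorial 5 == 120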
ISWIM is an abstract computer programming language devised by Peter Landin and first described in his article "The Next 700 Programming Languages", published in the Communications of the ACM in 1966.
A list comprehension is a syntactic construct available in some programming languages for creating a list based on existing lists. It follows the form of the mathematical set-builder notation as distinct from the use of map and filter functions.
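In Haskell, for example, the set-builder-style description "the squares of the odd elements of xs" is written directly as a list comprehension, and is equivalent to a combination of map and filter:

-- The squares of the odd elements of an existing list.
oddSquares :: [Int] -> [Int]
oddSquares xs = [ x * x | x <- xs, odd x ]

-- The same function expressed with map and filter.
oddSquares' :: [Int] -> [Int]
oddSquares' = map (\x -> x * x) . filter odd
-- oddSquares [1..5] == [1, 9, 25]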
In computer science, pattern matching is the act of checking a given sequence of tokens for the presence of the constituents of some pattern. In contrast to pattern recognition, the match usually has to be exact: "either it will or will not be a match." The patterns generally have the form of either sequences or tree structures. Uses of pattern matching include outputting the locations (if any) of a pattern within a token sequence, outputting some component of the matched pattern, and substituting the matching pattern with some other token sequence.
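In Haskell, for example, a function can be defined by matching its argument against patterns that follow the structure of the data, here the constructors of a list:

describe :: [Int] -> String
describe []          = "an empty list"
describe [x]         = "a single element: " ++ show x
describe (x : y : _) = "starts with " ++ show x ++ " and " ++ show y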
In computer programming, especially functional programming and type theory, an algebraic data type (ADT) is a kind of composite type, i.e., a type formed by combining other types.
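In Haskell, for example, an algebraic data type combines other types as a sum ("either ... or ...") of products ("... and ..."):

-- A Shape is either a circle (carrying a radius)
-- or a rectangle (carrying a width and a height).
data Shape
  = Circle Double
  | Rectangle Double Double

area :: Shape -> Double
area (Circle r)      = pi * r * r
area (Rectangle w h) = w * h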
Curry is a declarative programming language that implements the functional logic programming paradigm and is based on the Haskell language. It merges elements of functional and logic programming, including constraint programming integration.
The Glasgow Haskell Compiler (GHC) is a native or machine code compiler for the functional programming language Haskell. It provides a cross-platform software environment for writing and testing Haskell code and supports many extensions, libraries, and optimisations that streamline the process of generating and executing code. GHC is the most commonly used Haskell compiler. It is free and open-source software released under a BSD license.
David A. Turner was a British computer scientist. He is best known for designing and implementing three programming languages, including the first for functional programming based on lazy evaluation, combinator graph reduction, and polymorphic types: SASL (1972), Kent Recursive Calculator (KRC) (1981), and the commercially supported Miranda (1985).
The SKI combinator calculus is a combinatory logic system and a computational system. It can be thought of as a computer programming language, though it is not convenient for writing software. Instead, it is important in the mathematical theory of algorithms because it is an extremely simple Turing complete language. It can be likened to a reduced version of the untyped lambda calculus. It was introduced by Moses Schönfinkel and Haskell Curry.
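Its three combinators are characterised by the rewrite rules below; as the last line shows, I is not strictly necessary, since S K K already behaves as the identity:

    I x      →  x
    K x y    →  x
    S x y z  →  x z (y z)

    S K K x  →  K x (K x)  →  x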
Philip Lee Wadler is a UK-based American computer scientist known for his contributions to programming language design and type theory. He holds a Personal Chair in theoretical computer science at the Laboratory for Foundations of Computer Science at the School of Informatics, University of Edinburgh. He has contributed to the theory behind functional programming and the use of monads, and to the designs of the purely functional language Haskell and the XQuery declarative query language. In 1984, he created the Orwell language. Wadler was involved in adding generic types to Java 5.0. He is also the author of "Theorems for free!", a paper that gave rise to much research on functional language optimization.
In a programming language, an evaluation strategy is a set of rules for evaluating expressions. The term is often used to refer to the more specific notion of a parameter-passing strategy that defines the kind of value that is passed to the function for each parameter and whether to evaluate the parameters of a function call, and if so in what order. The notion of reduction strategy is distinct, although some authors conflate the two terms and the definition of each term is not widely agreed upon.
In computer programming, a parser combinator is a higher-order function that accepts several parsers as input and returns a new parser as its output. In this context, a parser is a function accepting strings as input and returning some structure as output, typically a parse tree or a set of indices representing locations in the string where parsing stopped successfully. Parser combinators enable a recursive descent parsing strategy that facilitates modular piecewise construction and testing. This parsing technique is called combinatory parsing.
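As a minimal sketch in Haskell, not modelled on any particular parsing library, a parser can be represented as a function from an input string to a list of possible (result, remaining input) pairs, and the instances below provide the combinators that build larger parsers from smaller ones:

import Control.Applicative (Alternative (..))
import Data.Char (isDigit)

-- A parser returns every way a prefix of the input can be parsed.
newtype Parser a = Parser { runParser :: String -> [(a, String)] }

instance Functor Parser where
  fmap f (Parser p) = Parser (\s -> [ (f a, rest) | (a, rest) <- p s ])

instance Applicative Parser where
  pure a = Parser (\s -> [(a, s)])
  Parser pf <*> Parser pa =
    Parser (\s -> [ (f a, s2) | (f, s1) <- pf s, (a, s2) <- pa s1 ])

instance Alternative Parser where
  empty = Parser (const [])
  Parser p <|> Parser q = Parser (\s -> p s ++ q s)

-- A primitive parser for a single character satisfying a predicate.
satisfy :: (Char -> Bool) -> Parser Char
satisfy ok = Parser p
  where
    p (c : cs) | ok c = [(c, cs)]
    p _               = []

-- A number is one or more digits; `some` comes from Alternative.
number :: Parser Int
number = read <$> some (satisfy isDigit)
-- head (runParser number "42abc") == (42, "abc")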
Haskell is a general-purpose, statically-typed, purely functional programming language with type inference and lazy evaluation. Designed for teaching, research, and industrial applications, Haskell has pioneered several programming language features such as type classes, which enable type-safe operator overloading, and monadic input/output (IO). It is named after logician Haskell Curry. Haskell's main implementation is the Glasgow Haskell Compiler (GHC).
In rewriting, a reduction strategy or rewriting strategy is a relation specifying a rewrite for each object or term, compatible with a given reduction relation. Some authors use the term to refer to an evaluation strategy.