Continuation

In computer science, a continuation is an abstract representation of the control state of a computer program. A continuation implements (reifies) the program control state, i.e. the continuation is a data structure that represents the computational process at a given point in the process's execution; the created data structure can be accessed by the programming language, instead of being hidden in the runtime environment. Continuations are useful for encoding other control mechanisms in programming languages such as exceptions, generators, coroutines, and so on.

The "current continuation" or "continuation of the computation step" is the continuation that, from the perspective of running code, would be derived from the current point in a program's execution. The term continuations can also be used to refer to first-class continuations, which are constructs that give a programming language the ability to save the execution state at any point and return to that point at a later point in the program, possibly multiple times.

History

The earliest description of continuations was made by Adriaan van Wijngaarden in September 1964, at the IFIP Working Conference on Formal Language Description Languages held in Baden bei Wien, Austria. As part of a formulation for an Algol 60 preprocessor, he called for a transformation of proper procedures into continuation-passing style, [1] though he did not use this name; his intention was to simplify a program and thus make its result clearer.

Christopher Strachey, Christopher P. Wadsworth and John C. Reynolds brought the term continuation into prominence in their work in the field of denotational semantics that makes extensive use of continuations to allow sequential programs to be analysed in terms of functional programming semantics. [1]

Steve Russell [2] invented the continuation in his second Lisp implementation for the IBM 704, though he did not name it. [3]

Reynolds (1993) gives a complete history of the discovery of continuations.

First-class continuations

First-class continuations are a language's ability to completely control the execution order of instructions. They can be used to jump to a function that produced the call to the current function, or to a function that has previously exited. One can think of a first-class continuation as saving the execution state of the program. It is important to note that true first-class continuations do not save program data – unlike a process image – only the execution context. This is illustrated by the "continuation sandwich" description:

Say you're in the kitchen in front of the refrigerator, thinking about a sandwich. You take a continuation right there and stick it in your pocket. Then you get some turkey and bread out of the refrigerator and make yourself a sandwich, which is now sitting on the counter. You invoke the continuation in your pocket, and you find yourself standing in front of the refrigerator again, thinking about a sandwich. But fortunately, there's a sandwich on the counter, and all the materials used to make it are gone. So you eat it. :-) [4]

In this description, the sandwich is part of the program data (e.g., an object on the heap), and rather than calling a "make sandwich" routine and then returning, the person called a "make sandwich with current continuation" routine, which creates the sandwich and then continues where execution left off.

Scheme was the first full production system providing first "catch" [1] and then call/cc. Bruce Duba introduced call/cc into SML.

Continuations are also used in models of computation including denotational semantics, the actor model, process calculi, and lambda calculus. These models rely on programmers or semantics engineers to write mathematical functions in the so-called continuation-passing style. This means that each function consumes a function that represents the rest of the computation relative to this function call. To return a value, the function calls this "continuation function" with the return value; to abort the computation, it returns a value directly without invoking the continuation.
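
A minimal sketch of this calling convention in Scheme (the procedure names pyth/k and safe-div/k are illustrative, not from any library): each procedure takes an extra argument k, "returns" by calling k, and aborts by returning without calling k.

; To "return" the hypotenuse, pyth/k calls its continuation k with the value.
(define (pyth/k x y k)
  (k (sqrt (+ (* x x) (* y y)))))

; To abort, a CPS procedure returns a value directly, bypassing k
; and therefore the rest of the computation.
(define (safe-div/k n d k)
  (if (zero? d)
      'division-by-zero                            ; abort: k is never called
      (k (/ n d))))

(pyth/k 3 4 display)                               ; prints 5
(display (safe-div/k 6 3 (lambda (q) (* q 10))))   ; prints 20
(display (safe-div/k 6 0 (lambda (q) (* q 10))))   ; prints division-by-zero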

Functional programmers who write their programs in continuation-passing style gain the expressive power to manipulate the flow of control in arbitrary ways. The cost is that they must maintain the invariants of control and continuations by hand, which can be a highly complex undertaking (but see 'continuation-passing style' below).

Uses

Continuations simplify and clarify the implementation of several common design patterns, including coroutines/green threads and exception handling, by providing the basic, low-level primitive which unifies these seemingly unconnected patterns. Continuations can provide elegant solutions to some difficult high-level problems, like programming a web server that supports multiple pages, accessed by the use of the forward and back buttons and by following links. The Smalltalk Seaside web framework uses continuations to great effect, allowing one to program the web server in procedural style, by switching continuations when switching pages.

More complex constructs for which "continuations provide an elegant description" [1] also exist. For example, in C, longjmp can be used to jump from the middle of one function to another, provided the second function lies deeper in the stack (if it is waiting for the first function to return, possibly among others). Other more complex examples include coroutines in Simula 67, Lua, and Perl; tasklets in Stackless Python; generators in Icon and Python; continuations in Scala (starting in 2.8); fibers in Ruby (starting in 1.9.1); the backtracking mechanism in Prolog; monads in functional programming; and threads.
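
As one illustration of how continuations can describe such constructs, a generator can be built directly from call/cc in Scheme. The following is a sketch of a well-known pattern; the names generate-one-element-at-a-time and control-state are illustrative.

; Each call to the returned procedure yields the next element of the list.
(define (generate-one-element-at-a-time lst)
  ; control-state walks the list; it is replaced at every suspension point
  ; so that the next call resumes inside the loop.
  (define (control-state return)
    (for-each
     (lambda (element)
       (set! return
             (call/cc
              (lambda (resume-here)
                (set! control-state resume-here)   ; remember where to resume
                (return element)))))               ; hand one element back
     lst)
    (return 'you-fell-off-the-end))
  ; The generator captures its caller's continuation and runs control-state.
  (lambda () (call/cc control-state)))

(define generate-digit (generate-one-element-at-a-time '(0 1 2)))
(generate-digit)   ; => 0
(generate-digit)   ; => 1
(generate-digit)   ; => 2
(generate-digit)   ; => you-fell-off-the-end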

Examples

The Scheme programming language includes the control operator call-with-current-continuation (abbreviated as: call/cc) with which a Scheme program can manipulate the flow of control:

(define the-continuation #f)

(define (test)
  (let ((i 0))
    ; call/cc calls its first function argument, passing
    ; a continuation variable representing this point in
    ; the program as the argument to that function.
    ;
    ; In this case, the function argument assigns that
    ; continuation to the variable the-continuation.
    ;
    (call/cc (lambda (k) (set! the-continuation k)))
    ;
    ; The next time the-continuation is called, we start here.
    (set! i (+ i 1))
    i))

The code above defines a function test that sets the-continuation to the future execution state of itself. The following transcript shows how that saved state can be re-entered:

>(test)1>(the-continuation)2>(the-continuation)3>; stores the current continuation (which will print 4 next) away>(defineanother-continuationthe-continuation)>(test); resets the-continuation1>(the-continuation)2>(another-continuation); uses the previously stored continuation4

For a gentler introduction to this mechanism, see call-with-current-continuation.

Coroutines

This example shows a possible usage of continuations to implement coroutines as separate threads. [5]

;;; A naive queue for thread scheduling.
;;; It holds a list of continuations "waiting to run".

(define *queue* '())

(define (empty-queue?)
  (null? *queue*))

(define (enqueue x)
  (set! *queue* (append *queue* (list x))))

(define (dequeue)
  (let ((x (car *queue*)))
    (set! *queue* (cdr *queue*))
    x))

;;; This starts a new thread running (proc).

(define (fork proc)
  (call/cc
   (lambda (k)
     (enqueue k)
     (proc))))

;;; This yields the processor to another thread, if there is one.

(define (yield)
  (call/cc
   (lambda (k)
     (enqueue k)
     ((dequeue)))))

;;; This terminates the current thread, or the entire program
;;; if there are no other threads left.

(define (thread-exit)
  (if (empty-queue?)
      (exit)
      ((dequeue))))

The functions defined above allow for defining and executing threads through cooperative multitasking, i.e. threads that yield control to the next one in a queue:

;;; The body of some typical Scheme thread that does stuff:

(define (do-stuff-n-print str)
  (lambda ()
    (let loop ((n 0))
      (format #t "~A ~A\n" str n)
      (yield)
      (loop (+ n 1)))))

;;; Create two threads, and start them running.
(fork (do-stuff-n-print "This is AAA"))
(fork (do-stuff-n-print "Hello from BBB"))
(thread-exit)

The previous code will produce this output:

This is AAA 0
Hello from BBB 0
This is AAA 1
Hello from BBB 1
This is AAA 2
Hello from BBB 2
...

Implementation

A program must allocate space in memory for the variables its functions use. Most programming languages use a call stack for storing the variables needed because it allows for fast and simple allocating and automatic deallocation of memory. Other programming languages use a heap for this, which allows for flexibility at a higher cost for allocating and deallocating memory. Both of these implementations have benefits and drawbacks in the context of continuations. [6]

Programming language support

Many programming languages exhibit first-class continuations under various names.

In any language which supports closures and proper tail calls, it is possible to write programs in continuation-passing style and manually implement call/cc. (In continuation-passing style, call/cc becomes a simple function that can be written with lambda.) This is a particularly common strategy in Haskell, where it is easy to construct a "continuation-passing monad" (for example, the Cont monad and ContT monad transformer in the mtl library). The support for proper tail calls is needed because in continuation-passing style no function ever returns; all calls are tail calls.
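
A rough sketch of this idea in Scheme, assuming closures and proper tail calls (the names add1/k, call/cc-cps, and example are illustrative): in continuation-passing style no special support is needed, because call/cc is just a function that hands a reified copy of the current continuation to its argument.

; In CPS every procedure takes an extra argument k (its continuation)
; and calls k instead of returning.
(define (add1/k x k) (k (+ x 1)))       ; direct style: (define (add1 x) (+ x 1))

; call/cc written as a plain lambda: it passes f a reified copy of the
; current continuation k; invoking that copy discards whatever continuation
; it is given and jumps back to k.
(define call/cc-cps
  (lambda (f k)
    (f (lambda (v ignored-k) (k v)) k)))

; Example: escape from the middle of a computation.
; Direct-style equivalent: (+ 10 (call/cc (lambda (exit) (exit 1))))
(define (example k)
  (call/cc-cps
   (lambda (exit k2) (exit 1 k2))       ; calling exit abandons k2
   (lambda (v) (k (+ 10 v)))))

(example display)                       ; prints 11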

In Web development

One area that has seen practical use of continuations is in Web programming. [7] [8] The use of continuations shields the programmer from the stateless nature of the HTTP protocol. In the traditional model of web programming, the lack of state is reflected in the program's structure, leading to code constructed around a model that lends itself very poorly to expressing computational problems. Thus continuations enable code that has the useful properties associated with inversion of control, while avoiding its problems. "Inverting back the inversion of control or, Continuations versus page-centric programming" [9] is a paper that provides a good introduction to continuations applied to web programming.

Kinds

Support for continuations varies widely. A programming language supports re-invocable continuations if a continuation may be invoked repeatedly (even after it has already returned). Re-invocable continuations were introduced by Peter J. Landin using his J (for Jump) operator that could transfer the flow of control back into the middle of a procedure invocation. Re-invocable continuations have also been called "re-entrant" in the Racket language. However this use of the term "re-entrant" can be easily confused with its use in discussions of multithreading.

A more limited kind is the escape continuation that may be used to escape the current context to a surrounding one. Many languages which do not explicitly support continuations support exception handling, which is equivalent to escape continuations and can be used for the same purposes. C's setjmp/longjmp are also equivalent: they can only be used to unwind the stack. Escape continuations can also be used to implement tail call elimination.
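
A minimal sketch in Scheme of call/cc used purely as an escape continuation, much as an exception would be used; the procedure name find-first is illustrative, not a standard procedure.

; find-first scans a list and escapes as soon as a match is found, by
; invoking the continuation "return" captured at the call to find-first.
(define (find-first pred lst)
  (call/cc
   (lambda (return)
     (for-each
      (lambda (x)
        (if (pred x)
            (return x)))   ; escape: unwind straight back to the caller
      lst)
     'not-found)))         ; normal return when nothing matched

(display (find-first even? '(1 3 4 5)))   ; prints 4
(display (find-first even? '(1 3 5)))     ; prints not-found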

One generalization of continuations are delimited continuations. Continuation operators like call/cc capture the entire remaining computation at a given point in the program and provide no way of delimiting this capture. Delimited continuation operators address this by providing two separate control mechanisms: a prompt that delimits a continuation operation and a reification operator such as shift or control. Continuations captured using delimited operators thus only represent a slice of the program context.
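
As an illustrative sketch, in a Scheme that provides reset and shift (for example Racket, via its racket/control library), the captured continuation reaches only up to the enclosing prompt and behaves like an ordinary, composable function.

; Assumes reset and shift are available, e.g. (require racket/control) in Racket.
; reset installs the prompt; shift captures the continuation only up to it.
(display (reset (+ 1 (shift k (k (k 5))))))
; k is bound to the delimited slice (lambda (v) (+ 1 v)); calling it twice
; yields (+ 1 (+ 1 5)) = 7, so this prints 7.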

Disadvantages

Continuations are the functional expression of the GOTO statement, and the same caveats apply. [10] While they are a sensible option in some special cases such as web programming, use of continuations can result in code that is difficult to follow. In fact, the esoteric programming language Unlambda includes call-with-current-continuation as one of its features solely because expressions involving it "tend to be hopelessly difficult to track down." [11] The external links below illustrate the concept in more detail.

Linguistics

In "Continuations and the nature of quantification", Chris Barker introduced the "continuation hypothesis", that

some linguistic expressions (in particular, QNPs [quantificational noun phrases]) have denotations that manipulate their own continuations. [12]

Barker argued that this hypothesis could be used to explain phenomena such as duality of NP meaning (e.g., the fact that the QNP "everyone" behaves very differently from the non-quantificational noun phrase "Bob" in contributing towards the meaning of a sentence like "Alice sees [Bob/everyone]"), scope displacement (e.g., that "a raindrop fell on every car" is typically interpreted as ∀x.(car(x) → ∃y.(raindrop(y) ∧ fell-on(y, x))) rather than as ∃y.(raindrop(y) ∧ ∀x.(car(x) → fell-on(y, x)))), and scope ambiguity (that a sentence like "someone saw everyone" may be ambiguous between ∃x.∀y.saw(x, y) and ∀y.∃x.saw(x, y)). He also observed that this idea is in a way just a natural extension of Richard Montague's approach in "The Proper Treatment of Quantification in Ordinary English" (PTQ), writing that "with the benefit of hindsight, a limited form of continuation-passing is clearly discernible at the core of Montague's (1973) PTQ treatment of NPs as generalized quantifiers".

The extent to which continuations can be used to explain other general phenomena in natural language is a topic of current research. [13]

See also


References

  1. Reynolds 1993
  2. S.R. Russell noticed that eval could serve as an interpreter for LISP, promptly hand coded it, and we now had a programming language with an interpreter. —John McCarthy, History of LISP
  3. "Steve "Slug" Russell". Computer History.
  4. Palmer, Luke (June 29, 2004). "undo()? ("continuation sandwich" example)". perl.perl6.language (newsgroup). Retrieved 2009-10-04.
  5. Haynes, C. T., Friedman, D. P., and Wand, M. 1984. Continuations and coroutines. In Proceedings of the 1984 ACM Symposium on LISP and Functional Programming (Austin, Texas, United States, August 06–08, 1984). LFP '84. ACM, New York, NY, 293-298.
  6. "Call with current continuation for C programmers". Community-Scheme-Wiki. 12 October 2008.
  7. "Reading list on XML and Web Programming". Archived from the original on 2010-06-14. Retrieved 2006-08-03.
  8. "Web Programming with Continuations" (PDF). Archived from the original (PDF) on 2012-09-05. Retrieved 2012-09-05.
  9. Christian Queinnec (2003). Inverting back the inversion of control or, Continuations versus page-centric programming.
  10. Quigley, John (September 2007). "Computational Continuations" (PDF). p. 38.
  11. Madore, David. "The Unlambda Programming Language". www.madore.org. Retrieved 19 June 2021.
  12. Chris Barker, "Continuations and the nature of quantification", Natural Language Semantics 10 (2002): 211–242.
  13. See for example Chris Barker, Continuations in Natural Language (Continuations Workshop 2004), or Chung-chieh Shan, Linguistic Side Effects (in Direct Compositionality, ed. Chris Barker and Pauline Jacobson, pp. 132–163, Oxford University Press, 2007).

Further reading