Applicative programming language

In the classification of programming languages, an applicative programming language is built out of functions applied to arguments. Applicative languages are functional, and applicative is often used as a synonym for functional.[2] However, concatenative languages can be functional, while not being applicative.[3]
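
For illustration, a minimal Haskell sketch of the applicative style, in which every expression is a function applied to arguments (the function names here are invented for the example):

```haskell
-- Applicative style: every expression is a function applied to arguments.
sumOfSquares :: [Int] -> Int
sumOfSquares xs = sum (map square xs)
  where
    square x = x * x

main :: IO ()
main = print (sumOfSquares [1, 2, 3])   -- 14
```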

The semantics of applicative languages are based on beta reduction of terms, and side effects, such as the mutation of state, are not permitted.[4]
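
A small Haskell sketch of beta reduction (the name increment is chosen only for this example): applying a lambda abstraction substitutes the argument for the bound variable, and no state is mutated in the process.

```haskell
-- Beta reduction: (\x -> x + 1) 41  ==>  41 + 1  ==>  42
increment :: Int -> Int
increment = \x -> x + 1

main :: IO ()
main = print (increment 41)   -- 42, with no mutation of state
```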

Lisp and ML are applicative programming languages.

See also

Related Research Articles

In computer science, functional programming is a programming paradigm where programs are constructed by applying and composing functions. It is a declarative programming paradigm in which function definitions are trees of expressions that map values to other values, rather than a sequence of imperative statements which update the running state of the program.
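
A brief Haskell sketch of this style, with an illustrative function name: the program is an expression built by applying and composing functions rather than a sequence of state updates.

```haskell
import Data.Char (toUpper)

-- Built by composing two existing functions; no variables are updated.
shout :: String -> String
shout = map toUpper . reverse

main :: IO ()
main = putStrLn (shout "hello")   -- "OLLEH"
```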

Prolog is a logic programming language that has its origins in artificial intelligence, automated theorem proving and computational linguistics.

In computer science, denotational semantics is an approach to formalizing the meanings of programming languages by constructing mathematical objects that describe the meanings of expressions from the languages. Other approaches providing formal semantics of programming languages include axiomatic semantics and operational semantics.
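
As a toy illustration (the Expr type and the name denote are invented here), a denotational semantics for a tiny arithmetic language maps each expression, compositionally, to a mathematical object, in this case an integer:

```haskell
-- A tiny expression language and its semantic function.
data Expr
  = Lit Integer
  | Add Expr Expr
  | Mul Expr Expr

denote :: Expr -> Integer
denote (Lit n)   = n
denote (Add a b) = denote a + denote b
denote (Mul a b) = denote a * denote b

main :: IO ()
main = print (denote (Add (Lit 2) (Mul (Lit 3) (Lit 4))))   -- 14
```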

In computer science, declarative programming is a programming paradigm—a style of building the structure and elements of computer programs—that expresses the logic of a computation without describing its control flow.

In programming language theory and proof theory, the Curry–Howard correspondence is the direct relationship between computer programs and mathematical proofs. It is also known as the Curry–Howard isomorphism or equivalence, or the proofs-as-programs and propositions- or formulae-as-types interpretation.
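
A small Haskell sketch of the correspondence (the name proofK is illustrative): the type a -> (b -> a) corresponds to the propositional tautology A → (B → A), and a total program of that type can be read as a proof of it.

```haskell
-- A program of type a -> (b -> a), read as a proof of A -> (B -> A).
proofK :: a -> (b -> a)
proofK x _ = x

main :: IO ()
main = print (proofK True 'x')   -- True
```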

In functional programming, a monad is a structure that combines program fragments (functions) and wraps their return values in a type with additional computation. In addition to defining a wrapping monadic type, monads define two operators: one to wrap a value in the monad type, and another to compose together functions that output values of the monad type. General-purpose languages use monads to reduce boilerplate code needed for common operations. Functional languages use monads to turn complicated sequences of functions into succinct pipelines that abstract away control flow and side effects.
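
A minimal Haskell sketch using the Maybe monad (safeDiv and pipeline are names invented for the example): return wraps a value in the monadic type, and >>= composes functions that produce monadic values, so failure is threaded through the pipeline automatically.

```haskell
-- Division that signals failure instead of crashing.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- 'return' wraps; '>>=' composes monad-producing functions.
pipeline :: Int -> Maybe Int
pipeline n = return n >>= safeDiv 100 >>= \x -> return (x + 1)

main :: IO ()
main = do
  print (pipeline 4)   -- Just 26
  print (pipeline 0)   -- Nothing: the failure short-circuits the pipeline
```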

In functional programming, continuation-passing style (CPS) is a style of programming in which control is passed explicitly in the form of a continuation. This is contrasted with direct style, which is the usual style of programming. Gerald Jay Sussman and Guy L. Steele, Jr. coined the phrase in AI Memo 349 (1975), which sets out the first version of the Scheme programming language. John C. Reynolds gives a detailed account of the numerous discoveries of continuations.
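
A short Haskell sketch contrasting the two styles (the function names are illustrative): the direct-style function returns its result, while the CPS version passes it to an explicit continuation k.

```haskell
-- Direct style: the result is returned to the caller.
addDirect :: Int -> Int -> Int
addDirect x y = x + y

-- CPS: the result is handed to an explicit continuation k.
addCPS :: Int -> Int -> (Int -> r) -> r
addCPS x y k = k (x + y)

main :: IO ()
main = do
  print (addDirect 1 2)   -- 3
  addCPS 1 2 print        -- 3; the continuation here is 'print'
```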

A concatenative programming language is a point-free computer programming language in which all expressions denote functions, and the juxtaposition of expressions denotes function composition. Concatenative programming replaces function application, which is common in other programming styles, with function composition as the default way to build subroutines.
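
The flavour of this can be sketched in Haskell (true concatenative languages use juxtaposition rather than an explicit composition operator, and the function names here are invented): the same function written in applicative style with nested application, and in a point-free, composition-based style.

```haskell
-- Applicative style: nested function application.
applicativeStyle :: [Int] -> Int
applicativeStyle xs = sum (filter even (map (* 2) xs))

-- Point-free style: the function is a composition of functions.
pointFreeStyle :: [Int] -> Int
pointFreeStyle = sum . filter even . map (* 2)

main :: IO ()
main = print (pointFreeStyle [1, 2, 3] == applicativeStyle [1, 2, 3])   -- True
```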

In computer science, function-level programming refers to one of the two contrasting programming paradigms identified by John Backus in his work on programs as mathematical objects, the other being value-level programming.

Bunched logic is a variety of substructural logic proposed by Peter O'Hearn and David Pym. Bunched logic provides primitives for reasoning about resource composition, which aid in the compositional analysis of computer and other systems. It has category-theoretic and truth-functional semantics, which can be understood in terms of an abstract concept of resource, and a proof theory in which the contexts Γ in an entailment judgement Γ ⊢ A are tree-like structures (bunches) rather than lists or (multi)sets as in most proof calculi. Bunched logic has an associated type theory, and its first application was in providing a way to control the aliasing and other forms of interference in imperative programs. The logic has seen further applications in program verification, where it is the basis of the assertion language of separation logic, and in systems modelling, where it provides a way to decompose the resources used by components of a system.

Binary combinatory logic (BCL) is a computer programming language that provides a complete formulation of combinatory logic using only the symbols 0 and 1. Using the S and K combinators, complex boolean algebra functions can be expressed. BCL has applications in the theory of program-size complexity.
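
The S and K combinators that BCL encodes can be written directly in Haskell; for example, the identity combinator I is definable as S K K (the lowercase names are used only because Haskell identifiers must start with a lowercase letter).

```haskell
-- S x y z = x z (y z)
s :: (a -> b -> c) -> (a -> b) -> a -> c
s x y z = x z (y z)

-- K x y = x
k :: a -> b -> a
k x _ = x

-- I = S K K
i :: a -> a
i = s k k

main :: IO ()
main = print (i 42)   -- 42
```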

Higher-order programming is a style of computer programming that uses software components, like functions, modules or objects, as values. It is usually instantiated with, or borrowed from, models of computation such as lambda calculus which make heavy use of higher-order functions. A programming language can be considered higher-order if components, such as procedures or labels, can be used just like data. For example, these elements could be used in the same way as arguments or values.
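
A small Haskell sketch (the name twice is illustrative): a function is passed as an argument and a new function is returned as the result, exactly as ordinary data values would be.

```haskell
-- 'twice' takes a function and returns a new function that applies it twice.
twice :: (a -> a) -> (a -> a)
twice f = f . f

main :: IO ()
main = print (twice (+ 3) 10)   -- 16
```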

In a programming language, an evaluation strategy is a set of rules for evaluating expressions. The term is often used to refer to the more specific notion of a parameter-passing strategy that defines the kind of value that is passed to the function for each parameter and whether to evaluate the parameters of a function call, and if so in what order. The notion of reduction strategy is distinct, although some authors conflate the two terms and the definition of each term is not widely agreed upon.
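
A brief Haskell sketch of how the strategy matters (constFirst is an illustrative name): under Haskell's call-by-need strategy an unused argument is never evaluated, whereas a call-by-value language would evaluate it first and fail here.

```haskell
-- The second argument is never used, so under call-by-need it is never evaluated.
constFirst :: a -> b -> a
constFirst x _ = x

main :: IO ()
main = print (constFirst 1 (undefined :: Int))   -- prints 1 under call-by-need
```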

An and-inverter graph (AIG) is a directed, acyclic graph that represents a structural implementation of the logical functionality of a circuit or network. An AIG consists of two-input nodes representing logical conjunction, terminal nodes labeled with variable names, and edges optionally containing markers indicating logical negation. This representation of a logic function is rarely structurally efficient for large circuits, but is an efficient representation for manipulation of boolean functions. Typically, the abstract graph is represented as a data structure in software.
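
A minimal sketch of such a data structure in Haskell (the type and function names are invented for this example, and real AIG packages use hash-consed node tables rather than plain trees): two-input AND nodes, named terminal nodes, a negation marker on each edge, and an evaluator.

```haskell
import qualified Data.Map as Map

-- Two-input AND nodes and named terminals; edges carry a negation flag.
data AIG
  = Input String
  | And Edge Edge

type Edge = (Bool, AIG)   -- True marks an inverted (negated) edge

evalEdge :: Map.Map String Bool -> Edge -> Bool
evalEdge env (neg, node) = (if neg then not else id) (evalNode env node)

evalNode :: Map.Map String Bool -> AIG -> Bool
evalNode env (Input name) = Map.findWithDefault False name env
evalNode env (And e1 e2)  = evalEdge env e1 && evalEdge env e2

-- XOR(a, b) = NOT ( NOT(a AND NOT b) AND NOT(NOT a AND b) )
xorOutput :: Edge
xorOutput =
  ( True
  , And (True, And (False, Input "a") (True, Input "b"))
        (True, And (True, Input "a") (False, Input "b")) )

main :: IO ()
main = print (evalEdge (Map.fromList [("a", True), ("b", False)]) xorOutput)   -- True
```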

Applicative computing systems, or ACS, are systems of object calculi founded on combinatory logic and lambda calculus. The only essential notion under consideration in these systems is the representation of an object. In combinatory logic the only metaoperator is application, in the sense of applying one object to another. In lambda calculus two metaoperators are used: application, as in combinatory logic, and functional abstraction, which binds a single variable in one object.

The expression problem is a challenging problem in programming languages that concerns the extensibility and modularity of statically typed data abstractions. The goal is to define a data abstraction that is extensible both in its representations and its behaviors, where one can add new representations and new behaviors to the data abstraction, without recompiling existing code, and while retaining static type safety. The statement of the problem exposes deficiencies in programming paradigms and programming languages, and as of 2023 is still considered unsolved, although there are many proposed solutions.
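
A Haskell sketch of the tension the problem describes (not a solution; the Shape type is invented here): with a plain algebraic data type, adding a new behaviour is just writing a new function, but adding a new representation means editing and recompiling every existing function.

```haskell
-- Adding a constructor (say, Rectangle Double Double) would force changes
-- to both 'area' and 'perimeter' and recompilation of this module.
data Shape
  = Circle Double
  | Square Double

area :: Shape -> Double
area (Circle r) = pi * r * r
area (Square s) = s * s

perimeter :: Shape -> Double
perimeter (Circle r) = 2 * pi * r
perimeter (Square s) = 4 * s

main :: IO ()
main = print (area (Circle 1.0))   -- ~3.14159
```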

Go! is an agent-based programming language in the tradition of logic-based programming languages like Prolog. It was introduced in a 2003 paper by Francis McCabe and Keith Clark.

Conor McBride is a Reader in the Department of Computer and Information Sciences at the University of Strathclyde. In 1999, they completed a Doctor of Philosophy (Ph.D.) in Dependently Typed Functional Programs and their Proofs at the University of Edinburgh for their work in type theory. They formerly worked at Durham University and briefly at Royal Holloway, University of London before joining the academic staff at the University of Strathclyde.

Inductive programming (IP) is a special area of automatic programming, covering research from artificial intelligence and programming, which addresses learning of typically declarative and often recursive programs from incomplete specifications, such as input/output examples or constraints.

In computer science, purely functional programming usually designates a programming paradigm—a style of building the structure and elements of computer programs—that treats all computation as the evaluation of mathematical functions.

References

  1. McBride, Conor; Paterson, Ross (2008). "Applicative programming with effects". Journal of Functional Programming. 18 (1): 1–13. CiteSeerX 10.1.1.114.1555. doi:10.1017/S0956796807006326. ISSN 1469-7653.
  2. Dershowitz, Nachum; Plaisted, David A. (1985). "Logic Programming cum Applicative Programming". Symposium on Logic Programming. Boston, MA. pp. 54–66. CiteSeerX 10.1.1.404.3826.
  3. Purdy, Jon (12 February 2012). "Why Concatenative Programming Matters". Retrieved 28 April 2020.
  4. Backus, J. (1978). "Can programming be liberated from the von Neumann style? A functional style and its algebra of programs". Communications of the ACM. 21 (8): 613–641. doi:10.1145/359576.359579.