
| | |
|---|---|
| Paradigm | Functional |
| Designed by | Conor McBride, James McKinna |
| Developer | Unmaintained |
| First appeared | 2004 |
| Stable release | 1 / October 11, 2006 |
| Typing discipline | strong, static, dependent |
| OS | Cross-platform: Linux, Windows, macOS |
| License | MIT^{[1]} |
| Website | e-pig |
| Influenced by | ALF |
| Influenced | Agda, Idris |

**Epigram** is a functional programming language with dependent types, and also the name of the integrated development environment (IDE) usually packaged with the language. Epigram's type system is strong enough to express program specifications. The goal is to support a smooth transition from ordinary programming to integrated programs and proofs whose correctness can be checked and certified by the compiler. Epigram exploits the Curry–Howard correspondence, also termed the *propositions as types* principle, and is based on intuitionistic type theory.

The Epigram prototype was implemented by Conor McBride based on joint work with James McKinna. Its development is continued by the Epigram group in Nottingham, Durham, St Andrews, and Royal Holloway, University of London in the United Kingdom (UK). The current experimental implementation of the Epigram system is freely available together with a user manual, a tutorial and some background material. The system has been used under Linux, Windows, and macOS.

It is currently unmaintained, and version 2, which was intended to implement Observational Type Theory, was never officially released but exists on GitHub. The design of Epigram and Epigram 2 has inspired the development of Agda,^{[ citation needed ]} Idris,^{[ citation needed ]} and Coq.^{[ citation needed ]}

Epigram uses a two-dimensional, natural deduction style syntax, with versions in LaTeX and ASCII. Here are some examples from *The Epigram Tutorial*:

The following declaration defines the natural numbers:

```
     (         !       (           !   ( n : Nat   !
data !---------! where !----------! ; !-----------!
     ! Nat : * )       !zero : Nat)   !suc n : Nat)
```

The declaration says that `Nat` is a type with kind `*` (i.e., it is a simple type) and has two constructors: `zero` and `suc`. The constructor `suc` takes a single `Nat` argument and returns a `Nat`. This is equivalent to the Haskell declaration `data Nat = Zero | Suc Nat`.
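For comparison, the same data type can be sketched in Python. The `Zero`/`Suc` class encoding below is an illustrative assumption, not part of Epigram or its tooling:

```python
# A Python sketch of the Nat type above, mirroring the Haskell
# declaration `data Nat = Zero | Suc Nat`.
from dataclasses import dataclass


class Nat:
    """Base class standing in for the Nat type."""


@dataclass
class Zero(Nat):
    """The zero constructor: takes no arguments."""


@dataclass
class Suc(Nat):
    """The successor constructor: takes a single Nat."""
    pred: Nat


def to_int(n: Nat) -> int:
    """Count the Suc constructors wrapped around Zero."""
    return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)


print(to_int(Suc(Suc(Zero()))))  # prints 2
```

Unlike in Epigram, nothing in Python checks that these are the only ways to build a `Nat`; the encoding merely illustrates the shape of the declaration.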

In the LaTeX version, the same declaration is typeset with horizontal inference lines.

The horizontal-line notation can be read as "assuming (what is on the top) is true, we can infer that (what is on the bottom) is true." For example, "assuming `n` is of type `Nat`, then `suc n` is of type `Nat`." If nothing is on the top, then the bottom statement is always true: "`zero` is of type `Nat` (in all cases)."

The induction principle for natural numbers, in ASCII:

```
NatInd : all P : Nat -> * => P zero ->
         (all n : Nat => P n -> P (suc n)) ->
         all n : Nat => P n
NatInd P mz ms zero    => mz
NatInd P mz ms (suc n) => ms n (NatInd P mz ms n)
```
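Transcribed into Python (untyped, so the motive `P` disappears; the `Zero`/`Suc` encoding is an illustrative assumption as before), the induction principle becomes a recursive eliminator:

```python
# A sketch of NatInd in Python: mz is the zero case, ms the
# successor case; the result is built by structural recursion.
from dataclasses import dataclass


class Nat:
    pass


@dataclass
class Zero(Nat):
    pass


@dataclass
class Suc(Nat):
    pred: Nat


def nat_ind(mz, ms, n):
    """mz plays the role of P zero; ms takes n and P n and gives P (suc n)."""
    if isinstance(n, Zero):
        return mz
    return ms(n.pred, nat_ind(mz, ms, n.pred))


# Example: convert a Nat to an int purely by induction.
print(nat_ind(0, lambda _n, rec: 1 + rec, Suc(Suc(Zero()))))  # prints 2
```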

Addition of natural numbers, in ASCII:

```
plus x y <= rec x
  { plus x y <= case x
    { plus zero y => y
      plus (suc x) y => suc (plus x y)
    }
  }
```
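In Python the same definition can be sketched as ordinary structural recursion on the first argument (again using the illustrative `Zero`/`Suc` encoding, which is an assumption of this sketch):

```python
# plus, transcribed from the Epigram definition: the `rec x` /
# `case x` structure becomes recursion with an isinstance test.
from dataclasses import dataclass


class Nat:
    pass


@dataclass
class Zero(Nat):
    pass


@dataclass
class Suc(Nat):
    pred: Nat


def plus(x: Nat, y: Nat) -> Nat:
    if isinstance(x, Zero):          # plus zero y => y
        return y
    return Suc(plus(x.pred, y))      # plus (suc x) y => suc (plus x y)


def to_int(n: Nat) -> int:
    return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)


print(to_int(plus(Suc(Zero()), Suc(Suc(Zero())))))  # 1 + 2, prints 3
```

Epigram's `rec x` additionally certifies to the typechecker that the recursion is structurally decreasing; Python offers no such guarantee.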

Epigram is essentially a typed lambda calculus with generalized algebraic data types, apart from two extensions. First, types are first-class entities, of type `*`; types are arbitrary expressions of type `*`, and type equivalence is defined in terms of the types' normal forms. Second, it has a dependent function type: instead of `P -> Q`, it has `all x : P => Q`, where `x` is bound in `Q` to the value that the function's argument (of type `P`) eventually takes.
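In standard type-theoretic notation (a textbook presentation, not taken verbatim from the Epigram papers), the formation rule for the dependent function type reads:

```latex
\frac{\Gamma \vdash P : \star \qquad \Gamma,\, x : P \vdash Q : \star}
     {\Gamma \vdash \forall x : P.\ Q : \star}
```

When `x` does not occur in `Q`, this degenerates to the ordinary function type `P -> Q`.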

Full dependent types, as implemented in Epigram, are a powerful abstraction. (Unlike in Dependent ML, the value(s) depended upon may be of any valid type.) A sample of the new formal specification capabilities dependent types bring may be found in *The Epigram Tutorial*.

- ALF, a proof assistant among the predecessors of Epigram.

- McBride, Conor; McKinna, James (2004). "The view from the left". *Journal of Functional Programming*.
- McBride, Conor (2004). *The Epigram Prototype, a nod and two winks* (Report).
- McBride, Conor (2004). The Epigram Tutorial (Report).
- Altenkirch, Thorsten; McBride, Conor; McKinna, James (2005). Why Dependent Types Matter (Report).
- Chapman, James; Altenkirch, Thorsten; McBride, Conor (2006). Epigram Reloaded: A Standalone Typechecker for ETT (Report).
- Chapman, James; Dagand, Pierre-Évariste; McBride, Conor; Morris, Peter (2010). The gentle art of levitation (Report).


- ↑ "Epigram – Official website". Retrieved 28 November 2015.

This page is based on this Wikipedia article

Text is available under the CC BY-SA 4.0 license; additional terms may apply.

Images, videos and audio are available under their respective licenses.
