# Dependent type


In computer science and logic, a dependent type is a type whose definition depends on a value. Dependent types lie at the intersection of type theory and type systems. In intuitionistic type theory, dependent types are used to encode logic's quantifiers like "for all" and "there exists". In functional programming languages like Agda, ATS, Coq, F*, Epigram, and Idris, dependent types may help reduce bugs by enabling the programmer to assign types that further constrain the set of possible implementations.


Two common examples of dependent types are dependent functions and dependent pairs. The return type of a dependent function may depend on the value (not just type) of one of its arguments. For instance, a function that takes a positive integer $n$ may return an array of length $n$, where the array length is part of the type of the array. (Note that this is different from polymorphism and generic programming, both of which include the type as an argument.) A dependent pair may have a second value whose type depends on the first value. Sticking with the array example, a dependent pair may be used to pair an array with its length in a type-safe way.
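The array-with-length example can be sketched in a dependently typed language; here is a minimal Lean 4 version (the names `Vec`, `replicate`, and `fromList` are illustrative, not a standard library API):

```lean
-- A length-indexed vector: the length n is part of the type.
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- A dependent function: the return type Vec α n mentions the argument n.
def replicate (x : α) : (n : Nat) → Vec α n
  | 0     => Vec.nil
  | n + 1 => Vec.cons x (replicate x n)

-- A dependent pair (Σ type): a vector packaged with its length.
def fromList : List α → (n : Nat) × Vec α n
  | []      => ⟨0, Vec.nil⟩
  | x :: xs =>
    let ⟨n, v⟩ := fromList xs
    ⟨n + 1, Vec.cons x v⟩
```

The index `n` in `Vec α n` is a value, not a type, which is exactly what distinguishes this from ordinary generics.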

Dependent types add complexity to a type system. Deciding the equality of dependent types in a program may require computations. If arbitrary values are allowed in dependent types, then deciding type equality may involve deciding whether two arbitrary programs produce the same result; hence type checking may become undecidable.

## History

In 1934, Haskell Curry noticed that the types used in typed lambda calculus, and in its combinatory logic counterpart, followed the same pattern as axioms in propositional logic. Going further, for every proof in the logic, there was a matching function (term) in the programming language. One of Curry's examples was the correspondence between simply typed lambda calculus and intuitionistic logic. 

Predicate logic is an extension of propositional logic, adding quantifiers. Howard and de Bruijn extended lambda calculus to match this more powerful logic by creating types for dependent functions, which correspond to "for all", and dependent pairs, which correspond to "there exists". 

(Because of this and other work by Howard, propositions-as-types is known as the Curry–Howard correspondence.)

## Formal definition

### Π type

Loosely speaking, dependent types are similar to the type of an indexed family of sets. More formally, given a type $A:{\mathcal {U}}$ in a universe of types ${\mathcal {U}}$ , one may have a family of types $B:A\to {\mathcal {U}}$ , which assigns to each term $a:A$ a type $B(a):{\mathcal {U}}$ . We say that the type B(a) varies with a.

A function whose return type varies with its argument (i.e. there is no fixed codomain) is a dependent function, and the type of this function is called a dependent product type, pi-type, or dependent function type. From a family of types $B:A\to {\mathcal {U}}$ we may construct the type of dependent functions ${\textstyle \prod _{x:A}B(x)}$ , whose terms are functions which take a term $a:A$ and return a term in $B(a)$ . The dependent function type is typically written as ${\textstyle \prod _{x:A}B(x)}$ or ${\textstyle \prod (x:A),B(x)}$ . If $B:A\to {\mathcal {U}}$ is a constant function, the corresponding dependent product type is equivalent to an ordinary function type. That is, ${\textstyle \prod _{x:A}B}$ is judgmentally equal to $A\to B$ when B does not depend on x.
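This judgmental equality can be observed directly in Lean 4, where a Π type over a constant family elaborates to the ordinary arrow type (a minimal sketch; the choice of `Nat` and `Bool` is arbitrary):

```lean
-- A Π type with a constant family is definitionally an ordinary arrow type,
-- so the reflexivity proof rfl type-checks.
example : ((_ : Nat) → Bool) = (Nat → Bool) := rfl
```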

The name 'pi-type' comes from the idea that these may be viewed as a Cartesian product over an indexed family of types. Pi-types can also be understood as models of universal quantifiers.

For example, if we write $\operatorname {Vec} (\mathbb {R} ,n)$ for n-tuples of real numbers, then ${\textstyle \prod _{n:\mathbb {N} }\operatorname {Vec} (\mathbb {R} ,n)}$ would be the type of a function which, given a natural number n, returns a tuple of real numbers of size n. The usual function space arises as a special case when the range type does not actually depend on the input. E.g. ${\textstyle \prod _{n:\mathbb {N} }{\mathbb {R} }}$ is the type of functions from natural numbers to the real numbers, which is written as $\mathbb {N} \to \mathbb {R}$ in typed lambda calculus.
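A small Lean 4 sketch of the same idea using the built-in family `Fin n` of natural numbers below `n` (the name `largest` is illustrative):

```lean
-- For each n, Fin (n + 1) is the type {0, 1, …, n}, so the return type
-- of largest varies with its argument.
def largest (n : Nat) : Fin (n + 1) :=
  ⟨n, Nat.lt_succ_self n⟩

#check largest   -- largest : (n : Nat) → Fin (n + 1), a Π type
```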

For a more concrete example, take A to be the type of unsigned integers from 0 to 255 (the ones you can fit into 8 bits or 1 byte), and let $B(a)=X_{a}$ for 256 arbitrary types $X_{a}$. Then ${\textstyle \prod _{x:A}B(x)}$ devolves into the product $X_{0}\times X_{1}\times X_{2}\times \cdots \times X_{254}\times X_{255}$, precisely because the finite set of integers from 0 to 255 stops at those bounds, resulting in a finite codomain of the dependent function.

### Σ type

The dual of the dependent product type is the dependent pair type, dependent sum type, sigma-type, or (confusingly) dependent product type.  Sigma-types can also be understood as existential quantifiers. Continuing the above example, if, in the universe of types ${\mathcal {U}}$ , there is a type $A:{\mathcal {U}}$ and a family of types $B:A\to {\mathcal {U}}$ , then there is a dependent pair type ${\textstyle \sum _{x:A}B(x)}$ . (The alternate notations are similar to those of Π types.)

The dependent pair type captures the idea of an ordered pair where the type of the second term depends on the value of the first. If ${\textstyle (a,b):\sum _{x:A}B(x),}$ then $a:A$ and $b:B(a)$ . If B is a constant function, then the dependent pair type becomes (is judgmentally equal to) the product type, that is, an ordinary Cartesian product $A\times B$ . 
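As a sketch in Lean 4, a dependent pair can model a tagged value whose payload type is selected by the tag (the names `Payload`, `asNumber`, and `asText` are illustrative):

```lean
-- The type of the second component depends on the value of the first.
def Payload : Bool → Type
  | true  => Nat
  | false => String

-- Two dependent pairs of the same Σ type, with differently typed payloads.
def asNumber : (b : Bool) × Payload b := ⟨true, 42⟩
def asText   : (b : Bool) × Payload b := ⟨false, "forty-two"⟩
```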

For a more concrete example, again take A to be the type of unsigned integers from 0 to 255, and again let $B(a)=X_{a}$ for 256 arbitrary types $X_{a}$. Then ${\textstyle \sum _{x:A}B(x)}$ devolves into the sum $X_{0}+X_{1}+X_{2}+\cdots +X_{254}+X_{255}$, for the same reason that the codomain of the dependent function above was finite.

#### Example as existential quantification

Let $A:{\mathcal {U}}$ be some type, and let $B:A\to {\mathcal {U}}$ . By the Curry–Howard correspondence, B can be interpreted as a logical predicate on terms of A. For a given $a:A$ , whether the type B(a) is inhabited indicates whether a satisfies this predicate. The correspondence can be extended to existential quantification and dependent pairs: the proposition $\exists {a}{\in }A\,B(a)$ is true if and only if the type ${\textstyle \sum _{a:A}B(a)}$ is inhabited.

For example, $m:\mathbb {N}$ is less than or equal to $n:\mathbb {N}$ if and only if there exists another natural number $k:\mathbb {N}$ such that m + k = n. In logic, this statement is codified by existential quantification:

$m\leq n\iff \exists {k}{\in }\mathbb {N} \,m+k=n.$

This proposition corresponds to the dependent pair type:

$\sum _{k:\mathbb {N} }m+k=n.$

That is, a proof of the statement that m is less than or equal to n is a pair that contains both a natural number k, which is the difference n − m, and a proof of the equality m + k = n.
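A Lean 4 sketch of this encoding, using `Subtype` as the Σ-like former whose second component is a proposition (the name `Le` is illustrative, to avoid clashing with Lean's built-in `≤`):

```lean
-- m ≤ n encoded as a pair: a witness k together with a proof that m + k = n.
def Le (m n : Nat) : Type := { k : Nat // m + k = n }

-- 3 ≤ 5 with witness k = 2; rfl checks 3 + 2 = 5 by computation.
example : Le 3 5 := ⟨2, rfl⟩
```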

## Systems of the lambda cube

Henk Barendregt developed the lambda cube as a means of classifying type systems along three axes. The eight corners of the resulting cube-shaped diagram each correspond to a type system, with simply typed lambda calculus in the least expressive corner, and calculus of constructions in the most expressive. The three axes of the cube correspond to three different augmentations of the simply typed lambda calculus: the addition of dependent types, the addition of polymorphism, and the addition of higher-kinded type constructors (functions from types to types, for example). The lambda cube is generalized further by pure type systems.

### First order dependent type theory

The system $\lambda \Pi$ of pure first order dependent types, corresponding to the logical framework LF, is obtained by generalising the function space type of the simply typed lambda calculus to the dependent product type.

### Second order dependent type theory

The system $\lambda \Pi 2$ of second order dependent types is obtained from $\lambda \Pi$ by allowing quantification over type constructors. In this theory the dependent product operator subsumes both the $\to$ operator of simply typed lambda calculus and the $\forall$ binder of System F.
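A Lean 4 sketch: the polymorphic identity function of System F is just a dependent function whose domain is a universe of types (the name `id'` avoids shadowing the built-in `id`):

```lean
-- In a dependent theory the ordinary arrow and the System F ∀ binder are
-- both special cases of the Π former: here id' quantifies over types.
def id' : (α : Type) → α → α := fun _ x => x

#eval id' Nat 3   -- the polymorphic identity, instantiated at Nat
```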

### Higher order dependently typed polymorphic lambda calculus

The higher order system $\lambda \Pi \omega$ extends $\lambda \Pi 2$ to all four forms of abstraction from the lambda cube: functions from terms to terms, types to types, terms to types and types to terms. The system corresponds to the calculus of constructions, whose derivative, the calculus of inductive constructions, is the underlying system of the Coq proof assistant.

## Simultaneous programming language and logic

The Curry–Howard correspondence implies that types can be constructed that express arbitrarily complex mathematical properties. If the user can supply a constructive proof that a type is inhabited (i.e., that a value of that type exists) then a compiler can check the proof and convert it into executable computer code that computes the value by carrying out the construction. The proof checking feature makes dependently typed languages closely related to proof assistants. The code-generation aspect provides a powerful approach to formal program verification and proof-carrying code, since the code is derived directly from a mechanically verified mathematical proof.
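As a small Lean 4 sketch of this idea, a type states a mathematical property and the accompanying term is a proof that the kernel checks; here the proof is by computation (`sumTo` and the concrete instance are illustrative):

```lean
-- The type states a mathematical claim; the definition is a checked proof.
def sumTo (n : Nat) : Nat := n * (n + 1) / 2

-- Gauss's formula at a concrete instance: the kernel evaluates
-- sumTo 100 to 5050 and accepts the reflexivity proof.
example : sumTo 100 = 5050 := rfl
```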

## Comparison of languages with dependent types

| Language | Actively developed | Paradigm [fn 1] | Tactics | Proof terms | Termination checking | Types can depend on [fn 2] | Universes | Proof irrelevance | Program extraction | Extraction erases irrelevant terms |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Ada 2012 | Yes | Imperative | Yes | Yes (optional) | ? | Any term [fn 3] | ? | ? | Ada | ? |
| Agda | Yes | Purely functional | Few/limited [fn 4] | Yes | Yes (optional) | Any term | Yes (optional) [fn 5] | Proof-irrelevant arguments; proof-irrelevant propositions | Haskell, JavaScript | Yes |
| ATS | Yes | Functional / imperative | No | Yes | Yes | Static terms | ? | Yes | Yes | Yes |
| Cayenne | No | Purely functional | No | Yes | No | Any term | No | No | ? | ? |
| Gallina (Coq) | Yes | Purely functional | Yes | Yes | Yes | Any term | Yes [fn 6] | No | Haskell, Scheme and OCaml | Yes |
| Dependent ML | No [fn 7] | ? | ? | Yes | ? | Natural numbers | ? | ? | ? | ? |
| F* | Yes | Functional and imperative | Yes | Yes | Yes (optional) | Any pure term | Yes | Yes | OCaml, F#, and C | Yes |
| Guru | No | Purely functional | hypjoin | Yes | Yes | Any term | No | Yes | Carraway | Yes |
| Idris | Yes | Purely functional | Yes | Yes | Yes (optional) | Any term | Yes | No | Yes | Yes, aggressively |
| Lean | Yes | Purely functional | Yes | Yes | Yes | Any term | Yes | Yes | Yes | Yes |
| Matita | Yes | Purely functional | Yes | Yes | Yes | Any term | Yes | Yes | OCaml | Yes |
| NuPRL | Yes | Purely functional | Yes | Yes | Yes | Any term | Yes | ? | Yes | ? |
| PVS | Yes | ? | Yes | ? | ? | ? | ? | ? | ? | ? |
| Sage | No [fn 8] | Purely functional | No | No | No | ? | No | ? | ? | ? |
| Twelf | Yes | Logic programming | ? | Yes | Yes (optional) | Any (LF) term | No | No | ? | ? |
| Xanadu | No | Imperative | ? | ? | ? | ? | ? | ? | ? | ? |

## Footnotes

1. This refers to the core language, not to any tactic (theorem proving procedure) or code generation sublanguage.
2. Subject to semantic constraints, such as universe constraints
3. Static_Predicate for restricted terms, Dynamic_Predicate for Assert-like checking of any term in type cast
4. Ring solver 
5. Optional universes, optional universe polymorphism, and optional explicitly specified universes
6. Universes, automatically inferred universe constraints (not the same as Agda's universe polymorphism) and optional explicit printing of universe constraints
7. Has been superseded by ATS
8. Last Sage paper and last code snapshot are both dated 2006

## Related Research Articles

Lambda calculus is a formal system in mathematical logic for expressing computation based on function abstraction and application using variable binding and substitution. It is a universal model of computation that can be used to simulate any Turing machine. It was introduced by the mathematician Alonzo Church in the 1930s as part of his research into the foundations of mathematics.



In mathematics, logic, and computer science, a type system is a formal system in which every term has a "type" which defines its meaning and the operations that may be performed on it. Type theory is the academic study of type systems.

In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning. This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.


In mathematical logic, a universal quantification is a type of quantifier, a logical constant which is interpreted as "given any" or "for all". It expresses that a predicate can be satisfied by every member of a domain of discourse. In other words, it is the predication of a property or relation to every member of the domain. It asserts that a predicate within the scope of a universal quantifier is true of every value of a predicate variable.

Combinatory logic is a notation to eliminate the need for quantified variables in mathematical logic. It was introduced by Moses Schönfinkel and Haskell Curry, and has more recently been used in computer science as a theoretical model of computation and also as a basis for the design of functional programming languages. It is based on combinators which were introduced by Schönfinkel in 1920 with the idea of providing an analogous way to build up functions—and to remove any mention of variables—particularly in predicate logic. A combinator is a higher-order function that uses only function application and earlier defined combinators to define a result from its arguments.

In programming language theory and proof theory, the Curry–Howard correspondence is the direct relationship between computer programs and mathematical proofs.

Intuitionistic type theory is a type theory and an alternative foundation of mathematics. Intuitionistic type theory was created by Per Martin-Löf, a Swedish mathematician and philosopher, who first published it in 1972. There are multiple versions of the type theory: Martin-Löf proposed both intensional and extensional variants of the theory and early impredicative versions, shown to be inconsistent by Girard's paradox, gave way to predicative versions. However, all versions keep the core design of constructive logic using dependent types.

In mathematical logic and computer science, the calculus of constructions (CoC) is a type theory created by Thierry Coquand. It can serve as both a typed programming language and as constructive foundation for mathematics. For this second reason, the CoC and its variants have been the basis for Coq and other proof assistants.

System F, also known as the (Girard–Reynolds) polymorphic lambda calculus or the second-order lambda calculus, is a typed lambda calculus that differs from the simply typed lambda calculus by the introduction of a mechanism of universal quantification over types. System F thus formalizes the notion of parametric polymorphism in programming languages, and forms a theoretical basis for languages such as Haskell and ML. System F was discovered independently by logician Jean-Yves Girard (1972) and computer scientist John C. Reynolds (1974).

In mathematical logic and type theory, the λ-cube is a framework introduced by Henk Barendregt to investigate the different dimensions in which the calculus of constructions is a generalization of the simply typed λ-calculus. Each dimension of the cube corresponds to a new kind of dependency between terms and types. Here, "dependency" refers to the capacity of a term or type to bind a term or type.

In logic, a logical framework provides a means to define a logic as a signature in a higher-order type theory in such a way that provability of a formula in the original logic reduces to a type inhabitation problem in the framework type theory. This approach has been used successfully for (interactive) automated theorem proving. The first logical framework was Automath; however, the name of the idea comes from the more widely known Edinburgh Logical Framework, LF. Several more recent proof tools like Isabelle are based on this idea. Unlike a direct embedding, the logical framework approach allows many logics to be embedded in the same type system.


The simply typed lambda calculus, a form of type theory, is a typed interpretation of the lambda calculus with only one type constructor that builds function types. It is the canonical and simplest example of a typed lambda calculus. The simply typed lambda calculus was originally introduced by Alonzo Church in 1940 as an attempt to avoid paradoxical uses of the untyped lambda calculus, and it exhibits many desirable and interesting properties.

Constructive set theory is an approach to mathematical constructivism following the program of axiomatic set theory. The same first-order language of classical set theory, with membership and equality, is usually used, so this is not to be confused with a constructive types approach. On the other hand, some constructive theories are indeed motivated by their interpretability in type theories.

In the branches of mathematical logic known as proof theory and type theory, a pure type system (PTS), previously known as a generalized type system (GTS), is a form of typed lambda calculus that allows an arbitrary number of sorts and dependencies between any of these. The framework can be seen as a generalisation of Barendregt's lambda cube, in the sense that all corners of the cube can be represented as instances of a PTS with just two sorts. In fact, Barendregt (1991) framed his cube in this setting. Pure type systems may obscure the distinction between types and terms and collapse the type hierarchy, as is the case with the calculus of constructions, but this is not generally the case, e.g. the simply typed lambda calculus allows only terms to depend on terms.

In logic, a quantifier is an operator that specifies how many individuals in the domain of discourse satisfy an open formula. For instance, a universal quantifier in a first-order formula expresses that everything in the domain satisfies the given property, while an existential quantifier expresses that something in the domain satisfies it. A formula where a quantifier takes widest scope is called a quantified formula. A quantified formula must contain a bound variable and a subformula specifying a property of the referent of that variable.
