In type theory, a type system has the property of subject reduction (also subject evaluation, type preservation or simply preservation) if evaluation of expressions does not cause their type to change. Formally, if Γ ⊢ e1 : τ and e1 → e2 then Γ ⊢ e2 : τ. Intuitively, this means one would not like to write an expression, in, say, Haskell, of type Int, and have it evaluate to a value v, only to find out that v is a string.
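A minimal Haskell sketch of this intuition (the definitions are ours, purely for illustration):

```haskell
-- Minimal sketch: the expression below has type Int, and the value it
-- evaluates to still has type Int (it can never turn out to be a String).
expr :: Int
expr = (\x -> x + 1) 2   -- beta-reduces to 2 + 1, and then to the value 3

main :: IO ()
main = print expr        -- prints 3, a value of type Int
```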
Together with progress, it is an important meta-theoretical property for establishing type soundness of a type system.
The opposite property, if Γ ⊢ e2 : τ and e1 → e2 then Γ ⊢ e1 : τ, is called subject expansion. It often does not hold as evaluation can erase ill-typed sub-terms of an expression, resulting in a well-typed one.
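A standard counterexample (not taken from this article): in the simply typed lambda calculus the reduction step below erases the subterm z z, which admits no simple type, so the right-hand side is typable while the left-hand side is not.

$$
(\lambda x.\,\lambda y.\,y)\,(z\,z) \;\to\; \lambda y.\,y
$$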
In analytic philosophy and computer science, referential transparency and referential opacity are properties of linguistic constructions, and by extension of languages. A linguistic construction is called referentially transparent when for any expression built from it, replacing a subexpression with another one that denotes the same value does not change the value of the expression. Otherwise, it is called referentially opaque. Each expression built from a referentially opaque linguistic construction states something about a subexpression, whereas each expression built from a referentially transparent linguistic construction states something not about a subexpression, meaning that the subexpressions are ‘transparent’ to the expression, acting merely as ‘references’ to something else. For example, the linguistic construction ‘_ was wise’ is referentially transparent but ‘_ said _’ is referentially opaque.
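A small Haskell sketch (the definitions are ours): pure expressions in Haskell are referentially transparent, so a subexpression can be replaced by any other expression denoting the same value without changing the result.

```haskell
-- Replacing the subexpression `two` by `(1 + 1)`, which denotes the same
-- value, cannot change the value of the enclosing expression.
two :: Int
two = 1 + 1

resultA, resultB :: Int
resultA = two * 3        -- 6
resultB = (1 + 1) * 3    -- also 6

main :: IO ()
main = print (resultA == resultB)   -- prints True
```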
In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning. This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.
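A standard small example in this style (not taken from the article): a natural-deduction derivation of A ∧ B → B ∧ A, where (∧E), (∧I) and (→I) are the conjunction-elimination, conjunction-introduction and implication-introduction rules, and the final step discharges the assumption labelled 1.

$$
\dfrac{\dfrac{\dfrac{[A \wedge B]^{1}}{B}\,(\wedge E)\qquad\dfrac{[A \wedge B]^{1}}{A}\,(\wedge E)}{B \wedge A}\,(\wedge I)}{A \wedge B \to B \wedge A}\,({\to}I)^{1}
$$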
In computer science, denotational semantics is an approach to formalizing the meanings of programming languages by constructing mathematical objects that describe the meanings of expressions from the languages. Other approaches providing formal semantics of programming languages include axiomatic semantics and operational semantics.
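A minimal sketch in Haskell (the names are ours): a denotational semantics for a tiny expression language, mapping each piece of syntax to a mathematical meaning, modelled here as a function from environments to integers.

```haskell
import qualified Data.Map as M

data Expr = Lit Integer | Var String | Add Expr Expr | Mul Expr Expr

type Env = M.Map String Integer

-- The semantic function, written [[.]] in the literature: Expr -> (Env -> Integer).
-- An unbound variable is given the meaning 0 here, purely as a design choice.
denote :: Expr -> (Env -> Integer)
denote (Lit n)   = \_   -> n
denote (Var x)   = \env -> M.findWithDefault 0 x env
denote (Add a b) = \env -> denote a env + denote b env
denote (Mul a b) = \env -> denote a env * denote b env

main :: IO ()
main = print (denote (Add (Lit 1) (Mul (Var "x") (Lit 3))) (M.fromList [("x", 2)]))
-- prints 7
```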
In particle physics, a lepton is an elementary particle of half-integer spin that does not undergo strong interactions. Two main classes of leptons exist: charged leptons and neutral leptons (neutrinos). Charged leptons can combine with other particles to form various composite particles such as atoms and positronium, while neutrinos rarely interact with anything and are consequently rarely observed. The best known of all leptons is the electron.
In computer programming, a type system is a logical system comprising a set of rules that assigns a property called a type to every term. Usually the terms are various language constructs of a computer program, such as variables, expressions, functions, or modules. A type system dictates the operations that can be performed on a term. For variables, the type system determines the allowed values of that term. Type systems formalize and enforce the otherwise implicit categories the programmer uses for algebraic data types, data structures, or other components.
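A small Haskell sketch (our own definitions) of a type dictating which operations a term supports:

```haskell
-- The type assigned to each term determines which operations the compiler accepts.
count :: Int
count = 3

label :: String
label = "three"

ok :: Int
ok = count + 1            -- permitted: (+) is defined for Int

-- Rejected at compile time: there is no (+) combining an Int and a String.
-- bad = count + label

main :: IO ()
main = print ok
```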
An elimination reaction is a type of organic reaction in which two substituents are removed from a molecule in either a one- or two-step mechanism. The one-step mechanism is known as the E2 reaction, and the two-step mechanism is known as the E1 reaction. The numbers refer not to the number of steps in the mechanism, but rather to the kinetics of the reaction: E2 is bimolecular (second-order) while E1 is unimolecular (first-order). In cases where the molecule is able to stabilize an anion but possesses a poor leaving group, a third type of reaction, E1cB, exists. Finally, the pyrolysis of xanthate and acetate esters proceeds through an "internal" elimination mechanism, the Ei mechanism.
In type theory, a typing rule is an inference rule that describes how a type system assigns a type to a syntactic construction. These rules may be applied by the type system to determine whether a program is well-typed and what type each expression has. A prototypical example of the use of typing rules is in defining type inference in the simply typed lambda calculus, which is the internal language of Cartesian closed categories.
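For instance, two standard typing rules of the simply typed lambda calculus, for abstraction and application, can be stated as:

$$
\frac{\Gamma,\, x : \sigma \vdash e : \tau}{\Gamma \vdash \lambda x{:}\sigma.\; e \;:\; \sigma \to \tau}
\qquad\qquad
\frac{\Gamma \vdash e_1 : \sigma \to \tau \qquad \Gamma \vdash e_2 : \sigma}{\Gamma \vdash e_1\; e_2 \;:\; \tau}
$$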
In mathematics, the covariant derivative is a way of specifying a derivative along tangent vectors of a manifold. Alternatively, the covariant derivative is a way of introducing and working with a connection on a manifold by means of a differential operator, to be contrasted with the approach given by a principal connection on the frame bundle – see affine connection. In the special case of a manifold isometrically embedded into a higher-dimensional Euclidean space, the covariant derivative can be viewed as the orthogonal projection of the Euclidean directional derivative onto the manifold's tangent space. In this case the Euclidean derivative is broken into two parts, the extrinsic normal component and the intrinsic covariant derivative component.
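In local coordinates, with Christoffel symbols encoding the connection, the covariant derivative of a vector field Y along a vector field X takes the standard form

$$
\nabla_X Y \;=\; X^i \left( \frac{\partial Y^k}{\partial x^i} + \Gamma^k_{\,ij}\, Y^j \right) \frac{\partial}{\partial x^k} .
$$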
Differential geometry of curves is the branch of geometry that deals with smooth curves in the plane and in Euclidean space by methods of differential and integral calculus.
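For example, two of the basic quantities attached to a regular parametrised curve γ in three-dimensional Euclidean space are its arc length and its curvature:

$$
L(\gamma) = \int_a^b \lVert \gamma'(t) \rVert \, dt,
\qquad
\kappa(t) = \frac{\lVert \gamma'(t) \times \gamma''(t) \rVert}{\lVert \gamma'(t) \rVert^{3}} .
$$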
The simply typed lambda calculus, a form of type theory, is a typed interpretation of the lambda calculus with only one type constructor that builds function types. It is the canonical and simplest example of a typed lambda calculus. The simply typed lambda calculus was originally introduced by Alonzo Church in 1940 as an attempt to avoid paradoxical use of the untyped lambda calculus.
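With o standing for a base type, the types and terms of the system are commonly given by the grammar

$$
\tau ::= o \mid \tau \to \tau
\qquad\qquad
e ::= x \mid \lambda x{:}\tau.\, e \mid e\, e ,
$$

so the function arrow is the only type constructor.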
Simple shear is a deformation in which parallel planes in a material remain parallel and maintain a constant distance, while translating relative to each other.
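In Cartesian coordinates, a simple shear of the planes y = const in the x-direction, with shear strain γ, can be written as

$$
\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}
=
\begin{pmatrix} 1 & \gamma & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} x \\ y \\ z \end{pmatrix} ,
$$

so each such plane stays parallel to the others at constant distance while translating by γ times its height.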
In physics, a particle is called ultrarelativistic when its speed is very close to the speed of light c.
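Equivalently, its Lorentz factor satisfies γ ≫ 1, and the energy–momentum relation is well approximated by

$$
E = \sqrt{(pc)^2 + (mc^2)^2} \;\approx\; pc \qquad \text{when } pc \gg mc^2 .
$$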
In programming language semantics, normalisation by evaluation (NBE) is a method of obtaining the normal form of terms in the λ-calculus by appealing to their denotational semantics. A term is first interpreted into a denotational model of the λ-term structure, and then a canonical (β-normal and η-long) representative is extracted by reifying the denotation. Such an essentially semantic, reduction-free, approach differs from the more traditional syntactic, reduction-based, description of normalisation as reductions in a term rewrite system where β-reductions are allowed deep inside λ-terms.
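A minimal Haskell sketch of NbE for the untyped λ-calculus (it produces β-normal forms only; the η-long part requires a typed variant, and all names below are illustrative): terms are interpreted into a semantic domain of Haskell functions plus stuck "neutral" terms, and normal forms are read back by reification.

```haskell
data Term = Var Int | Lam Term | App Term Term   -- de Bruijn indices
  deriving Show

-- Semantic domain: functions of the metalanguage, plus stuck (neutral) terms.
data Value = VLam (Value -> Value) | VNeutral Neutral
data Neutral = NVar Int | NApp Neutral Value

-- Interpret syntax into the semantic domain.
eval :: [Value] -> Term -> Value
eval env (Var i)   = env !! i
eval env (Lam b)   = VLam (\v -> eval (v : env) b)
eval env (App f a) = apply (eval env f) (eval env a)

apply :: Value -> Value -> Value
apply (VLam f)     v = f v
apply (VNeutral n) v = VNeutral (NApp n v)

-- Reify: read a β-normal term back out of the semantic domain.
-- The Int counts binders passed so far and supplies fresh variables (levels).
reify :: Int -> Value -> Term
reify k (VLam f)     = Lam (reify (k + 1) (f (VNeutral (NVar k))))
reify k (VNeutral n) = reifyNeutral k n

reifyNeutral :: Int -> Neutral -> Term
reifyNeutral k (NVar l)   = Var (k - l - 1)   -- de Bruijn level back to index
reifyNeutral k (NApp n v) = App (reifyNeutral k n) (reify k v)

-- Normalisation by evaluation for closed terms.
nbe :: Term -> Term
nbe = reify 0 . eval []

main :: IO ()
main = print (nbe (App (Lam (Var 0)) (Lam (Lam (Var 1)))))
-- prints Lam (Lam (Var 1)), the normal form of (λx. x) (λx. λy. x)
```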
Expression templates are a C++ template metaprogramming technique that builds structures representing a computation at compile time, where expressions are evaluated only as needed to produce efficient code for the entire computation. Expression templates thus allow programmers to bypass the normal order of evaluation of the C++ language and achieve optimizations such as loop fusion.
Nemerle is a general-purpose, high-level, statically typed programming language designed for platforms using the Common Language Infrastructure (.NET/Mono). It offers functional, object-oriented, aspect-oriented, reflective and imperative features. It has a simple C#-like syntax and a powerful metaprogramming system.
A Hindley–Milner (HM) type system is a classical type system for the lambda calculus with parametric polymorphism. It is also known as Damas–Milner or Damas–Hindley–Milner. It was first described by J. Roger Hindley and later rediscovered by Robin Milner. Luis Damas contributed a close formal analysis and proof of the method in his PhD thesis.
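A small Haskell sketch (Haskell's type inference is an extension of Hindley–Milner; the definitions below are ours):

```haskell
-- No type annotation is written for compose; GHC infers the most general
-- (principal) type, (b -> c) -> (a -> b) -> a -> c.
compose f g x = f (g x)

-- Let-polymorphism: the let-bound identity function is generalised, so it can
-- be used at two different types within a single expression.
pairOfUses :: (Int, Bool)
pairOfUses = let identity x = x
             in (identity (1 :: Int), identity True)

main :: IO ()
main = print pairOfUses   -- prints (1,True)
```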
XPath 3 is the latest version of the XML Path Language, a query language for selecting nodes in XML documents. It supersedes XPath 1.0 and XPath 2.0.
In mathematical logic, the intersection type discipline is a branch of type theory encompassing type systems that use the intersection type constructor ∩ to assign multiple types to a single term. In particular, if a term M can be assigned both the type φ1 and the type φ2, then M can be assigned the intersection type φ1 ∩ φ2. Therefore, the intersection type constructor can be used to express finite heterogeneous ad hoc polymorphism. For example, the λ-term λx.(x x) can be assigned the type ((α → β) ∩ α) → β in most intersection type systems, assuming for the term variable x both the function type α → β and the corresponding argument type α.
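The characteristic rule is intersection introduction, which types the same term M twice (a standard formulation; notation varies across systems):

$$
\frac{\Gamma \vdash M : \varphi_1 \qquad \Gamma \vdash M : \varphi_2}{\Gamma \vdash M : \varphi_1 \cap \varphi_2}\;(\cap I)
$$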
Flix is a functional, imperative, and logic programming language developed at Aarhus University, with funding from the Independent Research Fund Denmark, and by a community of open source contributors. The Flix language supports algebraic data types, pattern matching, parametric polymorphism, currying, higher-order functions, extensible records, channel and process-based concurrency, and tail call elimination. Two notable features of Flix are its type and effect system and its support for first-class Datalog constraints.
In computer science, refocusing is a program transformation used to implement a reduction semantics (i.e., a small-step operational semantics with an explicit representation of the reduction context) more efficiently. It is a step towards implementing a deterministic semantics as a deterministic abstract machine.