Type theory with records

Type theory with records (TTR) is a formal semantics representation framework that uses records to express type-theoretic types. It has been used in natural language processing, principally in computational semantics and dialogue systems. [1] [2]

Syntax

A record type is a set of fields, where a field is a pair consisting of a label and a type; within a record type, field labels are unique. A witness of a record type is a record: a similar set of fields in which each label is paired with an object rather than a type. For a record to witness a record type, the object in each field must be of the type declared in the corresponding field of the record type, as in the example and the code sketch below. [3]

Basic type:

$$\begin{bmatrix} \text{x} & : & \mathit{Ind} \end{bmatrix}$$

Object:

$$\begin{bmatrix} \text{x} & = & a \end{bmatrix}$$

Ptype:

$$\begin{bmatrix} \text{x} & : & \mathit{Ind} \\ \text{y} & : & \mathit{Ind} \\ \text{c}_{\text{boy}} & : & \mathit{boy}(\text{x}) \\ \text{c}_{\text{girl}} & : & \mathit{girl}(\text{y}) \\ \text{c}_{\text{hug}} & : & \mathit{hug}(\text{x},\text{y}) \end{bmatrix}$$

Object:

$$\begin{bmatrix} \text{x} & = & a \\ \text{y} & = & b \\ \text{c}_{\text{boy}} & = & p_{1} \\ \text{c}_{\text{girl}} & = & p_{2} \\ \text{c}_{\text{hug}} & = & p_{3} \end{bmatrix}$$

where $a$ and $b$ are individuals (type $\mathit{Ind}$), $p_{1}$ is a proof that $a$ is a boy, etc. The second record type can be read as the situation type for "a boy hugs a girl", and the second record is a witness of it.
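
To make the witness relation concrete, the following is a minimal sketch in Python, not taken from the cited sources: record types are dictionaries from labels to types, records are dictionaries from labels to objects, and dependent fields such as boy(x) are modelled as functions of the record being checked. The names Ind, PType, of_type and is_witness, and the toy registry of individuals and proofs, are illustrative assumptions rather than part of any standard TTR implementation.

```python
# Minimal sketch of TTR-style record types and records (illustrative only).
# Record types map labels to types; records map the same labels to objects.
# Dependent fields (ptypes such as boy(x)) are modelled as functions that
# build the ptype from the record's own values.

from dataclasses import dataclass

Ind = "Ind"  # a basic type: individuals


@dataclass(frozen=True)
class PType:
    """A type constructed from a predicate and its arguments, e.g. boy(a)."""
    predicate: str
    args: tuple


# A toy model: which individuals exist, and which proof objects witness
# which ptypes. In a real TTR system this would come from the semantics.
INDIVIDUALS = {"a", "b"}
PROOFS = {
    "p1": PType("boy", ("a",)),       # p1 : boy(a)
    "p2": PType("girl", ("b",)),      # p2 : girl(b)
    "p3": PType("hug", ("a", "b")),   # p3 : hug(a, b)
}


def of_type(obj, typ):
    """Is obj a witness of typ in the toy model?"""
    if typ == Ind:
        return obj in INDIVIDUALS
    if isinstance(typ, PType):
        return PROOFS.get(obj) == typ
    return False


def is_witness(record, record_type):
    """A record witnesses a record type iff, for every field of the type,
    the record supplies an object of the declared (possibly dependent) type."""
    for label, typ in record_type.items():
        if label not in record:
            return False
        if callable(typ):          # dependent field: instantiate it first
            typ = typ(record)
        if not of_type(record[label], typ):
            return False
    return True


# The "a boy hugs a girl" record type and a record witnessing it.
boy_hugs_girl = {
    "x": Ind,
    "y": Ind,
    "c_boy": lambda r: PType("boy", (r["x"],)),
    "c_girl": lambda r: PType("girl", (r["y"],)),
    "c_hug": lambda r: PType("hug", (r["x"], r["y"])),
}
record = {"x": "a", "y": "b", "c_boy": "p1", "c_girl": "p2", "c_hug": "p3"}

print(is_witness(record, boy_hugs_girl))  # True
```

Running the sketch prints True, since the record supplies an object of the declared (possibly dependent) type for every label; changing any field (for example, setting c_hug to p1) makes the check fail.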

References

  1. Cooper, Robin (2005). "Records and Record Types in Semantic Theory". Journal of Logic and Computation. 15 (2): 99–112. doi:10.1093/logcom/exi004.
  2. Cooper, Robin (2010). "Type Theory and Semantics in Flux". Handbook of the Philosophy of Science, Volume 14: Philosophy of Linguistics. Elsevier.
  3. Cooper, Robin. Type Theory and Language: From Perception to Linguistic Communication. Draft of book chapters available from https://sites.google.com/site/typetheorywithrecords/drafts