Term algebra

In universal algebra and mathematical logic, a term algebra is a freely generated algebraic structure over a given signature. [1] [2] For example, in a signature consisting of a single binary operation, the term algebra over a set X of variables is exactly the free magma generated by X. Other synonyms for the notion include absolutely free algebra and anarchic algebra. [3]

From a category theory perspective, a term algebra is the initial object for the category of all X-generated algebras of the same signature, and this object, unique up to isomorphism, is called an initial algebra; it generates by homomorphic projection all algebras in the category. [4] [5]

A similar notion is that of a Herbrand universe in logic, usually used under this name in logic programming, [6] which is (absolutely freely) defined starting from the set of constants and function symbols in a set of clauses. That is, the Herbrand universe consists of all ground terms: terms that have no variables in them.

An atomic formula or atom is commonly defined as a predicate applied to a tuple of terms; a ground atom is then a predicate in which only ground terms appear. The Herbrand base is the set of all ground atoms that can be formed from predicate symbols in the original set of clauses and terms in its Herbrand universe. [7] [8] These two concepts are named after Jacques Herbrand.

Term algebras also play a role in the semantics of abstract data types, where an abstract data type declaration provides the signature of a multi-sorted algebraic structure and the term algebra is a concrete model of the abstract declaration.

Universal algebra

A type τ is a set of function symbols, with each having an associated arity (i.e. number of inputs). For any non-negative integer n, let τₙ denote the set of function symbols in τ of arity n. A constant is a function symbol of arity 0.

Let τ be a type, and let X be a non-empty set of symbols, representing the variable symbols. (For simplicity, assume X and τ are disjoint.) Then the set T(X) of terms of type τ over X is the set of all well-formed strings that can be constructed using the variable symbols of X and the constants and operations of τ. Formally, T(X) is the smallest set such that:
- X ∪ τ₀ ⊆ T(X): each variable symbol from X is a term in T(X), and so is each constant symbol from τ₀;
- f t₁ … tₙ ∈ T(X) for every n ≥ 1, every f ∈ τₙ and all t₁, …, tₙ ∈ T(X): applying an n-ary function symbol to n terms yields again a term.
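
This inductive definition can be mirrored directly in a language with algebraic data types. The following Haskell sketch is not part of the article and uses names of our own choosing (Term, Var, App); it represents terms over an arbitrary set of function symbols f and variable symbols v, without enforcing arities at the type level:

```haskell
-- A hypothetical representation of T(X): a term is either a variable
-- symbol or a function symbol applied to a list of subterms.
-- Arities are not checked by the types; a well-formed term is assumed
-- to apply each n-ary symbol to exactly n subterms.
data Term f v
  = Var v              -- a variable symbol from X
  | App f [Term f v]   -- an n-ary function symbol applied to n subterms
  deriving (Eq, Show)

-- A constant c (arity 0) is represented as App c [].
```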

The term algebra 𝒯(X) of type τ over X is, in summary, the algebra of type τ that maps each expression to its string representation. Formally, 𝒯(X) is defined as follows: [9]
- The domain of 𝒯(X) is T(X).
- For each constant symbol f ∈ τ₀, the interpretation f^𝒯(X) is defined as the string f.
- For each n ≥ 1, each f ∈ τₙ and all t₁, …, tₙ ∈ T(X), f^𝒯(X)(t₁, …, tₙ) is defined as the string f t₁ … tₙ.

A term algebra is called absolutely free because for any algebra 𝒜 of type τ, and for any function g : X → A (where A is the domain of 𝒜), g extends to a unique homomorphism g* : 𝒯(X) → 𝒜, which simply evaluates each term t ∈ T(X) to its corresponding value g*(t) ∈ A. Formally, for each t ∈ T(X):
- if t ∈ X, then g*(t) = g(t);
- if t = f for some constant f ∈ τ₀, then g*(t) = f^𝒜;
- if t = f t₁ … tₙ with f ∈ τₙ and n ≥ 1, then g*(t) = f^𝒜(g*(t₁), …, g*(tₙ)).
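
Under the same assumptions as the sketch above, the unique extension g* is a structural recursion (a fold) over terms; foldTerm, assign and interp below are our own names, not notation from the article:

```haskell
-- A sketch of the unique extension g* of an assignment g : X -> A.
-- 'assign' plays the role of g; 'interp' gives, for each function
-- symbol, the corresponding operation of the target algebra A.
foldTerm :: (v -> a) -> (f -> [a] -> a) -> Term f v -> a
foldTerm assign _      (Var x)    = assign x      -- g*(x) = g(x)
foldTerm assign interp (App f ts) =
  interp f (map (foldTerm assign interp) ts)      -- g*(f t1 ... tn) = f^A(g*(t1), ..., g*(tn))
```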

Example

As an example, a type τ inspired by integer arithmetic can be defined by τ₀ = {0, 1}, τ₁ = ∅, τ₂ = {+, ∗}, and τᵢ = ∅ for each i > 2.

The best-known algebra of type τ has the natural numbers as its domain and interprets 0, 1, + and ∗ in the usual way; we refer to it as 𝒩.
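
Continuing the Haskell sketch from above (our own modelling, not taken from the article), the example type and the algebra 𝒩 can be rendered as follows, with the natural numbers approximated by Haskell's Integer:

```haskell
-- The example signature: two constant symbols and two binary symbols.
data Sym = Zero | One | Plus | Times
  deriving (Eq, Show)

-- The algebra N: each symbol is interpreted over the integers in the
-- usual way.
interpN :: Sym -> [Integer] -> Integer
interpN Zero  []     = 0
interpN One   []     = 1
interpN Plus  [a, b] = a + b
interpN Times [a, b] = a * b
interpN _     _      = error "arity mismatch"
```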

For the example variable set X = {x, y}, we are going to investigate the term algebra 𝒯(X) of type τ over X.

First, the set T(X) of terms of type τ over X is considered; its members may be hard to recognize at first sight because of their uncommon syntactic form. We have, for example:
- x ∈ T(X), since x is a variable symbol in X;
- 1 ∈ T(X), since 1 is a constant symbol in τ₀; hence
- +x1 ∈ T(X), since + is a binary function symbol in τ₂; hence
- ∗+x1y ∈ T(X), since y ∈ X and ∗ ∈ τ₂.

More generally, each string in T(X) corresponds to a mathematical expression built from the admitted symbols and written in Polish prefix notation; for example, the term ∗+x1y corresponds to the expression (x + 1) ∗ y in usual infix notation. No parentheses are needed to avoid ambiguities in Polish notation; e.g. the infix expression x + (1 ∗ y) corresponds to the term +x∗1y.

To give some counter-examples, we have e.g.
- z ∉ T(X), since z is not an admitted symbol;
- 3 ∉ T(X), since 3 is not an admitted symbol either;
- x + y ∉ T(X), since terms are written in prefix, not infix, notation;
- +x ∉ T(X), since + requires two argument terms.

Now that the term set T(X) is established, we consider the term algebra 𝒯(X) of type τ over X. This algebra uses T(X) as its domain, on which addition and multiplication need to be defined. The addition function +^𝒯(X) takes two terms s and t and returns the term +st; similarly, the multiplication function ∗^𝒯(X) maps given terms s and t to the term ∗st. For example, ∗^𝒯(X)(+x1, y) evaluates to the term ∗+x1y. Informally, the operations +^𝒯(X) and ∗^𝒯(X) are both "sluggards" in that they just record what computation should be done, rather than doing it.
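
In the Haskell sketch, the corresponding "sluggard" operations of the term algebra merely wrap their arguments in a new syntax node (plusT and timesT are hypothetical names of ours):

```haskell
-- Addition and multiplication in the term algebra: they build a larger
-- term instead of computing a number.
plusT, timesT :: Term Sym String -> Term Sym String -> Term Sym String
plusT  s t = App Plus  [s, t]
timesT s t = App Times [s, t]
```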

As an example for unique extendability of a homomorphism, consider h : X → ℕ defined by h(x) = 5 and h(y) = 3. Informally, h defines an assignment of values to variable symbols, and once this is done, every term from T(X) can be evaluated in a unique way in 𝒩. For example,
h*(∗+x1y) = h*(+x1) ∗ h*(y) = (h*(x) + h*(1)) ∗ h*(y) = (5 + 1) ∗ 3 = 18.

In a similar way, one obtains h*(+x∗1y) = h*(x) + (h*(1) ∗ h*(y)) = 5 + (1 ∗ 3) = 8.
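
Assuming the definitions sketched above (and the illustrative values h(x) = 5, h(y) = 3 chosen here), the same evaluation can be reproduced by folding the example term with foldTerm:

```haskell
-- The assignment h, with h(x) = 5 and h(y) = 3.
h :: String -> Integer
h "x" = 5
h "y" = 3
h v   = error ("unassigned variable: " ++ v)

-- The term *+x1y, i.e. (x + 1) * y in infix notation.
exampleTerm :: Term Sym String
exampleTerm = timesT (plusT (Var "x") (App One [])) (Var "y")

-- foldTerm h interpN exampleTerm  ==  (5 + 1) * 3  ==  18
```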

Herbrand base

The signature σ of a language is a triple ⟨O, F, P⟩ consisting of the alphabet of constants O, the function symbols F, and the predicates P. The Herbrand base [10] of a signature σ consists of all ground atoms of σ, that is, all formulas of the form R(t₁, ..., tₙ), where t₁, ..., tₙ are terms containing no variables (i.e. elements of the Herbrand universe) and R is an n-ary relation symbol (i.e. predicate). In the case of logic with equality, it also contains all equations of the form t₁ = t₂, where t₁ and t₂ contain no variables.
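
As a rough illustration (our own construction, not taken from the article), the ground terms of the example arithmetic signature above, which make up its Herbrand universe, can be enumerated level by level; the Herbrand base would then be obtained by applying each predicate symbol to tuples of such ground terms:

```haskell
-- All ground terms of the example signature of height at most d.
-- The Herbrand universe is the union of these sets over all d.
groundTerms :: Int -> [Term Sym v]
groundTerms 0 = []
groundTerms d =
  [App c [] | c <- [Zero, One]] ++
  [App f [s, t] | f <- [Plus, Times], s <- smaller, t <- smaller]
  where smaller = groundTerms (d - 1)
```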

Decidability

Term algebras can be shown decidable using quantifier elimination. The complexity of the decision problem is nonelementary, because binary constructors are injective and thus act as pairing functions. [11]


References

  1. Wilfrid Hodges (1997). A Shorter Model Theory. Cambridge University Press. p. 14. ISBN 0-521-58713-1.
  2. Franz Baader; Tobias Nipkow (1998). Term Rewriting and All That. Cambridge University Press. p. 49. ISBN 0-521-77920-0.
  3. Klaus Denecke; Shelly L. Wismath (2009). Universal Algebra and Coalgebra. World Scientific. pp. 21–23. ISBN 978-981-283-745-5.
  4. T. H. Tse (2010). A Unifying Framework for Structured Analysis and Design Models: An Approach Using Initial Algebra Semantics and Category Theory. Cambridge University Press. pp. 46–47. doi:10.1017/CBO9780511569890. ISBN 978-0-511-56989-0.
  5. Jean-Yves Béziau (1999). "The mathematical structure of logical syntax". In Carnielli, Walter Alexandre; D'Ottaviano, Itala M. L. (eds.). Advances in Contemporary Logic and Computer Science: Proceedings of the Eleventh Brazilian Conference on Mathematical Logic, May 6–10, 1996, Salvador, Bahia, Brazil. American Mathematical Society. p. 9. ISBN 978-0-8218-1364-5. Retrieved 18 April 2011.
  6. Dirk van Dalen (2004). Logic and Structure. Springer. p. 108. ISBN 978-3-540-20879-2.
  7. M. Ben-Ari (2001). Mathematical Logic for Computer Science. Springer. pp. 148–150. ISBN 978-1-85233-319-5.
  8. Monroe Newborn (2001). Automated Theorem Proving: Theory and Practice. Springer. p. 43. ISBN 978-0-387-95075-4.
  9. Stanley Burris; H. P. Sankappanavar (1981). A Course in Universal Algebra. Springer. pp. 68–69, 71. ISBN 978-1-4613-8132-7.
  10. Rogelio Davila. Answer Set Programming Overview.
  11. Jeanne Ferrante; Charles W. Rackoff (1979). The Computational Complexity of Logical Theories. Springer. Chapter 8, Theorem 1.2.
