The word "algebra" is used for various branches and structures of mathematics. For an overview, see Algebra.
The bare word "algebra" may refer to:
In universal algebra, algebra has an axiomatic definition, roughly as an instance of any of a number of algebraic structures, such as groups, rings, etc.
The term is also traditionally used for the field of:
An "algebra", or to be verbose, an algebra over a field, is a vector space equipped with a bilinear vector product. Some notable algebras in this sense are:
See also coalgebra, the dual notion.
A different class of "algebras" consists of objects which generalize logical connectives, sets, and lattices.
"Algebra" can also describe more general structures:
In mathematics, an associative algebra A is an algebraic structure with compatible operations of addition, multiplication, and a scalar multiplication by elements in some field K. The addition and multiplication operations together give A the structure of a ring; the addition and scalar multiplication operations together give A the structure of a vector space over K. The term K-algebra is also used to mean an associative algebra over the field K. A standard first example of a K-algebra is a ring of square matrices over a field K, with the usual matrix multiplication.
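As a brief sketch of the compatibility conditions in the matrix example above, the multiplication of 2×2 matrices over K is bilinear (the symbols λ, μ, A, B, C below are illustrative, not from the source), written in LaTeX:

\[
(\lambda A + \mu B)\,C = \lambda (AC) + \mu (BC), \qquad
C\,(\lambda A + \mu B) = \lambda (CA) + \mu (CB),
\]
\[
\text{for all } \lambda, \mu \in K \text{ and } A, B, C \in M_2(K).
\]

Together with associativity of matrix multiplication, these identities make M_2(K) an associative K-algebra.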
Linear algebra is the branch of mathematics concerning linear equations such as a₁x₁ + ⋯ + aₙxₙ = b, linear maps such as (x₁, …, xₙ) ↦ a₁x₁ + ⋯ + aₙxₙ, and their representations in vector spaces and through matrices.
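For instance, a small worked system (the particular numbers are illustrative) can be written and solved in matrix form:

\[
\begin{pmatrix} 1 & 2 \\ 3 & -1 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} 5 \\ 1 \end{pmatrix},
\qquad \det = (1)(-1) - (2)(3) = -7 \neq 0,
\]

so the system has the unique solution (x, y) = (1, 2), since 1 + 2·2 = 5 and 3·1 − 2 = 1.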
In mathematics, physics, and engineering, a vector space is a set of objects called vectors, which may be added together and multiplied ("scaled") by numbers called scalars. Scalars are often real numbers, but some vector spaces have scalar multiplication by complex numbers or, generally, by a scalar from any mathematical field. The operations of vector addition and scalar multiplication must satisfy certain requirements, called vector axioms. To specify whether the scalars in a particular vector space are real numbers or complex numbers, the terms real vector space and complex vector space are often used.
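As an illustration of one of the vector axioms (with arbitrarily chosen numbers), distributivity of scalar multiplication over vector addition can be checked directly in the real vector space R² with componentwise operations:

\[
u = (1, 2), \quad v = (3, -1), \quad a = 2: \qquad
a(u + v) = 2\,(4, 1) = (8, 2) = (2, 4) + (6, -2) = au + av.
\]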
In abstract algebra, the direct sum is a construction which combines several modules into a new, larger module. The direct sum of modules is the smallest module which contains the given modules as submodules with no "unnecessary" constraints, making it an example of a coproduct. Contrast with the direct product, which is the dual notion.
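A minimal sketch, using vector spaces over a field K as the modules: the summands embed as submodules of the direct sum and intersect only in 0,

\[
K^m \oplus K^n \cong K^{m+n}, \qquad
(u, v) + (u', v') = (u + u',\, v + v'), \qquad \lambda\,(u, v) = (\lambda u, \lambda v),
\]

with K^m embedded as the pairs (u, 0) and K^n as the pairs (0, v). For finitely many modules the direct sum and the direct product coincide; they differ only when infinitely many factors are involved.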
In mathematics, an algebraic structure consists of a nonempty set A, a collection of operations on A of finite arity, and a finite set of identities, known as axioms, that these operations must satisfy.
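For example, a group fits this pattern: a set G with operations of arity 2, 1, and 0, subject to a finite set of identities (the presentation below is one common choice):

\[
(G,\; \cdot\,,\; {}^{-1},\; e): \qquad
(x \cdot y) \cdot z = x \cdot (y \cdot z), \qquad
x \cdot e = e \cdot x = x, \qquad
x \cdot x^{-1} = x^{-1} \cdot x = e.
\]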
In mathematics, an algebra over a field is a vector space equipped with a bilinear product. Thus, an algebra is an algebraic structure consisting of a set together with operations of multiplication and addition and scalar multiplication by elements of a field and satisfying the axioms implied by "vector space" and "bilinear".
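A standard small example, stated here as a sketch: the complex numbers form a two-dimensional algebra over the real numbers, with the bilinear product determined on the basis {1, i} by

\[
1 \cdot 1 = 1, \qquad 1 \cdot i = i \cdot 1 = i, \qquad i \cdot i = -1,
\]

and extended bilinearly, so that (a + bi)(c + di) = (ac − bd) + (ad + bc)i for real a, b, c, d.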
In mathematics, coalgebras or cogebras are structures that are dual to unital associative algebras. The axioms of unital associative algebras can be formulated in terms of commutative diagrams. Turning all arrows around, one obtains the axioms of coalgebras. Every coalgebra, by duality, gives rise to an algebra, but not in general the other way. In finite dimensions, this duality goes in both directions.
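As a minimal sketch of the dualized axioms, a one-dimensional vector space with basis {g} (a single "group-like" element) carries a coalgebra structure with comultiplication Δ and counit ε given by

\[
\Delta(g) = g \otimes g, \qquad \varepsilon(g) = 1,
\]
\[
(\Delta \otimes \mathrm{id}) \circ \Delta\,(g) = g \otimes g \otimes g = (\mathrm{id} \otimes \Delta) \circ \Delta\,(g),
\qquad
(\varepsilon \otimes \mathrm{id}) \circ \Delta\,(g) = g = (\mathrm{id} \otimes \varepsilon) \circ \Delta\,(g),
\]

which are the coassociativity and counit axioms obtained by reversing the arrows of associativity and unitality.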
In mathematics, a duality translates concepts, theorems or mathematical structures into other concepts, theorems or structures, in a one-to-one fashion, often by means of an involution operation: if the dual of A is B, then the dual of B is A. Such involutions sometimes have fixed points, so that the dual of A is A itself. For example, Desargues' theorem is self-dual in this sense under the standard duality in projective geometry.
In abstract algebra, a representation of an associative algebra is a module for that algebra. Here an associative algebra is a (not necessarily unital) ring. If the algebra is not unital, it may be made so in a standard way; there is no essential difference between modules for the resulting unital ring, in which the identity acts by the identity mapping, and representations of the algebra.
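The "standard way" of adjoining an identity can be sketched as follows; the construction below is the usual unitalization over a base field K, offered here as an assumption about which construction is meant (for a plain ring, the integers play the role of K):

\[
\tilde{A} = K \oplus A, \qquad (\lambda, a)\,(\mu, b) = (\lambda\mu,\; \lambda b + \mu a + ab),
\]

with unit (1, 0); an A-module then becomes an \(\tilde{A}\)-module on which (1, 0) acts by the identity mapping.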
In the mathematical fields of linear algebra and functional analysis, the orthogonal complement of a subspace W of a vector space V equipped with a bilinear form B is the set W⊥ of all vectors in V that are orthogonal to every vector in W. Informally, it is called the perp, short for perpendicular complement. It is a subspace of V.
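A small worked example in R³ with the standard dot product as the bilinear form:

\[
W = \operatorname{span}\{(1,0,0),\,(0,1,0)\} \subset \mathbb{R}^3, \qquad
W^{\perp} = \{(x, y, z) : x = 0,\; y = 0\} = \operatorname{span}\{(0,0,1)\},
\]

since (x, y, z)·(1, 0, 0) = x and (x, y, z)·(0, 1, 0) = y must both vanish; note that dim W + dim W⊥ = 3.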
In mathematics, especially in the fields of representation theory and module theory, a Frobenius algebra is a finite-dimensional unital associative algebra with a special kind of bilinear form which gives the algebra particularly nice duality theories. Frobenius algebras began to be studied in the 1930s by Richard Brauer and Cecil Nesbitt and were named after Ferdinand Frobenius. Tadashi Nakayama discovered the beginnings of a rich duality theory. Jean Dieudonné used this to characterize Frobenius algebras. Frobenius algebras were generalized to quasi-Frobenius rings, those Noetherian rings whose right regular representation is injective. In recent times, interest has been renewed in Frobenius algebras due to connections to topological quantum field theory.
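A standard example of such a bilinear form, included here as a sketch: the matrix algebra M_n(K) with the trace form,

\[
\sigma(a, b) = \operatorname{tr}(ab), \qquad
\sigma(ab, c) = \operatorname{tr}(abc) = \sigma(a, bc),
\]

which is nondegenerate and satisfies the associativity condition required of a Frobenius form.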
In mathematics, there are many types of algebraic structures which are studied. Abstract algebra is primarily the study of specific algebraic structures and their properties. Algebraic structures may be viewed in different ways; however, the common starting point of algebra texts is that an algebraic object incorporates one or more sets with one or more binary operations or unary operations satisfying a collection of axioms.
In mathematics, especially in the areas of abstract algebra known as universal algebra, group theory, ring theory, and module theory, a subdirect product is a subalgebra of a direct product that depends fully on all its factors without necessarily being the whole direct product. The notion was introduced by Birkhoff in 1944 and has proved to be a powerful generalization of the notion of direct product.
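For a minimal example, take the diagonal inside the direct product of two copies of the same algebra A:

\[
D = \{(x, x) : x \in A\} \subseteq A \times A.
\]

Both coordinate projections D → A are surjective, so D is a subdirect product depending fully on both factors, yet D ≠ A × A whenever A has more than one element.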
Representation theory is a branch of mathematics that studies abstract algebraic structures by representing their elements as linear transformations of vector spaces, and studies modules over these abstract algebraic structures. In essence, a representation makes an abstract algebraic object more concrete by describing its elements by matrices and their algebraic operations. The theory of matrices and linear operators is well understood, so representing more abstract objects in terms of familiar linear-algebra objects helps glean properties and sometimes simplifies calculations in more abstract theories.
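A small sketch of this idea: the cyclic group of order 4 can be represented by 2×2 rotation matrices, turning its abstract generator into a concrete linear map,

\[
k \;\longmapsto\; R^k, \qquad
R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad
R^2 = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}, \qquad R^4 = I,
\]

so that addition in Z/4Z corresponds to multiplication of the corresponding powers of R.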
In mathematics and physics, a vector is an element of a vector space. For many specific vector spaces, the vectors have received specific names, which are listed below. In general, a Euclidean vector is a geometric object with both length and direction. Such vectors can be added to each other or scaled using vector algebra. Correspondingly, an ensemble of vectors is called a vector space. These objects are the subject of linear algebra and can be characterized by their dimension.
In mathematics, semi-simplicity is a widespread concept in disciplines such as linear algebra, abstract algebra, representation theory, category theory, and algebraic geometry. A semi-simple object is one that can be decomposed into a sum of simple objects, and simple objects are those that do not contain non-trivial proper sub-objects. The precise definitions of these words depend on the context.
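As a brief illustration with Z-modules (abelian groups):

\[
\mathbb{Z}/6\mathbb{Z} \;\cong\; \mathbb{Z}/2\mathbb{Z} \oplus \mathbb{Z}/3\mathbb{Z}
\]

is semi-simple, being a direct sum of the simple modules Z/2Z and Z/3Z, whereas Z/4Z is not: its only non-trivial proper submodule 2Z/4Z has no complementary summand.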
This is a glossary of terminology in the mathematical field of functional analysis.