The theory of association schemes arose in statistics, in the theory of experimental design for the analysis of variance. [1] [2] [3] In mathematics, association schemes belong to both algebra and combinatorics. In algebraic combinatorics, association schemes provide a unified approach to many topics, for example combinatorial designs and the theory of error-correcting codes. [4] [5] In algebra, association schemes generalize groups, and the theory of association schemes generalizes the character theory of linear representations of groups. [6] [7] [8]
An n-class association scheme consists of a set X together with a partition S of X × X into n + 1 binary relations R0, R1, ..., Rn, which satisfy:

* $R_0 = \{(x, x) : x \in X\}$ is the identity relation.
* Defining $R^{*} := \{(y, x) : (x, y) \in R\}$, if $R$ is in $S$, then $R^{*}$ is in $S$.
* If $(x, y) \in R_k$, then the number of $z \in X$ such that $(x, z) \in R_i$ and $(z, y) \in R_j$ is a constant $p^k_{ij}$ depending on $i$, $j$, $k$ but not on the particular choice of $x$ and $y$.
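To make the axioms concrete, here is a small computational sketch; the construction and names are ours, not taken from the text above. A finite group G yields an (|G| − 1)-class scheme on X = G with one relation R_g = {(x, y) : y − x = g} for each group element g (this is one sense in which association schemes generalize groups). The script below builds these relations for the cyclic group Z4 and checks the identity, partition, converse, and intersection-number conditions directly.

```python
# A minimal sketch (not from the text): the "group case" of an association
# scheme.  For a finite group G, taking X = G and one relation
# R_g = {(x, y) : y - x = g} for each g gives an (|G| - 1)-class scheme.
# Here G is the cyclic group Z_4 under addition mod 4.

from itertools import product

X = list(range(4))                # elements of Z_4

# one relation per group element; R[g] plays the role of R_i in the definition
R = {g: {(x, y) for x, y in product(X, X) if (y - x) % 4 == g} for g in X}

# R_0 is the identity relation on X
assert R[0] == {(x, x) for x in X}

# the relations partition X x X
assert set().union(*R.values()) == set(product(X, X))
assert sum(len(r) for r in R.values()) == len(X) ** 2

# the converse (transpose) of each relation is again one of the relations
for g in X:
    assert {(y, x) for (x, y) in R[g]} in R.values()

# the intersection numbers p^k_{ij} do not depend on the chosen pair (x, y)
for i, j, k in product(X, X, X):
    counts = {sum((x, z) in R[i] and (z, y) in R[j] for z in X)
              for (x, y) in R[k]}
    assert len(counts) == 1       # constant over all pairs in R_k

print("the Z_4 group scheme satisfies the association-scheme axioms")
```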
An association scheme is commutative if $p^k_{ij} = p^k_{ji}$ for all $i$, $j$ and $k$. Most authors assume this property. Note, however, that while the notion of an association scheme generalizes the notion of a group, the notion of a commutative association scheme only generalizes the notion of a commutative group.
A symmetric association scheme is one in which each relation $R_i$ is a symmetric relation. That is: if $(x, y) \in R_i$, then $(y, x) \in R_i$ (equivalently, $R_i^{*} = R_i$ for every $i$).
Every symmetric association scheme is commutative.
Two points x and y are called i-th associates if $(x, y) \in R_i$. The definition states that if x and y are i-th associates then so are y and x. Every pair of points are i-th associates for exactly one i. Each point is its own zeroth associate, while distinct points are never zeroth associates. If x and y are k-th associates, then the number of points z which are both i-th associates of x and j-th associates of y is a constant $p^k_{ij}$.
A symmetric association scheme can be visualized as a complete graph with labeled edges. The graph has v = |X| vertices, one for each point of X, and the edge joining vertices x and y is labeled i if x and y are i-th associates. Each edge has a unique label, and the number of triangles with a fixed base labeled k having the other edges labeled i and j is a constant $p^k_{ij}$, depending on i, j and k but not on the choice of the base. In particular, each vertex is incident with exactly $p^0_{ii} = v_i$ edges labeled i; $v_i$ is the valency of the relation $R_i$. There are also loops labeled 0 at each vertex x, corresponding to $R_0$.
The relations are described by their adjacency matrices. $A_i$ is the adjacency matrix of $R_i$ for $i = 0, \ldots, n$: a v × v matrix with rows and columns labeled by the points of X, with

$$(A_i)_{x,y} = \begin{cases} 1 & \text{if } (x, y) \in R_i, \\ 0 & \text{otherwise.} \end{cases}$$
The definition of a symmetric association scheme is equivalent to saying that the $A_i$ are v × v (0,1)-matrices which satisfy

(I) $A_i$ is symmetric,
(II) $\sum_{i=0}^{n} A_i = J$ (the all-ones matrix),
(III) $A_0 = I$,
(IV) $A_i A_j = \sum_{k=0}^{n} p^k_{ij} A_k = A_j A_i$.
The (x, y)-th entry of the left side of (IV) is the number of paths of length two between x and y with labels i and j in the graph. Note that the rows and columns of $A_i$ each contain $v_i$ 1's:

$$A_i J = J A_i = v_i J.$$
The term association scheme is due to (Bose & Shimamoto 1952), but the concept is already inherent in (Bose & Nair 1939). [9] These authors were studying what statisticians have called partially balanced incomplete block designs (PBIBDs). The subject became an object of algebraic interest with the publication of (Bose & Mesner 1959) and the introduction of the Bose–Mesner algebra. The most important contribution to the theory was the thesis of P. Delsarte (Delsarte 1973), who recognized and fully used the connections with coding theory and design theory. [10] Generalizations have been studied by D. G. Higman (coherent configurations) and B. Weisfeiler (distance-regular graphs).
The adjacency matrices $A_0, A_1, \ldots, A_n$ of the graphs generate a commutative and associative algebra $\mathcal{A}$ (over the real or complex numbers) both for the matrix product and for the pointwise product. This associative, commutative algebra is called the Bose–Mesner algebra of the association scheme.
Since the matrices in $\mathcal{A}$ are symmetric and commute with each other, they can be diagonalized simultaneously. Therefore $\mathcal{A}$ is semi-simple and has a unique basis of primitive idempotents $E_0, E_1, \ldots, E_n$.
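As an illustration of simultaneous diagonalization, here is a NumPy sketch of our own (using the 5-cycle distance scheme, which is not an example discussed above): the primitive idempotents are the orthogonal projectors onto the common eigenspaces of the $A_i$, and a generic random combination of the $A_i$ already separates those eigenspaces.

```python
# A sketch (our own, with assumed helper names) of computing the primitive
# idempotents E_0, ..., E_n of the Bose-Mesner algebra for a small symmetric
# scheme: the distance scheme of the 5-cycle C_5 (classes: distance 0, 1, 2).

import numpy as np

v = 5
dist = np.array([[min((i - j) % v, (j - i) % v) for j in range(v)] for i in range(v)])
A = [np.where(dist == d, 1.0, 0.0) for d in range(3)]   # A_0, A_1, A_2

# A generic real linear combination of commuting symmetric matrices has the
# same eigenspace decomposition as the whole algebra, so diagonalising one
# such combination splits the space into the common eigenspaces.
rng = np.random.default_rng(0)
M = sum(rng.standard_normal() * Ai for Ai in A)
eigvals, eigvecs = np.linalg.eigh(M)

# group eigenvectors whose eigenvalues agree up to a tolerance
idempotents = []
start = 0
for i in range(1, len(eigvals) + 1):
    if i == len(eigvals) or abs(eigvals[i] - eigvals[start]) > 1e-8:
        U = eigvecs[:, start:i]           # orthonormal basis of one eigenspace
        idempotents.append(U @ U.T)       # orthogonal projector E_j
        start = i

# sanity checks: the E_j are idempotent and sum to the identity
for Ej in idempotents:
    assert np.allclose(Ej @ Ej, Ej)
assert np.allclose(sum(idempotents), np.eye(v))
print(f"found {len(idempotents)} primitive idempotents")   # expect 3 for C_5
```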
There is another algebra of (n + 1) × (n + 1) matrices which is isomorphic to $\mathcal{A}$, and is often easier to work with.
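One standard such realization, offered here as an illustration since the text above does not spell it out, uses the intersection matrices $B_i$ with $(B_i)_{k,j} = p^k_{ij}$; the map $A_i \mapsto B_i$ extends to an algebra isomorphism. A NumPy sketch (names ours), again on the 5-cycle distance scheme:

```python
# A sketch (construction and names are ours) of the intersection matrices
# B_0, ..., B_n with (B_i)[k, j] = p^k_{ij}.  Left multiplication by A_i on
# the basis {A_0, ..., A_n} has exactly this matrix, so A_i -> B_i gives an
# isomorphic algebra of (n+1) x (n+1) matrices.

import numpy as np

v = 5
dist = np.array([[min((i - j) % v, (j - i) % v) for j in range(v)] for i in range(v)])
A = [np.where(dist == d, 1.0, 0.0) for d in range(3)]
n1 = len(A)                                   # n + 1 classes, including R_0

# read off the intersection numbers p^k_{ij} from A_i A_j = sum_k p^k_{ij} A_k
p = np.zeros((n1, n1, n1))
for i in range(n1):
    for j in range(n1):
        prod = A[i] @ A[j]
        for k in range(n1):
            # all entries of prod supported on R_k share one value: p^k_{ij}
            p[i, j, k] = prod[A[k] == 1][0]

B = [p[i].T for i in range(n1)]               # (B_i)[k, j] = p^k_{ij}

# the B_i multiply exactly like the A_i
for i in range(n1):
    for j in range(n1):
        assert np.allclose(B[i] @ B[j], sum(p[i, j, k] * B[k] for k in range(n1)))
print("intersection matrices reproduce the Bose-Mesner multiplication")
```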
For example, the following table describes a 3-class association scheme on the six-point set X = {1, 2, 3, 4, 5, 6}: the entry in row x and column y is the label i of the relation $R_i$ that contains the pair (x, y).

|   | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 1 | 0 | 1 | 1 | 2 | 3 | 3 |
| 2 | 1 | 0 | 1 | 3 | 2 | 3 |
| 3 | 1 | 1 | 0 | 3 | 3 | 2 |
| 4 | 2 | 3 | 3 | 0 | 1 | 1 |
| 5 | 3 | 2 | 3 | 1 | 0 | 1 |
| 6 | 3 | 3 | 2 | 1 | 1 | 0 |
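The table can be checked mechanically. The following sketch (our own; labels as in the table) rebuilds the adjacency matrices from it and verifies conditions (I)–(IV), recovering the intersection numbers $p^k_{ij}$ along the way.

```python
# A sketch (our own) that encodes the table above and verifies conditions
# (I)-(IV) for a symmetric association scheme.

import numpy as np

label = np.array([
    [0, 1, 1, 2, 3, 3],
    [1, 0, 1, 3, 2, 3],
    [1, 1, 0, 3, 3, 2],
    [2, 3, 3, 0, 1, 1],
    [3, 2, 3, 1, 0, 1],
    [3, 3, 2, 1, 1, 0],
])
n = label.max()                                   # number of classes (here 3)
A = [(label == i).astype(float) for i in range(n + 1)]

assert all(np.array_equal(Ai, Ai.T) for Ai in A)  # (I)  each A_i is symmetric
assert np.array_equal(sum(A), np.ones_like(A[0])) # (II) the A_i sum to J
assert np.array_equal(A[0], np.eye(6))            # (III) A_0 = I

# (IV): A_i A_j = A_j A_i, and it is a constant p^k_{ij} on the support of A_k
for i in range(n + 1):
    for j in range(n + 1):
        prod = A[i] @ A[j]
        assert np.array_equal(prod, A[j] @ A[i])
        for k in range(n + 1):
            values = prod[A[k] == 1]
            assert len(set(values)) == 1          # the constant p^k_{ij}
print("the table defines a 3-class symmetric association scheme")
```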
The Hamming scheme and the Johnson scheme are of major significance in classical coding theory.
In coding theory, association scheme theory is mainly concerned with the distance of a code. The linear programming method produces upper bounds for the size of a code with given minimum distance, and lower bounds for the size of a design with a given strength. The most specific results are obtained in the case where the underlying association scheme satisfies certain polynomial properties; this leads one into the realm of orthogonal polynomials. In particular, some universal bounds are derived for codes and designs in polynomial-type association schemes.
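As an illustration of the linear programming method in the binary Hamming scheme (a sketch following the standard Delsarte formulation; the helper names and the use of scipy are our own choices), one maximizes the total distance distribution of a code subject to the Krawtchouk inequalities:

```python
# A sketch of Delsarte's linear programming bound for binary codes in the
# Hamming scheme H(n, 2).  For a code with minimum distance d, its distance
# distribution (B_0, ..., B_n) satisfies B_0 = 1, B_i = 0 for 0 < i < d,
# B_i >= 0, and sum_i B_i * K_k(i) >= 0 for every k, where K_k is a
# Krawtchouk polynomial.  Maximising sum_i B_i bounds the code size |C|.

from math import comb
import numpy as np
from scipy.optimize import linprog

def krawtchouk(k, x, n):
    """K_k(x) for the binary Hamming scheme H(n, 2)."""
    return sum((-1) ** j * comb(x, j) * comb(n - x, k - j) for j in range(k + 1))

def lp_bound(n, d):
    c = -np.ones(n + 1)                          # maximise sum_i B_i
    K = np.array([[krawtchouk(k, i, n) for i in range(n + 1)]
                  for k in range(n + 1)])
    A_ub, b_ub = -K, np.zeros(n + 1)             # encode sum_i K_k(i) B_i >= 0
    bounds = [(1, 1)] + [(0, 0)] * (d - 1) + [(0, None)] * (n - d + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return -res.fun

print(lp_bound(7, 3))   # expected: 16, the size of the [7,4] Hamming code
```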
In classical coding theory, dealing with codes in a Hamming scheme, the MacWilliams transform involves a family of orthogonal polynomials known as the Krawtchouk polynomials. These polynomials give the eigenvalues of the distance relation matrices of the Hamming scheme.
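A small numerical check of this statement (our own construction, for H(3, 2)): the character vectors $\chi_u(x) = (-1)^{u \cdot x}$ are common eigenvectors of the distance matrices, with eigenvalues given by the Krawtchouk values $K_i(\mathrm{wt}(u))$.

```python
# Verify numerically that in H(n, 2) the character vector chi_u(x) = (-1)^{u.x}
# is a common eigenvector of every distance matrix A_i, with eigenvalue
# K_i(wt(u)) given by a Krawtchouk polynomial.

from itertools import product
from math import comb
import numpy as np

def krawtchouk(k, x, n):
    return sum((-1) ** j * comb(x, j) * comb(n - x, k - j) for j in range(k + 1))

n = 3
points = list(product([0, 1], repeat=n))          # binary vectors of length n
dist = lambda a, b: sum(ai != bi for ai, bi in zip(a, b))

# A_i: adjacency matrix of the "Hamming distance i" relation
A = [np.array([[1.0 if dist(a, b) == i else 0.0 for b in points] for a in points])
     for i in range(n + 1)]

for u in points:
    chi = np.array([(-1) ** sum(ui * xi for ui, xi in zip(u, x)) for x in points],
                   dtype=float)
    for i in range(n + 1):
        assert np.allclose(A[i] @ chi, krawtchouk(i, sum(u), n) * chi)
print("A_i chi_u = K_i(wt(u)) chi_u holds for H(3, 2)")
```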
Algebraic combinatorics is an area of mathematics that employs methods of abstract algebra, notably group theory and representation theory, in various combinatorial contexts and, conversely, applies combinatorial techniques to problems in algebra.
The Hamming scheme, named after Richard Hamming, is also known as the hyper-cubic association scheme, and it is the most important example for coding theory. In this scheme the set X consists of the binary vectors of length n, and two vectors are i-th associates if they are Hamming distance i apart.
In mathematics, a Bose–Mesner algebra is a special set of matrices which arise from a combinatorial structure known as an association scheme, together with the usual set of rules for combining those matrices, such that they form an associative algebra, or, more precisely, a unitary commutative algebra. Among these rules are:

* the product of matrices in the set again lies in the set,
* the set contains an identity matrix, and
* products are commutative.
A coherent algebra is an algebra of complex square matrices that is closed under ordinary matrix multiplication, Schur (entrywise) product, and transposition, and contains both the identity matrix I and the all-ones matrix J.
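For instance (a sketch of our own, reusing the 6-point scheme tabulated above), the linear span of the adjacency matrices of an association scheme is a coherent algebra, and the closure properties can be tested numerically:

```python
# A sketch (our own) checking that the span of the adjacency matrices of an
# association scheme is a coherent algebra: closed under matrix product,
# Schur (entrywise) product, and transposition, and containing I and J.

import numpy as np

label = np.array([
    [0, 1, 1, 2, 3, 3],
    [1, 0, 1, 3, 2, 3],
    [1, 1, 0, 3, 3, 2],
    [2, 3, 3, 0, 1, 1],
    [3, 2, 3, 1, 0, 1],
    [3, 3, 2, 1, 1, 0],
])
A = [(label == i).astype(float) for i in range(label.max() + 1)]

def in_span(M, basis):
    """Is M a linear combination of the basis matrices? (least-squares test)"""
    V = np.stack([B.ravel() for B in basis], axis=1)
    coeffs, *_ = np.linalg.lstsq(V, M.ravel(), rcond=None)
    return np.allclose(V @ coeffs, M.ravel())

assert in_span(np.eye(6), A)                      # contains the identity I
assert in_span(np.ones((6, 6)), A)                # contains the all-ones J
for X in A:
    assert in_span(X.T, A)                        # closed under transposition
    for Y in A:
        assert in_span(X @ Y, A)                  # closed under matrix product
        assert in_span(X * Y, A)                  # closed under Schur product
print("the span of the A_i is a coherent algebra")
```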