Williamson conjecture

In combinatorial mathematics, specifically in combinatorial design theory and combinatorial matrix theory, the Williamson conjecture is the conjecture that Williamson matrices of order $n$ exist for all positive integers $n$. Four symmetric and circulant $n \times n$ matrices $A$, $B$, $C$, $D$ are known as Williamson matrices if their entries are $\pm 1$ and they satisfy the relationship

$$A^2 + B^2 + C^2 + D^2 = 4nI_n,$$

where $I_n$ is the identity matrix of order $n$. John Williamson showed that if $A$, $B$, $C$, $D$ are Williamson matrices then

$$H = \begin{bmatrix} A & B & C & D \\ -B & A & -D & C \\ -C & D & A & -B \\ -D & -C & B & A \end{bmatrix}$$

is an Hadamard matrix of order $4n$. [1] It was once considered likely that Williamson matrices exist for all orders $n$ and that the structure of Williamson matrices could provide a route to proving the Hadamard conjecture that Hadamard matrices exist for all orders divisible by 4. [2] However, in 1993 the Williamson conjecture was shown to be false via an exhaustive computer search by Dragomir Ž. Ðoković, who showed that Williamson matrices do not exist in order $n = 35$. [3] In 2008, the additional counterexamples $n = 47$, $53$, and $59$ were discovered. [4]
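
As an illustration, the following is a minimal sketch in Python with NumPy (the order-3 matrices used are one simple known solution, and the helper name circulant_symmetric is purely illustrative) that builds Williamson's block matrix from Williamson matrices of order 3 and verifies that the result is a Hadamard matrix of order 12.

```python
import numpy as np

def circulant_symmetric(first_row):
    """Circulant matrix with the given first row (symmetric when row[k] == row[n - k])."""
    n = len(first_row)
    return np.array([[first_row[(j - i) % n] for j in range(n)] for i in range(n)])

# One known set of Williamson matrices of order n = 3:
# A is the all-ones matrix, and B = C = D have first row (1, -1, -1).
n = 3
A = circulant_symmetric([1, 1, 1])
B = C = D = circulant_symmetric([1, -1, -1])

# Defining property of Williamson matrices: A^2 + B^2 + C^2 + D^2 = 4n I.
assert np.array_equal(A @ A + B @ B + C @ C + D @ D, 4 * n * np.eye(n))

# Williamson's block construction of a Hadamard matrix of order 4n.
H = np.block([
    [ A,  B,  C,  D],
    [-B,  A, -D,  C],
    [-C,  D,  A, -B],
    [-D, -C,  B,  A],
])

# H is Hadamard: its entries are +/-1 and its rows are mutually orthogonal.
assert np.array_equal(np.abs(H), np.ones((4 * n, 4 * n)))
assert np.array_equal(H @ H.T, 4 * n * np.eye(4 * n))
print("H is a Hadamard matrix of order", 4 * n)
```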

Related Research Articles

Latin square Square array with symbols that each occur once per row and column

In combinatorics and in experimental design, a Latin square is an n × n array filled with n different symbols, each occurring exactly once in each row and exactly once in each column. An example of a 3×3 Latin square is the cyclic square with rows (1, 2, 3), (2, 3, 1), (3, 1, 2).
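
A minimal sketch of the defining check, assuming a hypothetical helper is_latin_square (plain Python, no libraries):

```python
def is_latin_square(square):
    """Return True if every symbol occurs exactly once in each row and each column."""
    n = len(square)
    symbols = set(square[0])
    rows_ok = all(set(row) == symbols and len(row) == n for row in square)
    cols_ok = all(set(col) == symbols for col in zip(*square))
    return len(symbols) == n and rows_ok and cols_ok

print(is_latin_square([[1, 2, 3], [2, 3, 1], [3, 1, 2]]))  # True
print(is_latin_square([[1, 2, 3], [2, 1, 3], [3, 1, 2]]))  # False: columns repeat symbols
```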

Matrix multiplication Mathematical operation in linear algebra

In mathematics, particularly in linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices. For matrix multiplication, the number of columns in the first matrix must be equal to the number of rows in the second matrix. The resulting matrix, known as the matrix product, has the number of rows of the first and the number of columns of the second matrix. The product of matrices A and B is denoted as AB.
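
For instance, a short NumPy sketch of the dimension rule: a 2 × 3 matrix times a 3 × 4 matrix yields a 2 × 4 product.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)    # 2 x 3 matrix
B = np.arange(12).reshape(3, 4)   # 3 x 4 matrix
print((A @ B).shape)              # (2, 4): rows of A, columns of B

# Reversing the order is invalid here: B (3 x 4) times A (2 x 3)
# fails because 4 != 2.
```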

In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is $\left[\begin{smallmatrix}3 & 0 \\ 0 & 2\end{smallmatrix}\right]$, while an example of a 3×3 diagonal matrix is $\left[\begin{smallmatrix}6 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 4\end{smallmatrix}\right]$. An identity matrix of any size, or any multiple of it, is a diagonal matrix.

In mathematics, a Hadamard matrix, named after the French mathematician Jacques Hadamard, is a square matrix whose entries are either +1 or −1 and whose rows are mutually orthogonal. In geometric terms, this means that each pair of rows in a Hadamard matrix represents two perpendicular vectors, while in combinatorial terms, it means that each pair of rows has matching entries in exactly half of their columns and mismatched entries in the remaining columns. It is a consequence of this definition that the corresponding properties hold for columns as well as rows. The n-dimensional parallelotope spanned by the rows of an n×n Hadamard matrix has the maximum possible n-dimensional volume among parallelotopes spanned by vectors whose entries are bounded in absolute value by 1. Equivalently, a Hadamard matrix has maximal determinant among matrices with entries of absolute value less than or equal to 1 and so is an extremal solution of Hadamard's maximal determinant problem.
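
The following NumPy sketch verifies both formulations, row orthogonality and the half-matching property, for one illustrative 4 × 4 Hadamard matrix:

```python
import numpy as np

H = np.array([
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
    [1, -1, -1,  1],
])
n = H.shape[0]

# Rows are mutually orthogonal, i.e. H H^T = n I.
assert np.array_equal(H @ H.T, n * np.eye(n))

# Equivalently, any two distinct rows agree in exactly half of the columns.
for i in range(n):
    for j in range(i + 1, n):
        assert np.sum(H[i] == H[j]) == n // 2
print("4 x 4 Hadamard matrix verified")
```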

In mathematics, the Kronecker product, sometimes denoted by ⊗, is an operation on two matrices of arbitrary size resulting in a block matrix. It is a generalization of the outer product from vectors to matrices, and gives the matrix of the tensor product linear map with respect to a standard choice of basis. The Kronecker product is to be distinguished from the usual matrix multiplication, which is an entirely different operation. The Kronecker product is also sometimes called matrix direct product.
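
A brief NumPy example of the resulting block structure (np.kron computes the Kronecker product):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

# kron(A, B) replaces each entry a_ij of A with the block a_ij * B,
# so a 2x2 matrix times a 2x2 matrix gives a 4x4 block matrix.
print(np.kron(A, B))
# [[0 1 0 2]
#  [1 0 2 0]
#  [0 3 0 4]
#  [3 0 4 0]]
```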

Walsh matrix

In mathematics, a Walsh matrix is a specific square matrix of dimension $2^n \times 2^n$, where $n$ is some natural number. The entries of the matrix are either +1 or −1 and its rows as well as its columns are orthogonal, i.e., their dot products are zero. The Walsh matrix was proposed by Joseph L. Walsh in 1923. Each row of a Walsh matrix corresponds to a Walsh function.
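
One way to obtain such matrices is Sylvester's construction, which takes repeated Kronecker products of the 2 × 2 matrix [[1, 1], [1, −1]]; the sketch below uses the natural (Hadamard) row ordering rather than the sequency ordering also associated with Walsh matrices, and the function name is illustrative.

```python
import numpy as np

def sylvester_hadamard(n):
    """Return the 2**n x 2**n Walsh-Hadamard matrix via repeated Kronecker products."""
    H2 = np.array([[1, 1],
                   [1, -1]])
    H = np.array([[1]])
    for _ in range(n):
        H = np.kron(H2, H)
    return H

W = sylvester_hadamard(3)                       # 8 x 8 matrix with +/-1 entries
assert np.array_equal(W @ W.T, 8 * np.eye(8))   # rows are mutually orthogonal
```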

In mathematics, an alternating sign matrix is a square matrix of 0s, 1s, and −1s such that the sum of each row and column is 1 and the nonzero entries in each row and column alternate in sign. These matrices generalize permutation matrices and arise naturally when using Dodgson condensation to compute a determinant. They are also closely related to the six-vertex model with domain wall boundary conditions from statistical mechanics. They were first defined by William Mills, David Robbins, and Howard Rumsey in the former context.
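
A small validity checker for these conditions (the function name is illustrative, and the example is the smallest alternating sign matrix that is not a permutation matrix):

```python
import numpy as np

def is_alternating_sign_matrix(M):
    """Check: entries in {0, 1, -1}, rows/columns sum to 1, nonzeros alternate in sign."""
    M = np.asarray(M)
    if not np.isin(M, (-1, 0, 1)).all():
        return False
    if not (M.sum(axis=0) == 1).all() or not (M.sum(axis=1) == 1).all():
        return False
    for line in list(M) + list(M.T):
        nonzeros = line[line != 0]
        # Given the sum condition, the nonzero entries must read 1, -1, 1, ..., -1, 1.
        if not np.array_equal(nonzeros, [(-1) ** k for k in range(len(nonzeros))]):
            return False
    return True

print(is_alternating_sign_matrix([[0, 1, 0],
                                  [1, -1, 1],
                                  [0, 1, 0]]))   # True
```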

In mathematics, especially in probability and combinatorics, a doubly stochastic matrix (also called a bistochastic matrix) is a square matrix $X = (x_{ij})$ of nonnegative real numbers, each of whose rows and columns sums to 1, i.e., $\sum_i x_{ij} = \sum_j x_{ij} = 1$.

A complex Hadamard matrix is any complex $N \times N$ matrix $H$ satisfying two conditions: unimodularity (the modulus of each entry is unity, $|H_{jk}| = 1$) and orthogonality ($HH^{\dagger} = N I_N$, where $\dagger$ denotes the conjugate transpose and $I_N$ is the identity matrix of order $N$).

In mathematics, a complex Hadamard matrix H of size N with all its columns (rows) mutually orthogonal belongs to the Butson-type $H(q, N)$ if all its elements are powers of a $q$-th root of unity, i.e. $(H_{jk})^q = 1$ for $j, k = 1, 2, \dots, N$.
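
For example, the N × N Fourier matrix, whose entries are powers of a primitive N-th root of unity, is a Butson-type complex Hadamard matrix H(N, N); a quick NumPy check (a sketch, with N = 5 chosen arbitrarily):

```python
import numpy as np

N = 5
omega = np.exp(2j * np.pi / N)
F = np.array([[omega ** (j * k) for k in range(N)] for j in range(N)])

# Unimodularity: every entry has absolute value 1.
assert np.allclose(np.abs(F), 1)
# Orthogonality of rows: F F^dagger = N I.
assert np.allclose(F @ F.conj().T, N * np.eye(N))
# Butson type H(N, N): every entry is an N-th root of unity.
assert np.allclose(F ** N, 1)
```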

In mathematics, a square matrix is said to be diagonally dominant if, for every row of the matrix, the magnitude of the diagonal entry in a row is larger than or equal to the sum of the magnitudes of all the other (non-diagonal) entries in that row. More precisely, the matrix $A$ is diagonally dominant if $|a_{ii}| \geq \sum_{j \neq i} |a_{ij}|$ for all $i$, where $a_{ij}$ denotes the entry in the $i$-th row and $j$-th column.
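
A one-function NumPy sketch of the row-wise test (the function name is illustrative):

```python
import numpy as np

def is_diagonally_dominant(A):
    """True if |a_ii| >= sum of |a_ij| over j != i, for every row i."""
    A = np.abs(np.asarray(A, dtype=float))
    off_diagonal_sums = A.sum(axis=1) - np.diag(A)
    return bool(np.all(np.diag(A) >= off_diagonal_sums))

print(is_diagonally_dominant([[3, -2, 1],
                              [1,  3, 2],
                              [-1, 2, 4]]))   # True
print(is_diagonally_dominant([[1, 2],
                              [3, 1]]))       # False
```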

In mathematics, the Paley construction is a method for constructing Hadamard matrices using finite fields. The construction was described in 1933 by the English mathematician Raymond Paley.

In mathematics, especially in linear algebra and matrix theory, the vectorization of a matrix is a linear transformation which converts the matrix into a column vector. Specifically, the vectorization of an m × n matrix A, denoted vec(A), is the mn × 1 column vector obtained by stacking the columns of the matrix A on top of one another: $\operatorname{vec}(A) = [a_{1,1}, \ldots, a_{m,1}, a_{1,2}, \ldots, a_{m,2}, \ldots, a_{1,n}, \ldots, a_{m,n}]^{\mathsf{T}}$.
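
In NumPy, stacking columns corresponds to flattening in Fortran (column-major) order; a minimal sketch:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])          # 3 x 2 matrix

vec_A = A.flatten(order="F").reshape(-1, 1)   # stack the columns: 6 x 1 column vector
print(vec_A.ravel())                          # [1 3 5 2 4 6]
```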

Hadamard code

The Hadamard code is an error-correcting code named after Jacques Hadamard that is used for error detection and correction when transmitting messages over very noisy or unreliable channels. In 1971, the code was used to transmit photos of Mars back to Earth from the NASA space probe Mariner 9. Because of its unique mathematical properties, the Hadamard code is not only used by engineers, but also intensely studied in coding theory, mathematics, and theoretical computer science. The Hadamard code is also known under the names Walsh code, Walsh family, and Walsh–Hadamard code in recognition of the American mathematician Joseph Leonard Walsh.
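
As a sketch of the encoding idea in one common formulation (the augmented and punctured variants differ in detail, and the function name is illustrative): a k-bit message is encoded as the list of its inner products modulo 2 with all 2^k binary vectors of length k, so any two distinct codewords differ in exactly half of their 2^k positions.

```python
import itertools

def hadamard_encode(message_bits):
    """Walsh-Hadamard encoding: the codeword bit indexed by y is <message, y> mod 2."""
    k = len(message_bits)
    return [
        sum(m * y for m, y in zip(message_bits, y_bits)) % 2
        for y_bits in itertools.product([0, 1], repeat=k)
    ]

c1 = hadamard_encode([1, 0, 1])
c2 = hadamard_encode([0, 1, 1])
distance = sum(a != b for a, b in zip(c1, c2))
print(len(c1), distance)   # 8 4 -- length-8 codewords differing in half their positions
```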

In mathematics, a unistochastic matrix is a doubly stochastic matrix whose entries are the squares of the absolute values of the entries of some unitary matrix.
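
For example, the squared moduli of the entries of the normalized Fourier matrix, which is unitary, form a unistochastic (hence doubly stochastic) matrix; a quick NumPy check (a sketch, with N = 3 chosen arbitrarily):

```python
import numpy as np

N = 3
omega = np.exp(2j * np.pi / N)
U = np.array([[omega ** (j * k) for k in range(N)] for j in range(N)]) / np.sqrt(N)

assert np.allclose(U @ U.conj().T, np.eye(N))   # U is unitary

B = np.abs(U) ** 2                              # entrywise squared absolute values
assert np.allclose(B.sum(axis=0), 1)            # columns sum to 1
assert np.allclose(B.sum(axis=1), 1)            # rows sum to 1
print(B)                                        # here the flat matrix with all entries 1/N
```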

Matrix (mathematics) Two-dimensional array of numbers with specific operations

In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, $\left[\begin{smallmatrix}1 & 9 & -13 \\ 20 & 5 & -6\end{smallmatrix}\right]$ is a matrix with two rows and three columns.

In theoretical computer science, the computational complexity of matrix multiplication dictates how quickly the operation of matrix multiplication can be performed. Matrix multiplication algorithms are a central subroutine in theoretical and numerical algorithms for numerical linear algebra and optimization, so finding the right amount of time it should take is of major practical relevance.

Hadamard's maximal determinant problem, named after Jacques Hadamard, asks for the largest determinant of a matrix with elements equal to 1 or −1. The analogous question for matrices with elements equal to 0 or 1 is equivalent since, as can be shown, the maximal determinant of a {1, −1} matrix of size n is $2^{n-1}$ times the maximal determinant of a {0, 1} matrix of size n − 1. The problem was posed by Hadamard in the 1893 paper in which he presented his famous determinant bound and remains unsolved for matrices of general size. Hadamard's bound implies that {1, −1} matrices of size n have determinant at most $n^{n/2}$. Hadamard observed that a construction of Sylvester produces examples of matrices that attain the bound when n is a power of 2, and produced examples of his own of sizes 12 and 20. He also showed that the bound is only attainable when n is equal to 1, 2, or a multiple of 4. Additional examples were later constructed by Scarpis and Paley and subsequently by many other authors. Such matrices are now known as Hadamard matrices. They have received intensive study.
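
A brute-force sketch checking the stated relationship at the smallest nontrivial size, n = 3, where the maximal {1, −1} determinant is 4 and the maximal {0, 1} determinant of size 2 is 1 (the helper max_abs_det is illustrative):

```python
import itertools
import numpy as np

def max_abs_det(size, entries):
    """Exhaustively search all square matrices of the given size with the given entries."""
    best = 0.0
    for values in itertools.product(entries, repeat=size * size):
        M = np.array(values, dtype=float).reshape(size, size)
        best = max(best, abs(np.linalg.det(M)))
    return round(best)

n = 3
pm_max = max_abs_det(n, (1, -1))       # maximal |det| over {1, -1} matrices of size n
zo_max = max_abs_det(n - 1, (0, 1))    # maximal det over {0, 1} matrices of size n - 1
print(pm_max, zo_max)                  # 4 1
assert pm_max == 2 ** (n - 1) * zo_max
```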

Hadamard product (matrices) Matrix operation

In mathematics, the Hadamard product is a binary operation that takes two matrices of the same dimensions and produces another matrix of the same dimension as the operands, where each element i, j is the product of elements i, j of the original two matrices. It is to be distinguished from the more common matrix product. It is attributed to, and named after, either French mathematician Jacques Hadamard or German mathematician Issai Schur.
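
In NumPy the two operations are written differently, which makes the distinction concrete:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A * B)   # Hadamard (elementwise) product: [[ 5 12], [21 32]]
print(A @ B)   # ordinary matrix product:        [[19 22], [43 50]]
```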

In the mathematical theory of matroids, a matroid representation is a family of vectors whose linear independence relation is the same as that of a given matroid. Matroid representations are analogous to group representations; both types of representation provide abstract algebraic structures with concrete descriptions in terms of linear algebra.

References

  1. Williamson, John (1944). "Hadamard's determinant theorem and the sum of four squares". Duke Mathematical Journal. 11 (1): 65–81. doi:10.1215/S0012-7094-44-01108-7. MR 0009590.
  2. Golomb, Solomon W.; Baumert, Leonard D. (1963). "The Search for Hadamard Matrices". American Mathematical Monthly. 70 (1): 12–17. doi:10.2307/2312777. JSTOR 2312777. MR 0146195.
  3. Ðoković, Dragomir Ž. (1993). "Williamson matrices of order 4n for n = 33, 35, 39". Discrete Mathematics. 115 (1): 267–271. doi:10.1016/0012-365X(93)90495-F. MR 1217635.
  4. Holzmann, W. H.; Kharaghani, H.; Tayfeh-Rezaie, B. (2008). "Williamson matrices up to order 59". Designs, Codes and Cryptography. 46 (3): 343–352. doi:10.1007/s10623-007-9163-5. MR 2372843.