Matrix of ones

In mathematics, a matrix of ones or all-ones matrix is a matrix in which every entry is equal to one. [1] Representative examples of the standard notation, with J_n denoting the square n×n case and J_{m,n} the rectangular m×n case, are given below (the sizes shown are illustrative):
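
    J_2 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}, \qquad
    J_3 = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}, \qquad
    J_{2,3} = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}.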

Some sources call the all-ones matrix the unit matrix, [2] but that term may also refer to the identity matrix, a different type of matrix.

A vector of ones or all-ones vector is a matrix of ones in row or column form; it should not be confused with a unit vector, which is a vector of Euclidean length 1.
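
For example, in the 3×3 case the all-ones matrix is the outer product of the all-ones vector with itself, an observation that also accounts for the rank-1 property noted below:

    \mathbf{1}_3 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \qquad
    J_3 = \mathbf{1}_3 \mathbf{1}_3^{\mathsf T}
        = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}.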

Properties

For an n×n matrix of ones J, the following properties hold:

  - The trace of J equals n, and the determinant equals 0 for n ≥ 2, but 1 if n = 1. [note 1]
  - The characteristic polynomial of J is (x − n)x^(n−1), and its minimal polynomial is x^2 − nx = x(x − n).
  - The rank of J is 1, and its eigenvalues are n, with multiplicity 1, and 0, with multiplicity n − 1. [3]
  - J^k = n^(k−1)J for every positive integer k. [4]
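
As a quick sanity check of these identities, here is the n = 2 case worked out explicitly (a routine computation, not taken from the cited sources):

    J_2^2 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}
            \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}
          = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} = 2J_2,
    \qquad
    J_2 \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 1 \end{pmatrix},
    \qquad
    J_2 \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

In general the all-ones vector is an eigenvector of J with eigenvalue n, and every vector whose entries sum to zero lies in the kernel, which accounts for the eigenvalue 0 with multiplicity n − 1.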

When J is considered as a matrix over the real numbers, the following additional properties hold:

  - J is positive semi-definite.
  - The matrix J/n is idempotent. [5]
  - The matrix exponential of J is exp(J) = I + ((e^n − 1)/n)J. [6]
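
The exponential formula can be recovered in one line from the identity J^k = n^(k−1)J above (a standard derivation, included here for completeness):

    \exp(J) = I + \sum_{k \ge 1} \frac{J^k}{k!}
            = I + \Bigl( \sum_{k \ge 1} \frac{n^{k-1}}{k!} \Bigr) J
            = I + \frac{e^n - 1}{n} J.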

Applications

The all-ones matrix arises in the mathematical field of combinatorics, particularly in the application of algebraic methods to graph theory. For example, if A is the adjacency matrix of an n-vertex undirected graph G, and J is the all-ones matrix of the same dimension, then G is a regular graph if and only if AJ = JA. [7] As a second example, the all-ones matrix appears in some linear-algebraic proofs of Cayley's formula, which gives the number of spanning trees of a complete graph, using the matrix tree theorem.
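
Both statements are easy to check numerically. The sketch below is illustrative only (the graph choices and the use of NumPy are not from the cited sources); it verifies the commutation criterion on a regular and a non-regular graph, and evaluates a Laplacian cofactor of the complete graph K_5:

    import numpy as np

    # Regularity criterion: (AJ)_ij = deg(i) and (JA)_ij = deg(j),
    # so A commutes with J exactly when all degrees are equal.
    C4 = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]])   # 4-cycle: 2-regular
    P3 = np.array([[0, 1, 0],
                   [1, 0, 1],
                   [0, 1, 0]])      # path on 3 vertices: not regular
    print(np.array_equal(C4 @ np.ones((4, 4)), np.ones((4, 4)) @ C4))  # True
    print(np.array_equal(P3 @ np.ones((3, 3)), np.ones((3, 3)) @ P3))  # False

    # Cayley's formula via the matrix tree theorem: the Laplacian of the
    # complete graph K_n is nI - J, and any cofactor equals n^(n-2).
    n = 5
    L = n * np.eye(n) - np.ones((n, n))
    cofactor = np.linalg.det(L[1:, 1:])   # delete the first row and column
    print(round(cofactor), n ** (n - 2))  # 125 125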

References

  1. Horn, Roger A.; Johnson, Charles R. (2012), "0.2.8 The all-ones matrix and vector", Matrix Analysis, Cambridge University Press, p. 8, ISBN 9780521839402.
  2. Weisstein, Eric W. "Unit Matrix". MathWorld.
  3. Stanley, Richard P. (2013), Algebraic Combinatorics: Walks, Trees, Tableaux, and More, Springer, Lemma 1.4, p. 4, ISBN 9781461469988.
  4. Stanley (2013); Horn & Johnson (2012), p. 65.
  5. Timm, Neil H. (2002), Applied Multivariate Analysis, Springer Texts in Statistics, Springer, p. 30, ISBN 9780387227719.
  6. Smith, Jonathan D. H. (2011), Introduction to Abstract Algebra, CRC Press, p. 77, ISBN 9781420063721.
  7. Godsil, Chris (1993), Algebraic Combinatorics, CRC Press, Lemma 4.1, p. 25, ISBN 9780412041310.

Notes

  1. One may also consider the case n = 0, in which case the empty matrix is vacuously an all-ones matrix, also with determinant 1.[citation needed]