Normal matrix

In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A*:

    A*A = AA*.
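
For concreteness, the defining condition is easy to test numerically. The following sketch (assuming NumPy; the helper name is_normal and the tolerance are illustrative choices, not a standard API) checks whether a matrix commutes with its conjugate transpose:

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """Check A*A == AA* up to floating-point tolerance (illustrative helper)."""
    A = np.asarray(A, dtype=complex)
    return np.allclose(A.conj().T @ A, A @ A.conj().T, atol=tol)

print(is_normal([[2, 1j], [-1j, 3]]))  # True: Hermitian matrices are normal
print(is_normal([[1, 1], [0, 1]]))     # False: a non-diagonal triangular matrix
```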

The concept of normal matrices can be extended to normal operators on infinite-dimensional normed spaces and to normal elements in C*-algebras. As in the matrix case, normality means commutativity is preserved, to the extent possible, in the noncommutative setting. This makes normal operators, and normal elements of C*-algebras, more amenable to analysis.

The spectral theorem states that a matrix is normal if and only if it is unitarily similar to a diagonal matrix, and therefore any matrix A satisfying the equation A*A = AA* is diagonalizable. The converse does not hold because diagonalizable matrices may have non-orthogonal eigenspaces.
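
A small sketch of why the converse fails (assuming NumPy; the 2 × 2 matrix is an arbitrary illustration): the matrix below has distinct eigenvalues and is therefore diagonalizable, yet its eigenvectors are not orthogonal and it is not normal.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
w, V = np.linalg.eig(A)  # eigenvalues 1 and 2 are distinct => diagonalizable
print(np.allclose(V @ np.diag(w) @ np.linalg.inv(V), A))  # True: diagonalizable
print(np.allclose(A.T @ A, A @ A.T))                      # False: not normal
print(V.T @ V)  # nonzero off-diagonal entries: eigenvectors not orthogonal
```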

The left and right singular vectors in the singular value decomposition of a normal matrix differ only in complex phase from each other and from the corresponding eigenvectors, since the phase must be factored out of the eigenvalues to form singular values.
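
As an illustration, one can construct a normal matrix with prescribed eigenvalues and confirm that its singular values are their moduli (a NumPy sketch; the eigenvalues and random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
Q, _ = np.linalg.qr(X)                    # a random unitary matrix
lam = np.array([2.0, -1.0 + 1.0j, 0.5j])  # prescribed complex eigenvalues
A = Q @ np.diag(lam) @ Q.conj().T         # normal by construction
s = np.linalg.svd(A, compute_uv=False)    # singular values, descending order
print(np.allclose(s, np.sort(np.abs(lam))[::-1]))  # True: sigma_i = |lambda_i|
```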

Special cases

Among complex matrices, all unitary, Hermitian, and skew-Hermitian matrices are normal, with all eigenvalues being of unit modulus, real, and purely imaginary, respectively. Likewise, among real matrices, all orthogonal, symmetric, and skew-symmetric matrices are normal, with all eigenvalues being complex conjugate pairs on the unit circle, real, and purely imaginary, respectively. However, it is not the case that all normal matrices are either unitary or (skew-)Hermitian, as their eigenvalues can in general be any complex numbers. For example,

    A = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{pmatrix}

is neither unitary, Hermitian, nor skew-Hermitian, because its eigenvalues are 2 and (1 ± i√3)/2; yet it is normal because

    AA* = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix} = A*A.

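These claims about the example are straightforward to verify numerically (a NumPy sketch):

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
Ah = A.conj().T
print(np.allclose(A @ Ah, Ah @ A))              # True: A is normal
print(np.allclose(A @ Ah, np.eye(3)))           # False: A is not unitary
print(np.allclose(Ah, A), np.allclose(Ah, -A))  # False, False: not (skew-)Hermitian
print(np.linalg.eigvals(A))                     # 2 and (1 +/- i*sqrt(3))/2
```
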
Consequences

Proposition. A normal triangular matrix is diagonal.

Proof

Let A be any normal upper triangular matrix. Since A is normal, the diagonal entries of A*A and AA* agree:

    (A*A)ii = (AA*)ii.

Equivalently, using the ith unit vector ei to select the ith row and ith column:

    ei*(A*A)ei = ei*(AA*)ei.

The expression

    (Aei)*(Aei) = (A*ei)*(A*ei)

is equivalent, and so is

    ‖Aei‖² = ‖A*ei‖²,

which shows that the ith row must have the same norm as the ith column (Aei is the ith column of A, and A*ei is the conjugate of the ith row).

Consider i = 1. Because A is upper triangular, the only entry of column 1 that can be nonzero is a11, so the norm of column 1 is |a11|. Equating this with the norm of row 1 forces entries 2 through n of row 1 to be zero. Repeating the argument for the row–column pairs 2 through n, using the zeros already established at each step, shows that A is diagonal. Q.E.D.

The concept of normality is important because normal matrices are precisely those to which the spectral theorem applies:

Proposition. A matrix A is normal if and only if there exists a diagonal matrix Λ and a unitary matrix U such that A = UΛU*.

The diagonal entries of Λ are the eigenvalues of A, and the columns of U are corresponding eigenvectors: the ith column of U is an eigenvector for the ith diagonal entry of Λ.
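
A NumPy sketch of this proposition on the example above. (A caveat: np.linalg.eig returns an essentially unitary U here because the eigenvalues are distinct; for a normal matrix with repeated eigenvalues, a Schur decomposition is the robust way to obtain one.)

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
lam, U = np.linalg.eig(A)
print(np.allclose(U.conj().T @ U, np.eye(3)))         # True: U is unitary
print(np.allclose(U @ np.diag(lam) @ U.conj().T, A))  # True: A = U Lambda U*
```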

Another way of stating the spectral theorem is to say that normal matrices are precisely those matrices that can be represented by a diagonal matrix with respect to a properly chosen orthonormal basis of Cn. Phrased differently: a matrix is normal if and only if its eigenspaces span Cn and are pairwise orthogonal with respect to the standard inner product of Cn.

The spectral theorem for normal matrices is a special case of the more general Schur decomposition, which holds for all square matrices. Let A be a square matrix. Then, by Schur decomposition, it is unitarily similar to an upper triangular matrix B. If A is normal, so is B. But then B must be diagonal, for, as noted above, a normal upper triangular matrix is diagonal.
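
This can be observed numerically: the complex Schur form of a normal matrix comes out diagonal (a sketch using scipy.linalg.schur):

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
T, Z = schur(A, output='complex')    # A = Z T Z* with T upper triangular
off_diagonal = T - np.diag(np.diag(T))
print(np.allclose(off_diagonal, 0))  # True: triangular + normal => diagonal
```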

The spectral theorem permits the classification of normal matrices in terms of their spectra, for example:

Proposition. A normal matrix is unitary if and only if all of its eigenvalues (its spectrum) lie on the unit circle of the complex plane.

Proposition. A normal matrix is self-adjoint if and only if its spectrum is contained in ℝ. In other words, a normal matrix is Hermitian if and only if all its eigenvalues are real.
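
Both propositions are easy to illustrate on randomly generated unitary and Hermitian matrices (a NumPy sketch; random instances illustrate rather than prove the claims):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(M)  # unitary
H = M + M.conj().T      # Hermitian
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1))  # True: spectrum on unit circle
print(np.allclose(np.linalg.eigvals(H).imag, 0))     # True: spectrum is real
```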

In general, the sum or product of two normal matrices need not be normal. However, the following holds:

Proposition. If A and B are normal with AB = BA, then both AB and A + B are also normal. Furthermore, there exists a unitary matrix U such that UAU* and UBU* are diagonal matrices. In other words, A and B are simultaneously diagonalizable.

In this special case, the columns of U* are eigenvectors of both A and B and form an orthonormal basis in Cn. This follows by combining the theorems that, over an algebraically closed field, commuting matrices are simultaneously triangularizable and a normal matrix is diagonalizable – the added result is that these can both be done simultaneously.
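
Circulant matrices give a concrete illustration: any two of them commute, all are normal, and the columns of the unitary DFT matrix are simultaneous eigenvectors (a sketch using scipy.linalg.circulant and scipy.linalg.dft; the two first columns are arbitrary):

```python
import numpy as np
from scipy.linalg import circulant, dft

A = circulant([1, 1, 0]).astype(complex)
B = circulant([2, 0, 3]).astype(complex)
print(np.allclose(A @ B, B @ A))  # True: A and B commute
U = dft(3, scale='sqrtn')         # unitary DFT matrix; columns are eigenvectors
for M in (A, B):
    D = U.conj().T @ M @ U
    print(np.allclose(D, np.diag(np.diag(D))))  # True, True: both diagonalized
```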

Equivalent definitions

It is possible to give a fairly long list of equivalent definitions of a normal matrix. Let A be an n × n complex matrix. Then the following are equivalent (several of these conditions are checked numerically in the sketch after the list):

  1. A is normal.
  2. A is diagonalizable by a unitary matrix.
  3. There exists a set of eigenvectors of A which forms an orthonormal basis for Cn.
  4. ‖Ax‖ = ‖A*x‖ for every x.
  5. The Frobenius norm of A can be computed from the eigenvalues of A: tr(A*A) = Σj |λj|².
  6. The Hermitian part 1/2(A + A*) and skew-Hermitian part 1/2(A − A*) of A commute.
  7. A* is a polynomial (of degree n − 1) in A.[note 1]
  8. A* = AU for some unitary matrix U.[1]
  9. U and P commute, where A = UP is a polar decomposition of A with a unitary matrix U and a positive semidefinite matrix P.
  10. A commutes with some normal matrix N with distinct eigenvalues.
  11. σi = |λi| for all 1 ≤ i ≤ n, where A has singular values σ1 ≥ ⋯ ≥ σn and eigenvalues indexed so that |λ1| ≥ ⋯ ≥ |λn|.[2]
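
As promised, a NumPy sketch spot-checking conditions 4, 5, and 11 on the running 3 × 3 example (one test vector stands in for "every x"):

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
lam = np.linalg.eigvals(A)
x = np.array([1.0, -2.0, 0.5j])

# (4) ||Ax|| = ||A*x||, checked for one sample vector x
print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(A.conj().T @ x)))
# (5) squared Frobenius norm equals the sum of |lambda_j|^2
print(np.isclose(np.linalg.norm(A, 'fro')**2, np.sum(np.abs(lam)**2)))
# (11) singular values are the moduli of the eigenvalues
print(np.allclose(np.sort(np.linalg.svd(A, compute_uv=False)),
                  np.sort(np.abs(lam))))
```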

Some but not all of the above generalize to normal operators on infinite-dimensional Hilbert spaces. For example, a bounded operator satisfying (9) is only quasinormal.

Normal matrix analogy

It is occasionally useful (but sometimes misleading) to think of the relationships between special kinds of normal matrices as analogous to the relationships between the corresponding kinds of complex numbers of which their eigenvalues are composed. This is because any function of a non-defective matrix acts directly on each of its eigenvalues, and the conjugate transpose of the spectral decomposition A = UΛU* is A* = UΛ*U*, where Λ is the diagonal matrix of eigenvalues (so conjugating A conjugates its eigenvalues). Likewise, if two normal matrices commute and are therefore simultaneously diagonalizable, any operation between these matrices also acts on each corresponding pair of eigenvalues.

As a special case, the complex numbers may be embedded in the normal 2 × 2 real matrices by the mapping

    a + bi ↦ \begin{pmatrix} a & -b \\ b & a \end{pmatrix},

which preserves addition and multiplication. It is easy to check that this embedding respects all of the above analogies.
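
A short NumPy sketch of the check (the helper embed is an illustrative name): addition and multiplication are preserved, and complex conjugation corresponds to transposition.

```python
import numpy as np

def embed(z):
    """Map a + bi to the 2x2 real matrix [[a, -b], [b, a]]."""
    return np.array([[z.real, -z.imag],
                     [z.imag,  z.real]])

z, w = 1 + 2j, 3 - 1j
print(np.allclose(embed(z) + embed(w), embed(z + w)))  # True: addition
print(np.allclose(embed(z) @ embed(w), embed(z * w)))  # True: multiplication
print(np.allclose(embed(z).T, embed(z.conjugate())))   # True: conjugate ~ transpose
```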

Notes

  1. Proof: When A is normal, use Lagrange's interpolation formula to construct a polynomial P such that P(λj) = λj*, where λj are the eigenvalues of A; then P(A) = A*.
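
A NumPy sketch of this note: evaluating the Lagrange-form polynomial at A reproduces A* (this works here because the example's eigenvalues are distinct).

```python
import numpy as np

A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=complex)
lam = np.linalg.eigvals(A)
n = len(lam)
I = np.eye(n, dtype=complex)

# P(A) = sum_j conj(lambda_j) * prod_{k != j} (A - lambda_k I) / (lambda_j - lambda_k)
P_of_A = np.zeros_like(A)
for j in range(n):
    term = lam[j].conjugate() * I
    for k in range(n):
        if k != j:
            term = term @ (A - lam[k] * I) / (lam[j] - lam[k])
    P_of_A += term
print(np.allclose(P_of_A, A.conj().T))  # True: A* is a polynomial in A
```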

Citations

  1. Horn & Johnson (1985), p. 109
  2. Horn & Johnson (1991), p. 157

Sources

Horn, Roger A.; Johnson, Charles R. (1985). Matrix Analysis. Cambridge University Press.
Horn, Roger A.; Johnson, Charles R. (1991). Topics in Matrix Analysis. Cambridge University Press.
