Skew-Hermitian matrix

In linear algebra, a square matrix with complex entries is said to be skew-Hermitian or anti-Hermitian if its conjugate transpose is the negative of the original matrix. [1] That is, the matrix $A$ is skew-Hermitian if it satisfies the relation

$$A^\mathsf{H} = -A,$$

where $A^\mathsf{H}$ denotes the conjugate transpose of the matrix $A$. In component form, this means that

$$a_{ij} = -\overline{a_{ji}}$$

for all indices $i$ and $j$, where $a_{ij}$ is the element in the $i$-th row and $j$-th column of $A$, and the overline denotes complex conjugation.
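To make the definition concrete, here is a minimal NumPy sketch; the helper name `is_skew_hermitian` is ours, not a library function, and NumPy is assumed to be available:

```python
import numpy as np

def is_skew_hermitian(A, tol=1e-12):
    """Check A^H == -A, i.e. a_ij == -conj(a_ji) for all i, j."""
    A = np.asarray(A, dtype=complex)
    # A.conj().T is the conjugate transpose A^H
    return A.shape[0] == A.shape[1] and np.allclose(A.conj().T, -A, atol=tol)

# Any matrix of the form B - B^H is skew-Hermitian (see the
# decomposition section below), which gives an easy test case.
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
print(is_skew_hermitian(B - B.conj().T))  # True
print(is_skew_hermitian(B))               # False (generically)
```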

Skew-Hermitian matrices can be understood as the complex versions of real skew-symmetric matrices, or as the matrix analogue of the purely imaginary numbers. [2] The set of all skew-Hermitian $n \times n$ matrices forms the Lie algebra $\mathfrak{u}(n)$, which corresponds to the Lie group U(n). The concept can be generalized to include linear transformations of any complex vector space with a sesquilinear norm.
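The Lie-algebra statement can be illustrated numerically: the matrix exponential of a skew-Hermitian matrix is unitary. A small sketch, assuming SciPy is available for `scipy.linalg.expm`:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B - B.conj().T          # skew-Hermitian, an element of u(4)

U = expm(A)                 # exponential map into the Lie group U(4)
# U is unitary: U^H U = I
print(np.allclose(U.conj().T @ U, np.eye(4)))  # True
```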

Note that the adjoint of an operator depends on the scalar product considered on the $n$-dimensional complex or real space $K^n$. If $(\cdot \mid \cdot)$ denotes the scalar product on $K^n$, then saying $A$ is skew-adjoint means that for all $u, v \in K^n$ one has $(Au \mid v) = -(u \mid Av)$.
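For the standard inner product $(u \mid v) = u^\mathsf{H} v$ on $\mathbb{C}^n$, this condition can be checked directly. A minimal sketch; note that `np.vdot` conjugates its first argument, which matches this inner product:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B - B.conj().T                      # skew-Hermitian
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# (Au | v) == -(u | Av) for the inner product (x | y) = x^H y
print(np.allclose(np.vdot(A @ u, v), -np.vdot(u, A @ v)))  # True
```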

Imaginary numbers can be thought of as skew-adjoint (since they are like $1 \times 1$ matrices), whereas real numbers correspond to self-adjoint operators.

Example

For example, the following matrix is skew-Hermitian:

$$A = \begin{bmatrix} -i & 2 + i \\ -2 + i & 0 \end{bmatrix}$$

because

$$-A = \begin{bmatrix} i & -2 - i \\ 2 - i & 0 \end{bmatrix} = A^\mathsf{H}.$$
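The same check can be run on this matrix directly, as a one-off NumPy verification:

```python
import numpy as np

A = np.array([[-1j, 2 + 1j],
              [-2 + 1j, 0]])
print(np.allclose(A.conj().T, -A))  # True: A is skew-Hermitian
```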

Properties

  - The eigenvalues of a skew-Hermitian matrix are all purely imaginary (and possibly zero). Furthermore, skew-Hermitian matrices are normal; hence they are diagonalizable, and their eigenvectors for distinct eigenvalues are orthogonal. [3]
  - All entries on the main diagonal of a skew-Hermitian matrix must be purely imaginary. [4]
  - If $A$ and $B$ are skew-Hermitian, then $aA + bB$ is skew-Hermitian for all real scalars $a$ and $b$. [5]
  - $A$ is skew-Hermitian if and only if $iA$ (or, equivalently, $-iA$) is Hermitian. [5]
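These properties are easy to confirm numerically; a minimal sketch checking purely imaginary eigenvalues, purely imaginary diagonal entries, and normality:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B - B.conj().T                         # skew-Hermitian

eigvals = np.linalg.eigvals(A)
print(np.allclose(eigvals.real, 0))        # eigenvalues purely imaginary
print(np.allclose(np.diag(A).real, 0))     # diagonal entries purely imaginary
print(np.allclose(A @ A.conj().T, A.conj().T @ A))  # A is normal
```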

Decomposition into Hermitian and skew-Hermitian

The sum of a square matrix and its conjugate transpose, $C + C^\mathsf{H}$, is Hermitian, and the difference, $C - C^\mathsf{H}$, is skew-Hermitian. This implies that the commutator of two Hermitian matrices is skew-Hermitian. An arbitrary square matrix $C$ can be written as the sum of a Hermitian matrix $A$ and a skew-Hermitian matrix $B$:

$$C = A + B \quad\text{with}\quad A = \frac{1}{2}\left(C + C^\mathsf{H}\right) \quad\text{and}\quad B = \frac{1}{2}\left(C - C^\mathsf{H}\right).$$
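A minimal sketch of this decomposition; the variable names `herm` and `skew` are ours:

```python
import numpy as np

rng = np.random.default_rng(4)
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

herm = (C + C.conj().T) / 2   # Hermitian part
skew = (C - C.conj().T) / 2   # skew-Hermitian part

print(np.allclose(herm.conj().T, herm))   # Hermitian
print(np.allclose(skew.conj().T, -skew))  # skew-Hermitian
print(np.allclose(herm + skew, C))        # reconstructs C
```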

Notes

  1. Horn & Johnson (1985), §4.1.1; Meyer (2000), §3.2
  2. Horn & Johnson (1985), §4.1.2
  3. Horn & Johnson (1985), §2.5.2, §2.5.4
  4. Meyer (2000), Exercise 3.2.5
  5. Horn & Johnson (1985), §4.1.1


References

  - Horn, Roger A.; Johnson, Charles R. (1985). Matrix Analysis. Cambridge University Press.
  - Meyer, Carl D. (2000). Matrix Analysis and Applied Linear Algebra. SIAM.