Spectral decomposition

Spectral decomposition is any of several things:

Spectral decomposition for matrix: eigendecomposition of a matrix
Spectral decomposition for linear operator: spectral theorem
Decomposition of spectrum (functional analysis)

Related Research Articles

Principal component analysis: conversion of a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components

Given a collection of points in two, three, or higher dimensional space, a "best fitting" line can be defined as one that minimizes the average squared distance from a point to the line. The next best-fitting line can be similarly chosen from directions perpendicular to the first. Repeating this process yields an orthogonal basis in which different individual dimensions of the data are uncorrelated. These basis vectors are called principal components, and several related procedures are collectively known as principal component analysis (PCA).
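
A minimal NumPy sketch of this procedure, obtaining the principal components from the eigendecomposition of the sample covariance matrix (all names below are illustrative):

```python
import numpy as np

# PCA sketch: principal components are the eigenvectors of the sample
# covariance matrix, ordered by decreasing eigenvalue (variance).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # 200 observations, 3 variables
Xc = X - X.mean(axis=0)                 # center each variable
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: covariance is symmetric
order = np.argsort(eigvals)[::-1]       # sort by decreasing variance
components = eigvecs[:, order]          # columns are principal components
scores = Xc @ components                # data expressed in the new basis
# The scores are uncorrelated: their covariance matrix is (near) diagonal.
```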

In linear algebra, an orthogonal matrix is a square matrix whose columns and rows are orthogonal unit vectors.
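One way to express this is Q^T Q = Q Q^T = I, where I is the identity matrix; equivalently, Q^-1 = Q^T.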

In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized. This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modeled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.
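
For a real symmetric matrix the theorem is constructive: A = Q diag(w) Q^T with Q orthogonal and w real. A minimal NumPy sketch:

```python
import numpy as np

# Spectral theorem for a real symmetric matrix: A = Q diag(w) Q^T with
# Q orthogonal and w real. np.linalg.eigh exploits the symmetry.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
w, Q = np.linalg.eigh(A)                     # real eigenvalues, orthonormal eigenvectors
assert np.allclose(Q @ np.diag(w) @ Q.T, A)  # reconstruct A from its spectral data
assert np.allclose(Q.T @ Q, np.eye(3))       # Q is orthogonal
```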

Symmetric matrix: matrix equal to its transpose

In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, a matrix A is symmetric if A = A^T; equivalently, its entries satisfy a_ij = a_ji for all indices i and j.

In linear algebra, a complex square matrix U is unitary if its conjugate transpose U* is also its inverse, that is, if U*U = UU* = I, where I is the identity matrix.

Singular value decomposition: matrix decomposition

In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix that generalizes the eigendecomposition of a square normal matrix to any matrix via an extension of the polar decomposition.
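
A minimal NumPy sketch, reconstructing a non-square matrix from its factors:

```python
import numpy as np

# SVD of an arbitrary (here non-square) real matrix: A = U diag(s) V^T.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)   # exact reconstruction
# Unlike the eigendecomposition, this works for any matrix, square or not.
```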

In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A*, that is, if A*A = AA*.

In mathematics, the spectrum of a matrix is the set of its eigenvalues. More generally, if T is a linear operator over any finite-dimensional vector space, its spectrum is the set of scalars λ such that T - λI is not invertible. The determinant of the matrix equals the product of its eigenvalues. Similarly, the trace of the matrix equals the sum of its eigenvalues. From this point of view, we can define the pseudo-determinant for a singular matrix to be the product of its nonzero eigenvalues.
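
A small NumPy sketch of these relations, including the pseudo-determinant of a singular matrix:

```python
import numpy as np

# The spectrum of a matrix is its set of eigenvalues; the determinant is
# their product and the trace their sum.
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])                   # singular: one eigenvalue is 0
eigvals = np.linalg.eigvals(A)               # spectrum {2, 0}
assert np.isclose(np.prod(eigvals), np.linalg.det(A))
assert np.isclose(np.sum(eigvals), np.trace(A))
# Pseudo-determinant of this singular matrix: product of nonzero eigenvalues.
pdet = np.prod(eigvals[~np.isclose(eigvals, 0)])   # here 2.0
```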

In the mathematical discipline of linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems.

In linear algebra, a QR decomposition, also known as a QR factorization or QU factorization, is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R. QR decomposition is often used to solve the linear least squares problem and is the basis for a particular eigenvalue algorithm, the QR algorithm.
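
A minimal NumPy sketch of the least-squares use: with A = QR, minimizing ||Ax - b|| reduces to the triangular system Rx = Q^T b:

```python
import numpy as np

# QR decomposition and its standard use in linear least squares.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])                 # overdetermined system
b = np.array([1.0, 2.0, 2.0])
Q, R = np.linalg.qr(A)                     # Q has orthonormal columns, R is upper triangular
x = np.linalg.solve(R, Q.T @ b)            # least-squares solution of Ax = b
assert np.allclose(A, Q @ R)
```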

Jordan normal form: form of a matrix indicating its eigenvalues and their algebraic multiplicities

In linear algebra, a Jordan normal form, also known as a Jordan canonical form or JCF, is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has each non-zero off-diagonal entry equal to 1, immediately above the main diagonal (on the superdiagonal), and identical diagonal entries to the left and below them.
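
A hedged sketch using SymPy's exact jordan_form; the 4x4 example below has a 2x2 Jordan block for the repeated eigenvalue 4:

```python
from sympy import Matrix

# Jordan normal form via SymPy (exact arithmetic). jordan_form returns
# P and J with A = P * J * P**-1, where J is the Jordan matrix.
A = Matrix([[5, 4, 2, 1],
            [0, 1, -1, -1],
            [-1, -1, 3, 0],
            [1, 1, -1, 2]])
P, J = A.jordan_form()
assert A == P * J * P.inv()
# J is diagonal except for a single 1 on the superdiagonal, inside the
# Jordan block belonging to the eigenvalue 4.
```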

In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex matrix as unitarily equivalent to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix.
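
A minimal SciPy sketch of the complex Schur form (the matrix below is real but has complex eigenvalues, so the complex form is requested):

```python
import numpy as np
from scipy.linalg import schur

# Schur decomposition: A = Z T Z^H with Z unitary and T upper triangular;
# the diagonal of T carries the eigenvalues of A.
A = np.array([[0.0, 2.0],
              [-2.0, 1.0]])                # real matrix with complex eigenvalues
T, Z = schur(A, output='complex')          # complex form guarantees triangular T
assert np.allclose(Z @ T @ Z.conj().T, A)  # unitary similarity reproduces A
assert np.allclose(np.tril(T, -1), 0)      # T is upper triangular
```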

In linear algebra, a Hankel matrix, named after Hermann Hankel, is a square matrix in which each ascending skew-diagonal from left to right is constant, e.g.:

[ a b c d e ]
[ b c d e f ]
[ c d e f g ]
[ d e f g h ]
[ e f g h i ]
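
SciPy builds Hankel matrices from a first column and last row; a minimal sketch:

```python
import numpy as np
from scipy.linalg import hankel

# scipy.linalg.hankel fills the matrix from the first column and last row,
# so every ascending skew-diagonal is constant by construction.
H = hankel([1, 2, 3], [3, 4, 5])
assert (H == np.array([[1, 2, 3],
                       [2, 3, 4],
                       [3, 4, 5]])).all()
```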

In mathematics, operator theory is the study of linear operators on function spaces, beginning with differential operators and integral operators. The operators may be presented abstractly by their characteristics, such as bounded linear operators or closed operators, and consideration may be given to nonlinear operators. The study, which depends heavily on the topology of function spaces, is a branch of functional analysis.

In mathematics, the polar decomposition of a square real or complex matrix A is a factorization of the form A = UP, where U is a unitary matrix and P is a positive-semidefinite Hermitian matrix, both square and of the same size.
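
A minimal SciPy sketch of the (right) polar decomposition:

```python
import numpy as np
from scipy.linalg import polar

# Polar decomposition A = U P: U unitary (here orthogonal, since A is real)
# and P positive-semidefinite Hermitian.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
U, P = polar(A)                            # right polar decomposition by default
assert np.allclose(U @ P, A)
assert np.allclose(U.T @ U, np.eye(2))     # U is orthogonal
assert np.allclose(P, P.T)                 # P is symmetric (Hermitian)
```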

In functional analysis, compact operators are linear operators on Banach spaces that map bounded sets to relatively compact sets. In the case of a Hilbert space H, the compact operators are the closure of the finite rank operators in the uniform operator topology. In general, operators on infinite-dimensional spaces feature properties that do not appear in the finite-dimensional case, i.e. for matrices. The compact operators are notable in that they share as much similarity with matrices as one can expect from a general operator. In particular, the spectral properties of compact operators resemble those of square matrices.

In mathematics, a tempered representation of a linear semisimple Lie group is a representation that has a basis whose matrix coefficients lie in the L^p space L^(2+ε)(G) for any ε > 0.

The frequency domain decomposition (FDD) is an output-only system identification technique popular in civil engineering, in particular in structural health monitoring. As an output-only algorithm, it is useful when the input data is unknown. FDD is a modal analysis technique which generates a system realization using the frequency response given (multi-)output data.
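
A minimal sketch of the core FDD step, assuming the measured responses are stored as rows of an array and using SciPy's Welch-type cross-spectral estimator; the function name and parameters are illustrative, not from any particular FDD implementation:

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(outputs, fs, nperseg=1024):
    """First singular value of the spectral density matrix per frequency line."""
    n_ch = outputs.shape[0]
    # Cross power spectral density matrix G(f) between every channel pair.
    freqs, _ = csd(outputs[0], outputs[0], fs=fs, nperseg=nperseg)
    G = np.empty((len(freqs), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(outputs[i], outputs[j], fs=fs, nperseg=nperseg)
    # SVD of G at each frequency; peaks of the first singular value indicate
    # modal frequencies, and the corresponding first singular vector
    # approximates the mode shape there.
    s1 = np.array([np.linalg.svd(Gk, compute_uv=False)[0] for Gk in G])
    return freqs, s1
```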

Polynomial matrices are widely studied in the fields of systems theory and control theory and have seen other uses relating to stable polynomials. In stability theory, spectral factorization has been used to find determinantal matrix representations for bivariate stable polynomials and real zero polynomials. A key tool used to study these is a matrix factorization known as either the Polynomial Matrix Spectral Factorization or the Matrix Fejér–Riesz Theorem.
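
A small numerical illustration of the scalar case, which the matrix theorem generalizes: the nonnegative trigonometric polynomial 2 + 2 cos θ factors as |q(e^(iθ))|^2 with q(z) = 1 + z:

```python
import numpy as np

# Scalar Fejér–Riesz example: p(theta) = 2 + 2*cos(theta) >= 0 factors as
# |q(e^{i*theta})|^2 with q(z) = 1 + z; matrix spectral factorization
# extends this kind of factorization to polynomial matrices.
theta = np.linspace(0.0, 2.0 * np.pi, 9)
p = 2.0 + 2.0 * np.cos(theta)
q = 1.0 + np.exp(1j * theta)               # q evaluated on the unit circle
assert np.allclose(p, np.abs(q) ** 2)
```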