Affine involution

In Euclidean geometry, of special interest are involutions which are linear or affine transformations over the Euclidean space Rⁿ. Such involutions are easy to characterize and can be described geometrically.[1]

Linear involutions

To give a linear involution is the same as giving an involutory matrix, a square matrix A such that

A² = I,     (1)

where I is the identity matrix.

It is a quick check that a square matrix D whose elements are all zero off the main diagonal and ±1 on the diagonal, that is, a signature matrix of the form

D = diag(±1, ±1, …, ±1),

satisfies (1), i.e. is the matrix of a linear involution. It turns out that all the matrices satisfying (1) are of the form

A = U⁻¹DU,

where U is invertible and D is as above. That is to say, the matrix of any linear involution is similar to a matrix of the form D. Geometrically, this means that any linear involution can be obtained by taking oblique reflections against any number from 0 through n hyperplanes going through the origin. (The term oblique reflection as used here includes ordinary reflections.)
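As a numerical sanity check, here is a minimal sketch (assuming Python with NumPy; the particular U and D are arbitrary illustrative choices) that builds a matrix of the form U⁻¹DU and confirms it satisfies (1):

```python
import numpy as np

# An arbitrary invertible matrix U (any invertible choice works).
U = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# A signature matrix D: +/-1 on the diagonal, zero elsewhere.
D = np.diag([1.0, -1.0, -1.0])

# A = U^{-1} D U is then the matrix of a linear involution.
A = np.linalg.inv(U) @ D @ U

# A squared is the identity, up to floating-point error.
assert np.allclose(A @ A, np.eye(3))
```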

One can easily verify that A represents a linear involution if and only if A has the form

A = ±(2P − I)

for a linear projection P. Indeed, if P² = P, then (2P − I)² = 4P² − 4P + I = I, and the same holds for the opposite sign.
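For a concrete instance (a sketch; the orthogonal projection onto the xy-plane is just one convenient choice of P), the identity (2P − I)² = I can be checked numerically:

```python
import numpy as np

# P projects onto the xy-plane: P @ P == P, so P is a linear projection.
P = np.diag([1.0, 1.0, 0.0])
assert np.allclose(P @ P, P)

# A = 2P - I is a linear involution (here: reflection in the xy-plane).
A = 2 * P - np.eye(3)
assert np.allclose(A @ A, np.eye(3))

# The opposite sign -(2P - I) is an involution as well.
assert np.allclose((-A) @ (-A), np.eye(3))
```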

Affine involutions

If A represents a linear involution, then x ↦ A(x − b) + b is an affine involution. One can check that any affine involution in fact has this form. Geometrically this means that any affine involution can be obtained by taking oblique reflections against any number from 0 through n hyperplanes going through a point b.
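A short sketch (with hypothetical choices: A the reflection in the xy-plane and b an arbitrary point) illustrates that applying x ↦ A(x − b) + b twice returns every point to itself:

```python
import numpy as np

# A linear involution: reflection in the xy-plane.
A = np.diag([1.0, 1.0, -1.0])
b = np.array([2.0, 0.0, 5.0])   # an arbitrary fixed point

def f(x):
    """Affine involution x -> A(x - b) + b."""
    return A @ (x - b) + b

x = np.array([1.0, -4.0, 3.0])
# Applying f twice returns the original point.
assert np.allclose(f(f(x)), x)
```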

Affine involutions can be categorized by the dimension of the affine space of fixed points; this corresponds to the number of values 1 on the diagonal of the similar matrix D (see above), i.e., the dimension of the eigenspace for eigenvalue 1.

The affine involutions in 3D, by the dimension of their space of fixed points, are:

- the identity (every point is fixed),
- oblique reflection in a plane,
- oblique reflection in a line,
- oblique reflection in a point, i.e. a point reflection.
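This classification can be read off numerically from the eigenvalues: counting the eigenvalues equal to 1 gives the dimension of the space of fixed points (an involutory matrix is diagonalizable, so algebraic and geometric multiplicities agree). A minimal sketch, using representative diagonal matrices for the four 3D cases:

```python
import numpy as np

def fixed_space_dimension(A, tol=1e-9):
    """Dimension of the eigenspace of A for eigenvalue 1,
    i.e. of the space of fixed points of x -> A x."""
    eigenvalues = np.linalg.eigvals(A)
    return int(np.sum(np.abs(eigenvalues - 1.0) < tol))

# Point reflection (0), reflection in a line (1),
# reflection in a plane (2), identity (3).
for A in [-np.eye(3),
          np.diag([1.0, -1.0, -1.0]),
          np.diag([1.0, 1.0, -1.0]),
          np.eye(3)]:
    print(fixed_space_dimension(A))   # prints 0, 1, 2, 3
```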

Isometric involutions

If the eigenspace for eigenvalue 1 is the orthogonal complement of the eigenspace for eigenvalue −1, i.e., every eigenvector with eigenvalue 1 is orthogonal to every eigenvector with eigenvalue −1, then the affine involution is an isometry. The two extreme cases for which this always applies are the identity function and inversion in a point.

The other involutive isometries are inversion in a line (in 2D, 3D, and up; in 2D this is a reflection, and in 3D a rotation about the line by 180°), inversion in a plane (in 3D and up; in 3D this is a reflection in a plane), inversion in a 3D space (in 3D itself: the identity), etc.
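The contrast between orthogonal and oblique reflections can be made concrete in 2D (a sketch with hypothetical example matrices): in orthonormal coordinates, a linear involution is an isometry exactly when its matrix is orthogonal.

```python
import numpy as np

# Orthogonal reflection in the x-axis: the eigenspaces for eigenvalues
# 1 and -1 are orthogonal, so the involution is an isometry.
A_iso = np.diag([1.0, -1.0])
assert np.allclose(A_iso.T @ A_iso, np.eye(2))   # orthogonal matrix

# Oblique reflection: fixes the x-axis but flips along the direction
# (1, 1) instead of (0, 1). Still an involution, but not an isometry.
A_oblique = np.array([[1.0, -2.0],
                      [0.0, -1.0]])
assert np.allclose(A_oblique @ A_oblique, np.eye(2))
assert not np.allclose(A_oblique.T @ A_oblique, np.eye(2))
```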


References

  1. Books LLC (2010). Affine Geometry: Affine Transformation, Hyperplane, Ceva's Theorem, Affine Curvature, Barycentric Coordinates, Centroid, Affine Space. General Books LLC. ISBN 9781155313931.
  2. Marberg, Eric; Zhang, Yifeng (March 2022). "Affine transitions for involution Stanley symmetric functions". European Journal of Combinatorics. 101: 103463. arXiv:1812.04880. doi:10.1016/j.ejc.2021.103463.