Hafnian


In mathematics, the hafnian is a scalar function of a symmetric matrix that generalizes the permanent.


The hafnian was named by Eduardo R. Caianiello "to mark the fruitful period of stay in Copenhagen (Hafnia in Latin)." [1]

Definition

The hafnian of a 2n × 2n symmetric matrix A = (A_{i,j}) is defined as

\operatorname{haf}(A) = \sum_{\rho \in P^2_{2n}} \prod_{\{i,j\} \in \rho} A_{i,j},

where P^2_{2n} is the set of all partitions of the set \{1, 2, \dots, 2n\} into subsets of size 2. [2] [3]
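Evaluated literally, the definition enumerates the (2n − 1)!! partitions of the index set into pairs. A brute-force Python sketch (the function name and example matrix are illustrative, and the running time is exponential):

```python
from itertools import permutations
from math import prod

def hafnian(A):
    """Hafnian of a 2n x 2n symmetric matrix A, straight from the definition:
    sum over all partitions of {0, ..., 2n-1} into pairs of the product of the
    corresponding entries.  Each partition is kept exactly once by demanding
    that every pair is internally sorted and the pairs are sorted by first
    element."""
    m = len(A)
    return sum(
        prod(A[s[k]][s[k + 1]] for k in range(0, m, 2))
        for s in permutations(range(m))
        if all(s[k] < s[k + 1] for k in range(0, m, 2))
        and all(s[k] < s[k + 2] for k in range(0, m - 2, 2))
    )

A = [[0, 1, 2, 3],
     [1, 0, 4, 5],
     [2, 4, 0, 6],
     [3, 5, 6, 0]]
# The three pair partitions of {0,1,2,3} give 1*6 + 2*5 + 3*4 = 28.
print(hafnian(A))  # 28
```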

This definition is similar to that of the Pfaffian, but differs in that the signatures of the permutations are not taken into account. Thus the relationship of the hafnian to the Pfaffian is the same as that of the permanent to the determinant. [4]

Basic properties

The hafnian may also be defined as

\operatorname{haf}(A) = \frac{1}{n!\,2^n} \sum_{\sigma \in S_{2n}} \prod_{i=1}^{n} A_{\sigma(2i-1),\sigma(2i)},

where S_{2n} is the symmetric group on \{1, 2, \dots, 2n\}. [5] The two definitions are equivalent because if \sigma \in S_{2n}, then \{\{\sigma(2i-1),\sigma(2i)\} : i \in \{1,\dots,n\}\} is a partition of \{1, \dots, 2n\} into subsets of size 2, and as \sigma ranges over S_{2n}, each such partition is counted exactly n!\,2^n times: n! orderings of the n pairs, and 2 orderings within each pair. Note that this argument relies on the symmetry of A, without which the original definition is not well-defined.
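The counting argument can be confirmed numerically by evaluating both definitions on a random symmetric 4 × 4 matrix (n = 2); a sketch:

```python
import random
from itertools import permutations
from math import prod, factorial

random.seed(1)
n = 2
A = [[0] * (2 * n) for _ in range(2 * n)]
for i in range(2 * n):
    for j in range(i, 2 * n):
        A[i][j] = A[j][i] = random.randint(0, 9)  # random symmetric matrix

# First definition: one canonical permutation per partition into pairs.
partition_sum = sum(
    prod(A[s[2 * i]][s[2 * i + 1]] for i in range(n))
    for s in permutations(range(2 * n))
    if all(s[2 * i] < s[2 * i + 1] for i in range(n))
    and all(s[2 * i] < s[2 * i + 2] for i in range(n - 1))
)

# Second definition: unrestricted sum over S_{2n}, divided by n! * 2^n.
group_sum = sum(
    prod(A[s[2 * i]][s[2 * i + 1]] for i in range(n))
    for s in permutations(range(2 * n))
) // (factorial(n) * 2 ** n)

print(partition_sum == group_sum)  # True
```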

The hafnian of the adjacency matrix of a graph on 2n vertices is the number of perfect matchings (also known as 1-factors) in the graph. This is because a partition of \{1, \dots, 2n\} into subsets of size 2 can also be thought of as a perfect matching in the complete graph K_{2n}.

The hafnian can also be thought of as a generalization of the permanent, since the permanent of an n × n matrix C can be expressed as

\operatorname{per}(C) = \operatorname{haf}\begin{pmatrix} 0 & C \\ C^{\mathsf T} & 0 \end{pmatrix}. [2]

Just as the hafnian counts the number of perfect matchings in a graph given its adjacency matrix, the permanent counts the number of perfect matchings in a bipartite graph given its biadjacency matrix.
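This relation can be verified on a small example with brute-force implementations of both functions (illustrative names, exponential time):

```python
from itertools import permutations
from math import prod

def permanent(C):
    """Permanent by the Leibniz-style sum over all permutations."""
    m = len(C)
    return sum(prod(C[i][s[i]] for i in range(m)) for s in permutations(range(m)))

def hafnian(A):
    """Hafnian as a sum over partitions into pairs (canonical permutations)."""
    m = len(A)
    return sum(
        prod(A[s[k]][s[k + 1]] for k in range(0, m, 2))
        for s in permutations(range(m))
        if all(s[k] < s[k + 1] for k in range(0, m, 2))
        and all(s[k] < s[k + 2] for k in range(0, m - 2, 2))
    )

C = [[1, 2], [3, 4]]          # per(C) = 1*4 + 2*3 = 10
m = len(C)
# Embed C into the symmetric block matrix [[0, C], [C^T, 0]].
M = [[0] * (2 * m) for _ in range(2 * m)]
for i in range(m):
    for j in range(m):
        M[i][m + j] = M[m + j][i] = C[i][j]

print(permanent(C), hafnian(M))  # 10 10
```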

The hafnian is also related to moments of multivariate Gaussian distributions. By Wick's probability theorem, the hafnian of a real symmetric 2n × 2n matrix A may be expressed as

\operatorname{haf}(A) = \operatorname{E}[X_1 X_2 \cdots X_{2n}], \qquad X \sim \mathcal{N}(0, A + \lambda I),

where \lambda is any number large enough to make A + \lambda I positive semi-definite. Note that the hafnian does not depend on the diagonal entries of A, and the expectation on the right-hand side does not depend on \lambda.
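This identity can be sanity-checked by Monte Carlo for a 2 × 2 matrix, where the hafnian is just the off-diagonal entry. The sketch below samples correlated Gaussians via a hand-written Cholesky step (the example matrix, sample size and seed are arbitrary choices):

```python
import random
from math import sqrt

random.seed(0)

# Example: A = [[0, 0.5], [0.5, 0]], so haf(A) = A[0][1] = 0.5.
# With lambda = 1 the covariance is Sigma = A + I = [[1, 0.5], [0.5, 1]],
# whose Cholesky factor gives X1 = g1, X2 = 0.5*g1 + sqrt(0.75)*g2.
N = 200_000
acc = 0.0
for _ in range(N):
    g1, g2 = random.gauss(0, 1), random.gauss(0, 1)
    acc += g1 * (0.5 * g1 + sqrt(0.75) * g2)
estimate = acc / N
print(estimate)  # approximately 0.5
```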

Generating function

Let C = \begin{pmatrix} A & B \\ B^{\mathsf T} & D \end{pmatrix} be an arbitrary 2m × 2m complex symmetric matrix composed of four m × m blocks A, B, B^{\mathsf T} and D. Let z_1, \dots, z_m be a set of m independent variables, and let \hat Z be the antidiagonal block matrix

\hat Z = \begin{pmatrix} 0 & \operatorname{diag}(z_1,\dots,z_m) \\ \operatorname{diag}(z_1,\dots,z_m) & 0 \end{pmatrix}

(each variable is presented twice, one time per nonzero block). Let I denote the 2m × 2m identity matrix. Then the following identity holds: [4]

\frac{1}{\sqrt{\det(I - \hat Z C)}} = \sum_{n_1,\dots,n_m \ge 0} \operatorname{haf}(C_{n_1,\dots,n_m}) \frac{z_1^{n_1} \cdots z_m^{n_m}}{n_1! \cdots n_m!},

where the right-hand side involves hafnians of matrices C_{n_1,\dots,n_m}, whose blocks A_{n_1,\dots,n_m}, B_{n_1,\dots,n_m}, B^{\mathsf T}_{n_1,\dots,n_m} and D_{n_1,\dots,n_m} are built from the blocks A, B, B^{\mathsf T} and D respectively in the way introduced in MacMahon's Master theorem. In particular, A_{n_1,\dots,n_m} is a matrix built by replacing each entry A_{ij} of A with an n_i × n_j block filled with A_{ij}; the same scheme is applied to B, B^{\mathsf T} and D. The sum runs over all m-tuples of non-negative integers (n_1,\dots,n_m), and it is assumed that \operatorname{haf}(C_{0,\dots,0}) = 1.

The identity can be proven [4] by means of multivariate Gaussian integrals and Wick's probability theorem.

The expression on the left-hand side, \det(I - \hat Z C)^{-1/2}, is in fact a multivariate generating function for a series of hafnians, and the right-hand side constitutes its multivariable Taylor expansion in the vicinity of the point z_1 = \dots = z_m = 0. As a consequence of the given relation, the hafnian of a 2m × 2m symmetric matrix C can be represented as the following mixed derivative of order m:

\operatorname{haf}(C) = \left.\frac{\partial^m}{\partial z_1 \cdots \partial z_m} \frac{1}{\sqrt{\det(I - \hat Z C)}}\right|_{z_1 = \dots = z_m = 0}.
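The mixed-derivative representation can be checked numerically for m = 2, approximating the derivative by central finite differences and computing the 4 × 4 determinant by the Leibniz formula (a sketch; the matrix entries and step size are arbitrary):

```python
from itertools import permutations
from math import prod

def det(M):
    """Determinant by the Leibniz formula (adequate for a 4 x 4 matrix)."""
    m = len(M)
    total = 0.0
    for s in permutations(range(m)):
        sign = 1
        for i in range(m):
            for j in range(i + 1, m):
                if s[i] > s[j]:
                    sign = -sign
        total += sign * prod(M[i][s[i]] for i in range(m))
    return total

# A symmetric 4 x 4 matrix C (m = 2); its hafnian has three terms.
C = [[0.0, 0.2, 0.3, 0.1],
     [0.2, 0.0, 0.4, 0.5],
     [0.3, 0.4, 0.0, 0.6],
     [0.1, 0.5, 0.6, 0.0]]

def f(z1, z2):
    """det(I - Z C)^(-1/2), with Z the antidiagonal block matrix of z1, z2."""
    Z = [[0, 0, z1, 0],
         [0, 0, 0, z2],
         [z1, 0, 0, 0],
         [0, z2, 0, 0]]
    M = [[(1.0 if i == j else 0.0) - sum(Z[i][k] * C[k][j] for k in range(4))
          for j in range(4)] for i in range(4)]
    return det(M) ** -0.5

# Central finite-difference approximation of the mixed derivative at z = 0.
h = 1e-3
mixed = (f(h, h) - f(h, -h) - f(-h, h) + f(-h, -h)) / (4 * h * h)
haf_C = C[0][1] * C[2][3] + C[0][2] * C[1][3] + C[0][3] * C[1][2]  # = 0.31
print(round(mixed, 6), haf_C)  # the two values agree
```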

The hafnian generating function identity written above can be considered as a hafnian generalization of MacMahon's Master theorem, which introduces the generating function for matrix permanents and has the following form in terms of the introduced notation:

\frac{1}{\det(I - \operatorname{diag}(z_1,\dots,z_m)\,A)} = \sum_{n_1,\dots,n_m \ge 0} \operatorname{per}(A_{n_1,\dots,n_m}) \frac{z_1^{n_1} \cdots z_m^{n_m}}{n_1! \cdots n_m!}.

Note that MacMahon's Master theorem comes as a simple corollary of the hafnian generating function identity due to the relation \operatorname{per}(B) = \operatorname{haf}\begin{pmatrix} 0 & B \\ B^{\mathsf T} & 0 \end{pmatrix}.

Non-negativity

If A is a Hermitian positive semi-definite n × n matrix and C is a complex symmetric n × n matrix, then

\operatorname{haf}\begin{pmatrix} C & A \\ A^{\mathsf T} & \overline{C} \end{pmatrix} \ge 0,

where \overline{C} denotes the complex conjugate of C. [6]

A simple way to see this when the augmented matrix \begin{pmatrix} A & C \\ \overline{C} & \overline{A} \end{pmatrix} is positive semi-definite is to observe that, by Wick's probability theorem, \operatorname{haf}\begin{pmatrix} C & A \\ A^{\mathsf T} & \overline{C} \end{pmatrix} = \operatorname{E}[Z_1 \cdots Z_n \overline{Z_1} \cdots \overline{Z_n}] = \operatorname{E}[|Z_1 \cdots Z_n|^2] \ge 0 when Z is a complex normal random vector with mean 0, covariance matrix A and relation matrix C.
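A numerical spot check of the non-negativity claim, building A as G G^H (Hermitian positive semi-definite) and a random complex symmetric C, then evaluating the hafnian of the block matrix by brute force (illustrative names; for n = 2 the hafnian reduces to |C_{12}|^2 + A_{11}A_{22} + |A_{12}|^2, which is manifestly non-negative):

```python
import random
from itertools import permutations
from functools import reduce
from operator import mul

random.seed(3)
n = 2

def cgauss():
    return complex(random.gauss(0, 1), random.gauss(0, 1))

# A = G G^H is Hermitian positive semi-definite; C is complex symmetric.
G = [[cgauss() for _ in range(n)] for _ in range(n)]
A = [[sum(G[i][k] * G[j][k].conjugate() for k in range(n)) for j in range(n)]
     for i in range(n)]
C = [[0j] * n for _ in range(n)]
for i in range(n):
    for j in range(i, n):
        C[i][j] = C[j][i] = cgauss()

# Assemble the symmetric block matrix [[C, A], [A^T, conj(C)]].
M = [[0j] * (2 * n) for _ in range(2 * n)]
for i in range(n):
    for j in range(n):
        M[i][j] = C[i][j]
        M[i][n + j] = A[i][j]
        M[n + i][j] = A[j][i]
        M[n + i][n + j] = C[i][j].conjugate()

def hafnian(B):
    """Brute-force hafnian over partitions into pairs (complex entries)."""
    m = len(B)
    return sum(
        reduce(mul, (B[s[k]][s[k + 1]] for k in range(0, m, 2)), 1)
        for s in permutations(range(m))
        if all(s[k] < s[k + 1] for k in range(0, m, 2))
        and all(s[k] < s[k + 2] for k in range(0, m - 2, 2))
    )

h = hafnian(M)
print(h.real >= 0, abs(h.imag) < 1e-9)  # True True
```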

This result is a generalization of the fact that the permanent of a Hermitian positive semi-definite matrix is non-negative. This corresponds to the special case C = 0, using the relation \operatorname{per}(A) = \operatorname{haf}\begin{pmatrix} 0 & A \\ A^{\mathsf T} & 0 \end{pmatrix}.

Loop hafnian

The loop hafnian of an n × n symmetric matrix A is defined as

\operatorname{lhaf}(A) = \sum_{M \in \mathrm{SPM}(n)} \prod_{\{i,j\} \in M} A_{i,j},

where \mathrm{SPM}(n) is the set of all perfect matchings of the complete graph on n vertices with loops, K^*_n, i.e., the set of all ways to partition the set \{1, \dots, n\} into pairs or singletons (treating a singleton \{i\} as the pair \{i, i\}). [7] Thus the loop hafnian depends on the diagonal entries of the matrix, unlike the hafnian. [7] Furthermore, the loop hafnian can be non-zero when n is odd.

The loop hafnian can be used to count the total number of matchings in a graph (perfect or non-perfect), also known as its Hosoya index. Specifically, if one takes the adjacency matrix of a graph and sets the diagonal elements to 1, then the loop hafnian of the resulting matrix is equal to the total number of matchings in the graph. [7]
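This count can be reproduced with a direct recursive evaluation of the loop hafnian; a Python sketch (illustrative name, exponential time), using the path graph on three vertices, whose Hosoya index is 3:

```python
def loop_hafnian(A):
    """Loop hafnian by direct recursion: partition the index set into pairs
    and singletons, a singleton {i} contributing the diagonal entry A[i][i]."""
    def rec(idx):
        if not idx:
            return 1
        first, rest = idx[0], idx[1:]
        total = A[first][first] * rec(rest)          # {first} as a singleton
        for k, j in enumerate(rest):                 # {first, j} as a pair
            total += A[first][j] * rec(rest[:k] + rest[k + 1:])
        return total
    return rec(tuple(range(len(A))))

# Path graph on vertices 0-1-2, with the diagonal set to 1:
adj = [[1, 1, 0],
       [1, 1, 1],
       [0, 1, 1]]
# Its matchings are {}, {0,1} and {1,2}, so the Hosoya index is 3.
print(loop_hafnian(adj))  # 3
```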

The loop hafnian can also be thought of as incorporating a mean into the interpretation of the hafnian as a multivariate Gaussian moment. Specifically, by Wick's probability theorem again, the loop hafnian of a real symmetric n × n matrix A can be expressed as

\operatorname{lhaf}(A) = \operatorname{E}[X_1 X_2 \cdots X_n], \qquad X \sim \mathcal{N}(\mu, \Sigma), \quad \mu_i = A_{ii}, \quad \Sigma = A - \operatorname{diag}(A_{11},\dots,A_{nn}) + \lambda I,

where \lambda is any number large enough to make \Sigma positive semi-definite.

Computation

Computing the hafnian of a (0,1)-matrix is #P-complete, because the permanent of a (0,1)-matrix A, whose computation is #P-complete, equals the hafnian of the (0,1)-matrix \begin{pmatrix} 0 & A \\ A^{\mathsf T} & 0 \end{pmatrix}. [4] [7]

The hafnian of a 2n × 2n matrix can be computed in O(n^3 2^n) time. [7]

If the entries of a matrix are non-negative, then its hafnian can be approximated to within an exponential factor in polynomial time. [8]


References

  1. F. Guerra, in Imagination and Rigor: Essays on Eduardo R. Caianiello's Scientific Heritage, edited by Settimo Termini, Springer Science & Business Media, 2006, page 98
  2. Alexander Barvinok (13 March 2017). Combinatorics and Complexity of Partition Functions. Springer. p. 93. ISBN 9783319518299.
  3. Barvinok, Alexander; Regts, Guus (2019). "Weighted counting of integer points in a subspace". Combinator. Probab. Comp. 28: 696–719. arXiv: 1706.05423 . doi:10.1017/S0963548319000105. S2CID   126185737.
  4. Kocharovsky, Vitaly V.; Kocharovsky, Vladimir V.; Tarasov, Sergey V. (2022). "The Hafnian Master Theorem". Linear Algebra and Its Applications. 651. Elsevier BV: 144–161. doi: 10.1016/j.laa.2022.06.021 . ISSN 0024-3795. S2CID 249935016.
  5. Rudelson, Mark; Samorodnitsky, Alex; Zeitouni, Ofer (2016). "Hafnians, perfect matchings and Gaussian matrices". The Annals of Probability. 44 (4): 2858–2888. arXiv: 1409.3905 . doi:10.1214/15-AOP1036. S2CID   14458608.
  6. Brádler, Kamil; Friedland, Shmuel; Israel, Robert B. (2021-02-24). "Nonnegativity for hafnians of certain matrices". Linear and Multilinear Algebra. 70 (19). Informa UK Limited: 4615–4619. arXiv: 1811.10342 . doi:10.1080/03081087.2021.1892022. ISSN   0308-1087. S2CID   119601142.
  7. Björklund, Andreas; Gupt, Brajesh; Quesada, Nicolás (2018). "A faster hafnian formula for complex matrices and its benchmarking on a supercomputer". arXiv: 1805.12498 [cs.DS].
  8. Barvinok, Alexander (1999). "Polynomial Time Algorithms to Approximate Permanents and Mixed Discriminants Within a Simply Exponential Factor". Random Structures and Algorithms. 14 (1). Wiley: 29–61. doi:10.1002/(sici)1098-2418(1999010)14:1<29::aid-rsa2>3.0.co;2-x. hdl: 2027.42/35110 . ISSN   1042-9832.