In the mathematical field of graph theory, Kirchhoff's theorem or Kirchhoff's matrix tree theorem, named after Gustav Kirchhoff, is a theorem about the number of spanning trees in a graph, showing that this number can be computed in polynomial time as the determinant of a submatrix of the graph's Laplacian matrix; specifically, the number is equal to any cofactor of the Laplacian matrix. Kirchhoff's theorem is a generalization of Cayley's formula, which provides the number of spanning trees in a complete graph.
Kirchhoff's theorem relies on the notion of the Laplacian matrix of a graph, which is equal to the difference between the graph's degree matrix (the diagonal matrix of vertex degrees) and its adjacency matrix (a (0,1)-matrix whose (i, j) entry is 1 when vertices i and j are adjacent and 0 otherwise).
For a given connected graph G with n labeled vertices, let λ1, λ2, ..., λn−1 be the non-zero eigenvalues of its Laplacian matrix. Then the number of spanning trees of G is

t(G) = \frac{1}{n} \lambda_1 \lambda_2 \cdots \lambda_{n-1}.
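As a quick illustration of the eigenvalue form of the theorem, the following sketch (assuming the NumPy library is available) computes the number of spanning trees of the triangle K3, whose Laplacian has non-zero eigenvalues 3 and 3, so that t(K3) = (3 · 3)/3 = 3:

```python
import numpy as np

# Laplacian of the triangle K_3: degree matrix minus adjacency matrix.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
L = np.diag(A.sum(axis=1)) - A

eigenvalues = np.linalg.eigvalsh(L)   # ascending: [0, 3, 3]
nonzero = eigenvalues[1:]             # drop the single zero eigenvalue

n = L.shape[0]
print(round(np.prod(nonzero) / n))    # -> 3 spanning trees
```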
An English translation of Kirchhoff's original 1847 paper was made by J. B. O'Toole and published in 1958. [1]
First, construct the Laplacian matrix Q for the example diamond graph G, labeling its vertices 1, ..., 4 so that vertices 1 and 4 are the two vertices of degree 2:

Q = \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \end{pmatrix}
Next, construct a matrix Q* by deleting any row and any column from Q. For example, deleting row 1 and column 1 yields

Q^{*} = \begin{pmatrix} 3 & -1 & -1 \\ -1 & 3 & -1 \\ -1 & -1 & 2 \end{pmatrix}
Finally, take the determinant of Q* to obtain t(G), which is 8 for the diamond graph. (Notice t(G) is the (1,1)-cofactor of Q in this example.)
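A minimal computational sketch of this example, assuming the NumPy library is available and the vertex labeling used above:

```python
import numpy as np

# Laplacian of the diamond graph, with vertices 1 and 4 the two vertices of degree 2.
Q = np.array([[ 2, -1, -1,  0],
              [-1,  3, -1, -1],
              [-1, -1,  3, -1],
              [ 0, -1, -1,  2]])

# Delete row 1 and column 1 (index 0) to obtain Q*.
Q_star = np.delete(np.delete(Q, 0, axis=0), 0, axis=1)

# The determinant of Q* (any cofactor of Q) is the number of spanning trees.
print(round(np.linalg.det(Q_star)))   # -> 8
```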
(The proof below is based on the Cauchy–Binet formula. An elementary induction argument for Kirchhoff's theorem can be found on page 654 of Moore (2011). [2] )
First, notice that the Laplacian matrix has the property that the sum of its entries along any row and along any column is 0. Thus we can transform any minor into any other minor by adding rows and columns, switching them, and multiplying a row or a column by −1. Hence the cofactors are the same up to sign, and it can be verified that, in fact, they all have the same sign.
We proceed to show that the determinant of the minor M_{11} counts the number of spanning trees. Let n be the number of vertices of the graph, and m the number of its edges. The incidence matrix E is an n-by-m matrix, which may be defined as follows: suppose that (i, j) is the kth edge of the graph, and that i < j. Then E_{ik} = 1, E_{jk} = −1, and all other entries in column k are 0 (see oriented incidence matrix for understanding this modified incidence matrix E). For the preceding example (with n = 4, m = 5, and the edges ordered (1,2), (1,3), (2,3), (2,4), (3,4)):

E = \begin{pmatrix} 1 & 1 & 0 & 0 & 0 \\ -1 & 0 & 1 & 1 & 0 \\ 0 & -1 & -1 & 0 & 1 \\ 0 & 0 & 0 & -1 & -1 \end{pmatrix}
Recall that the Laplacian L can be factored into the product of the incidence matrix and its transpose, i.e., L = EE^T. Furthermore, let F be the matrix E with its first row deleted, so that FF^T = M_{11}.
Now the Cauchy–Binet formula allows us to write

\det(M_{11}) = \det(FF^T) = \sum_S \det(F_S)\det(F_S^T) = \sum_S \det(F_S)^2,
where S ranges across subsets of [m] of size n − 1, and F_S denotes the (n − 1)-by-(n − 1) matrix whose columns are those of F with index in S. Then every S specifies n − 1 edges of the original graph, and it can be shown that those edges induce a spanning tree if and only if the determinant of F_S is +1 or −1, and that they do not induce a spanning tree if and only if the determinant is 0. This completes the proof.
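The two facts used in this argument can be checked numerically on the example; a sketch assuming NumPy, the incidence matrix above, and the edge ordering (1,2), (1,3), (2,3), (2,4), (3,4):

```python
import numpy as np
from itertools import combinations

# Oriented incidence matrix of the diamond graph (n = 4, m = 5).
E = np.array([[ 1,  1,  0,  0,  0],
              [-1,  0,  1,  1,  0],
              [ 0, -1, -1,  0,  1],
              [ 0,  0,  0, -1, -1]])

F = E[1:, :]          # delete the first row
M11 = F @ F.T         # equals the Laplacian with row 1 and column 1 removed

total = 0
for S in combinations(range(5), 3):       # subsets of [m] of size n - 1
    d = round(np.linalg.det(F[:, S]))     # det(F_S) is -1, 0 or +1
    total += d * d                        # the squares count spanning trees

print(total, round(np.linalg.det(M11)))   # -> 8 8
```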
Cayley's formula follows from Kirchhoff's theorem as a special case, since every vector with 1 in one place, −1 in another place, and 0 elsewhere is an eigenvector of the Laplacian matrix of the complete graph, with the corresponding eigenvalue being n. These vectors together span a space of dimension n − 1, so there are no other non-zero eigenvalues, and the theorem gives t(K_n) = n^{n−1}/n = n^{n−2}.
Alternatively, note that as Cayley's formula counts the number of distinct labeled trees of a complete graph K_n, we need to compute any cofactor of the Laplacian matrix of K_n. The Laplacian matrix in this case is

\begin{pmatrix} n-1 & -1 & \cdots & -1 \\ -1 & n-1 & \cdots & -1 \\ \vdots & \vdots & \ddots & \vdots \\ -1 & -1 & \cdots & n-1 \end{pmatrix}
Any cofactor of the above matrix is n^{n−2}, which is Cayley's formula.
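A short symbolic check of this cofactor computation, assuming the SymPy library is available (the function name cayley_cofactor is only illustrative):

```python
import sympy as sp

def cayley_cofactor(n):
    # Laplacian of K_n: n - 1 on the diagonal, -1 everywhere else.
    L = sp.Matrix(n, n, lambda i, j: n - 1 if i == j else -1)
    # The (1,1)-cofactor: delete the first row and column and take the determinant.
    return L.cofactor(0, 0)

for n in range(2, 7):
    print(n, cayley_cofactor(n), n**(n - 2))   # the last two columns agree
```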
Kirchhoff's theorem holds for multigraphs as well; the matrix Q is modified as follows: the entry q_{ij} for i ≠ j equals −m, where m is the number of edges between vertices i and j, and the diagonal entry q_{ii} equals the degree of vertex i, with all loops excluded (loops do not affect the number of spanning trees).
The analogue of Cayley's formula for the complete multigraph with m edges between every pair of its n vertices is m^{n−1}(n^{n−1} − (n−1)n^{n−2}) = m^{n−1} n^{n−2}, obtained by the same method as above; a simple graph is a multigraph with m = 1.
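The multigraph count can be verified the same way; a sketch assuming SymPy, with the helper name multigraph_spanning_trees purely illustrative:

```python
import sympy as sp

def multigraph_spanning_trees(n, m):
    # Laplacian of the complete multigraph: m(n - 1) on the diagonal, -m elsewhere.
    Q = sp.Matrix(n, n, lambda i, j: m * (n - 1) if i == j else -m)
    return Q.cofactor(0, 0)

for n in range(2, 6):
    for m in range(1, 4):
        assert multigraph_spanning_trees(n, m) == m**(n - 1) * n**(n - 2)
print("all checks passed")
```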
Kirchhoff's theorem can be strengthened by altering the definition of the Laplacian matrix. Rather than merely counting edges emanating from each vertex or connecting a pair of vertices, label each edge with an indeterminate and let the (i, j)-th entry of the modified Laplacian matrix be the negative of the sum of the indeterminates corresponding to edges between the i-th and j-th vertices when i does not equal j, and the sum of all indeterminates corresponding to edges emanating from the i-th vertex when i equals j.
The determinant of the matrix obtained from this modified Laplacian by deleting any row and column (just as the number of spanning trees is found from the original Laplacian matrix) is then a homogeneous polynomial (the Kirchhoff polynomial) in the indeterminates corresponding to the edges of the graph. After collecting terms and performing all possible cancellations, each monomial in the resulting expression represents a spanning tree consisting of the edges corresponding to the indeterminates appearing in that monomial. In this way, one can obtain an explicit enumeration of all the spanning trees of the graph simply by computing the determinant.
For a proof of this version of the theorem see Bollobás (1998). [3]
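A sketch of this weighted version for the diamond graph, assuming SymPy and one indeterminate per edge; each monomial of the expanded cofactor names the edge set of one spanning tree:

```python
import sympy as sp

# Diamond graph on vertices 0..3 (vertices 0 and 3 have degree 2), one variable per edge.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
x = sp.symbols('x1:6')                    # x1, ..., x5

# Modified Laplacian: -x_e off the diagonal, sum of incident edge variables on it.
Q = sp.zeros(4, 4)
for (i, j), xe in zip(edges, x):
    Q[i, j] -= xe
    Q[j, i] -= xe
    Q[i, i] += xe
    Q[j, j] += xe

kirchhoff_poly = sp.expand(Q.cofactor(0, 0))
print(kirchhoff_poly)                            # 8 monomials, one per spanning tree
print(kirchhoff_poly.subs({xe: 1 for xe in x}))  # -> 8, the number of spanning trees
```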
The spanning trees of a graph form the bases of a graphic matroid, so Kirchhoff's theorem provides a formula to count the number of bases in a graphic matroid. The same method may also be used to count the number of bases in regular matroids, a generalization of the graphic matroids (Maurer 1976).
Kirchhoff's theorem can be modified to count the number of oriented spanning trees in directed multigraphs. The matrix Q is constructed as follows: the entry q_{ij} for distinct i and j equals −m, where m is the number of edges from i to j, and the entry q_{ii} equals the indegree of vertex i minus the number of loops at i.
The number of oriented spanning trees rooted at a vertex i (with every edge directed away from the root) is then the determinant of the matrix obtained by removing the ith row and column of Q.
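A small sketch, assuming NumPy and the indegree convention described above, counting the oriented spanning trees of the directed triangle 1 → 2 → 3 → 1 rooted at each vertex (the helper oriented_spanning_trees is illustrative):

```python
import numpy as np

# Directed triangle with edges 1->2, 2->3, 3->1 (0-indexed below).
edges = [(0, 1), (1, 2), (2, 0)]
n = 3

# q_ij = -(number of edges from i to j) for i != j; q_ii = indegree of vertex i.
Q = np.zeros((n, n), dtype=int)
for i, j in edges:
    Q[i, j] -= 1
    Q[j, j] += 1

def oriented_spanning_trees(Q, root):
    # Delete the root's row and column and take the determinant.
    M = np.delete(np.delete(Q, root, axis=0), root, axis=1)
    return round(np.linalg.det(M))

print([oriented_spanning_trees(Q, r) for r in range(n)])   # -> [1, 1, 1]
```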
Kirchhoff's theorem can be generalized to count k-component spanning forests in an unweighted graph. [4] A k-component spanning forest is a subgraph with k connected components that contains all vertices and is cycle-free, i.e., there is at most one path between each pair of vertices. Given such a forest F with connected components F_1, ..., F_k, define its weight w(F) = |V(F_1)| · |V(F_2)| ⋯ |V(F_k)| to be the product of the numbers of vertices in its components. Then

\sum_F w(F) = q_k,
where the sum is over all k-component spanning forests and q_k is the coefficient of x^k in the polynomial

(x + \lambda_1)(x + \lambda_2) \cdots (x + \lambda_{n-1})\, x.
The last factor in the polynomial is due to the zero eigenvalue λ_n = 0. More explicitly, the number q_k can be computed as

q_k = \sum_{S \subseteq \{1, \ldots, n-1\},\ |S| = n-k} \ \prod_{i \in S} \lambda_i,
where the sum is over all (n−k)-element subsets of {1, ..., n−1}. For example,

q_{n-1} = \lambda_1 + \lambda_2 + \cdots + \lambda_{n-1} = \operatorname{tr} Q = 2|E|, \qquad q_1 = \lambda_1 \lambda_2 \cdots \lambda_{n-1} = n\, t(G).
Since a spanning forest with n−1 components corresponds to a single edge, the k = n−1 case states that the sum of the eigenvalues of Q is twice the number of edges. The k = 1 case corresponds to the original Kirchhoff theorem since the weight of every spanning tree is n.
The proof can be done analogously to the proof of Kirchhoff's theorem. An invertible (n−k)-by-(n−k) submatrix of the incidence matrix corresponds bijectively to a k-component spanning forest together with a choice of vertex for each component.
The coefficients q_k are, up to sign, the coefficients of the characteristic polynomial of Q.
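These identities can be checked on the diamond graph from the example above; a sketch assuming NumPy:

```python
import numpy as np
from itertools import combinations

# Laplacian of the diamond graph (n = 4, |E| = 5, t(G) = 8).
Q = np.array([[ 2, -1, -1,  0],
              [-1,  3, -1, -1],
              [-1, -1,  3, -1],
              [ 0, -1, -1,  2]])
n = Q.shape[0]
lam = np.linalg.eigvalsh(Q)[1:]          # the n - 1 non-zero eigenvalues: 2, 4, 4

def q(k):
    # q_k = sum over (n-k)-element subsets S of {1,...,n-1} of the product of lambda_i, i in S.
    return round(sum(np.prod([lam[i] for i in S])
                     for S in combinations(range(n - 1), n - k)))

print(q(n - 1))   # -> 10 = 2|E|
print(q(1))       # -> 32 = n * t(G) = 4 * 8
```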