Packing in a hypergraph


In mathematics, a packing in a hypergraph is a partition of the set of the hypergraph's edges into a number of disjoint subsets such that no two edges in the same subset share any vertex. There are two famous algorithms that achieve an asymptotically optimal packing in k-uniform hypergraphs. One of them is a random greedy algorithm proposed by Joel Spencer, who used a branching process to formally prove the optimal achievable bound under some side conditions. The other algorithm is called the Rödl nibble and was proposed by Vojtěch Rödl et al., who showed that the packing achievable by the Rödl nibble is in some sense close to that of the random greedy algorithm.


History

The problem of finding the number of such subsets in a k-uniform hypergraph was originally motivated by a conjecture of Paul Erdős and Haim Hanani in 1963. Vojtěch Rödl proved their conjecture asymptotically under certain conditions in 1985. Pippenger and Joel Spencer generalized Rödl's results using a random greedy algorithm in 1989.

Definition and terminology

In the following definitions, the hypergraph is denoted by H=(V,E). H is called a k-uniform hypergraph if every edge in E consists of exactly k vertices.

A set P of edges of H is a hypergraph packing if no pair of distinct edges in P has a common vertex.

H is a (D, ε)-good hypergraph if there exists a D, with D → ∞, such that both of the following conditions hold for every vertex x ∈ V and every pair of distinct vertices x, y ∈ V:

  (1 - ε)D < deg(x) < (1 + ε)D
  codeg(x, y) < εD

where the degree deg(x) of a vertex x is the number of edges that contain x, and the codegree codeg(x, y) of two distinct vertices x and y is the number of edges that contain both vertices.
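
As a concrete illustration (not part of the original article), the short sketch below computes degrees and codegrees for a hypergraph given as a collection of vertex sets and checks the two conditions above; the function names degrees_and_codegrees and is_good are ours.

```python
from itertools import combinations

def degrees_and_codegrees(vertices, edges):
    """Compute deg(x) for every vertex and codeg(x, y) for every vertex pair."""
    deg = {x: 0 for x in vertices}
    codeg = {pair: 0 for pair in combinations(sorted(vertices), 2)}
    for e in edges:
        for x in e:
            deg[x] += 1
        for pair in combinations(sorted(e), 2):
            codeg[pair] += 1
    return deg, codeg

def is_good(vertices, edges, D, eps):
    """Check the (D, eps)-good conditions:
    (1 - eps)D < deg(x) < (1 + eps)D  and  codeg(x, y) < eps * D."""
    deg, codeg = degrees_and_codegrees(vertices, edges)
    return (all((1 - eps) * D < d < (1 + eps) * D for d in deg.values())
            and all(c < eps * D for c in codeg.values()))
```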

Theorem

There exists an asymptotic packing P of size at least (n/k)(1 - o(1)) for a k-uniform hypergraph under the following two conditions,

  1. all vertices have degree D(1 + o(1)), where D tends to infinity, and
  2. every pair of distinct vertices shares only o(D) common edges,

where n is the total number of vertices. This result was shown by Pippenger, and a proof was later published by Joel Spencer. To address the asymptotic hypergraph packing problem, Joel Spencer proposed a random greedy algorithm in which a branching process is used as the basis of the analysis; he showed that the algorithm almost always achieves an asymptotically optimal packing under the above side conditions.
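
A standard worked example (ours, not from the original text) is the 3-uniform hypergraph whose vertices are the edges of the complete graph K_n and whose hyperedges are its triangles; both side conditions are immediate:

```latex
% Triangle hypergraph of K_n: vertices are the \binom{n}{2} edges of K_n and
% hyperedges are its triangles, so the hypergraph is 3-uniform.
\begin{align*}
  \deg(x) &= n - 2 = D \to \infty
    && \text{(each edge of $K_n$ lies in $n-2$ triangles)}\\
  \operatorname{codeg}(x,y) &\le 1 = o(D)
    && \text{(two edges of $K_n$ lie in at most one common triangle)}
\end{align*}
% The theorem therefore yields (1 - o(1))\binom{n}{2}/3 \approx n^2/6
% pairwise edge-disjoint triangles in K_n.
```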

Asymptotic packing algorithms

There are two famous algorithms for asymptotic packing of k-uniform hypergraphs: the random greedy algorithm via branching process, and the Rödl nibble.

Random greedy algorithm via branching process

Every edge e ∈ E is independently and uniformly assigned a distinct real "birthtime" t_e ∈ [0, D]. The edges are taken one by one in the order of their birthtimes, and the edge e is accepted and included in P if it does not overlap any previously accepted edge. Obviously, the subset P is a packing, and it can be shown that its size is |P| = (n/k)(1 - o(1)) almost surely. To show this, stop the process of adding new edges at time c and let P_c denote the set of edges accepted so far. For an arbitrary γ > 0, pick c, D and ε such that for any (D, ε)-good hypergraph f_H(c) < γ/2, where f_H(c) denotes the probability that a vertex survives (a vertex survives if it is not contained in any edge of P_c) until time c. In such a situation the expected number of vertices surviving at time c is less than γn/2, so by Markov's inequality the probability that fewer than γn vertices survive is higher than 1/2. In other words, with probability higher than 1/2 the edges of P_c cover at least (1 - γ)n vertices, which means that |P| ≥ (1 - γ)n/k.
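
A minimal sketch of this birthtime-based greedy procedure is given below; it is an illustration rather than Spencer's own formulation, and the helper name random_greedy_packing and the small K_5 example are ours.

```python
import random
from itertools import combinations

def random_greedy_packing(edges, D):
    """Assign each edge a uniform birthtime in [0, D], scan edges in birthtime
    order, and accept an edge if it shares no vertex with any accepted edge."""
    birthtimes = {e: random.uniform(0.0, D) for e in edges}
    packing, covered = [], set()
    for e in sorted(edges, key=birthtimes.get):
        if covered.isdisjoint(e):      # e overlaps no previously accepted edge
            packing.append(e)
            covered.update(e)
    return packing

# Example: the triangles of K_5 as a 3-uniform hypergraph (here D = n - 2 = 3).
triangles = [frozenset(t) for t in combinations(range(5), 3)]
print(random_greedy_packing(triangles, D=3))
```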

To complete the proof, it must be shown that the survival probability can be made arbitrarily small, i.e. that lim_{c→∞} lim_{D→∞} f_H(c) = 0. For that, the asymptotic behavior of a surviving vertex is modeled by a continuous branching process. Fix c > 0 and begin with Eve with birthdate c. Assume time goes backward, so Eve gives birth in the interval [0, c) according to a unit-density Poisson process; the probability of Eve giving exactly q births is e^(-c) c^q / q!. Conditioning on q, the birthtimes are independently and uniformly distributed on [0, c). Every birth given by Eve consists of k - 1 offspring, all with the same birthtime, say x. The process is iterated for each offspring. It can be shown that for every δ > 0 there exists a K so that, with probability higher than 1 - δ, Eve has at most K descendants.

A rooted tree with the notions of parent, child, root, birthorder and wombmate shall be called a broodtree. Given a finite broodtree T, we say of each vertex that it survives or dies: a childless vertex survives, and a vertex dies if and only if it has at least one brood all of whom survive. Let f(c) denote the probability that Eve survives in the broodtree produced by the above process. The objective is to show that lim_{c→∞} f(c) = 0 and then, for any fixed c, that lim_{D→∞} f_H(c) = f(c). These two relations complete the argument.
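
The Monte Carlo sketch below simulates this backward branching process and estimates Eve's survival probability f(c); it is an illustration only, the helper names poisson_points, survives and estimate_f are ours, and the closed form it is compared against is derived in the next paragraph.

```python
import random

def poisson_points(t):
    """Points of a unit-density Poisson process on [0, t), built from exponential gaps."""
    points, s = [], random.expovariate(1.0)
    while s < t:
        points.append(s)
        s += random.expovariate(1.0)
    return points

def survives(birthtime, k):
    """Simulate one vertex of the backward branching process.
    Each birth at time x is a brood of k - 1 offspring with birthtime x;
    a vertex dies iff at least one of its broods consists entirely of survivors."""
    for x in poisson_points(birthtime):
        if all(survives(x, k) for _ in range(k - 1)):
            return False
    return True

def estimate_f(c, k, trials=2000):
    """Monte Carlo estimate of f(c), the probability that Eve (birthdate c) survives."""
    return sum(survives(c, k) for _ in range(trials)) / trials

# Sanity check against the closed form f(c) = (1 + (k-1)c)^(-1/(k-1)) derived below.
c, k = 2.0, 3
print(estimate_f(c, k), (1 + (k - 1) * c) ** (-1.0 / (k - 1)))
```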

To show that lim_{c→∞} f(c) = 0, write f = f(c). For Δc small,

f(c + Δc) - f(c) ≈ -(Δc) f(c)^k,

as, roughly, an Eve starting at time c + Δc might have a birth in the time interval [c, c + Δc) all of whose children survive, while having no births in [0, c) all of whose children survive. Letting Δc → 0 yields the differential equation f'(c) = -f(c)^k. The initial value f(0) = 1 gives the unique solution f(c) = (1 + (k - 1)c)^(-1/(k-1)). Note that indeed f(c) → 0 as c → ∞.
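
For completeness, a routine verification (ours) that the stated function solves this initial-value problem:

```latex
% Check that f(c) = (1 + (k-1)c)^{-1/(k-1)} solves f'(c) = -f(c)^k with f(0) = 1:
\begin{align*}
  f(0)  &= 1,\\
  f'(c) &= -\tfrac{1}{k-1}(k-1)\bigl(1+(k-1)c\bigr)^{-\frac{1}{k-1}-1}
         = -\bigl(1+(k-1)c\bigr)^{-\frac{k}{k-1}} = -f(c)^{k},\\
  \lim_{c\to\infty} f(c) &= 0.
\end{align*}
```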

To prove that lim_{D→∞} f_H(c) = f(c), consider a procedure we call History, which either aborts or produces a broodtree. History maintains a set T of vertices, initially T = {v}; T will have a broodtree structure with v the root. The vertices w ∈ T are either processed or unprocessed; v is initially unprocessed. To each w ∈ T is assigned a birthtime t_w, and we initialize t_v = c. History takes an unprocessed w ∈ T and processes it as follows. Consider the birthtimes t_e of all edges e containing w but containing no vertex that has already been processed. If either some such edge e with t_e < t_w contains a vertex other than w that already lies in T, or some two such edges with birthtimes smaller than t_w share a vertex other than w, then History is aborted. Otherwise, for each such e with t_e < t_w, all vertices of e other than w are added to T as wombmates with parent w and common birthdate t_e. Now w is considered processed. History halts, if not aborted, when all w ∈ T are processed. If History does not abort, then the root v survives in the broodtree T if and only if the vertex v survives at time c in the packing process. For a fixed broodtree T, let g(T) denote the probability that the branching process yields T; then the probability that History does not abort and produces T approaches g(T) as D → ∞. By the finiteness of the branching process, the sum of g(T) over all broodtrees T equals 1, so asymptotically History almost never aborts and the distribution of its broodtree approaches the branching-process distribution. Thus lim_{D→∞} f_H(c) = f(c).

The Rödl nibble

In 1985, Rödl proved the conjecture of Paul Erdős and Hanani by a method that has since been called the Rödl nibble. Rödl's result can be formulated in the form of either a packing or a covering problem. For 2 ≤ l < k < n, the covering number C(n, k, l) denotes the minimal size of a family K of k-element subsets of {1, ..., n} which have the property that every l-element set is contained in at least one A ∈ K. The Erdős–Hanani conjecture was that

lim_{n→∞} C(n, k, l) · (k choose l) / (n choose l) = 1,

where k and l are fixed. This conjecture roughly means that a tactical configuration is asymptotically achievable. One may similarly define the packing number P(n, k, l) as the maximal size of a family K of k-element subsets of {1, ..., n} having the property that every l-element set is contained in at most one A ∈ K.
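
As a concrete instance (ours, not from the original text), the case k = 3, l = 2 concerns covering and packing the pairs of an n-element set by triples, and links the conjecture to Steiner triple systems:

```latex
% Case k = 3, l = 2: covering/packing pairs by triples.  The counting lower bound
% \binom{n}{2}/\binom{3}{2} is met exactly by Steiner triple systems (which exist
% iff n \equiv 1, 3 \pmod 6); Rödl's theorem shows it is met asymptotically for all n:
\[
  C(n,3,2) = (1+o(1))\,\frac{\binom{n}{2}}{\binom{3}{2}} = (1+o(1))\,\frac{n^{2}}{6},
  \qquad
  P(n,3,2) = (1-o(1))\,\frac{n^{2}}{6}.
\]
```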

Packing under the stronger condition

In 1997, Noga Alon, Jeong Han Kim, and Joel Spencer also supplied a good bound on the number of vertices left uncovered under the stronger codegree condition that every pair of distinct vertices has at most one edge in common.

For a k-uniform, D-regular hypergraph on n vertices, if k > 3, there exists a packing P covering all vertices but at most O(n D^(-1/(k-1))). If k = 3, there exists a packing P covering all vertices but at most O(n D^(-1/2) ln^(3/2) D).

This bound is desirable in various applications, such as for Steiner triple systems. A Steiner triple system is a 3-uniform, simple hypergraph in which every pair of vertices is contained in precisely one edge. Since a Steiner triple system is clearly D = (n - 1)/2-regular, the above bound supplies the following asymptotic improvement.

Any Steiner triple system on n vertices contains a packing covering all vertices but at most O(n^(1/2) ln^(3/2) n).
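
The improvement is obtained by substituting D = (n - 1)/2 into the k = 3 bound above; the routine calculation (ours, not spelled out in the original) is:

```latex
% Substituting D = (n-1)/2 into the k = 3 bound O(n D^{-1/2} \ln^{3/2} D):
\[
  n \left(\frac{n-1}{2}\right)^{-1/2} \ln^{3/2}\!\left(\frac{n-1}{2}\right)
  = O\!\bigl(n^{1/2} \ln^{3/2} n\bigr).
\]
```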

This has subsequently been improved to O(log n / log log n) uncovered vertices,[1] with further progress on the closely related Ryser–Brualdi–Stein conjecture.[2]


References

  1. Keevash, Peter; Pokrovskiy, Alexey; Sudakov, Benny; Yepremyan, Liana (2022). "New bounds for Ryser's conjecture and related problems". Transactions of the American Mathematical Society, Series B. 9 (8): 288–321. doi:10.1090/btran/92. hdl:20.500.11850/592212. ISSN 2330-0000.
  2. Montgomery, Richard (2023). "A proof of the Ryser–Brualdi–Stein conjecture for large even n". arXiv:2310.19779 [math.CO].