Discrepancy of hypergraphs is an area of discrepancy theory that studies the discrepancy of general set systems.
In the classical setting, we aim at partitioning the vertices of a hypergraph $\mathcal{H} = (V, \mathcal{E})$ into two classes in such a way that ideally each hyperedge contains the same number of vertices in both classes. A partition into two classes can be represented by a coloring $\chi \colon V \rightarrow \{-1, +1\}$. We call −1 and +1 colors. The color classes $\chi^{-1}(-1)$ and $\chi^{-1}(+1)$ form the corresponding partition. For a hyperedge $E \in \mathcal{E}$, set
$$\chi(E) := \sum_{v \in E} \chi(v).$$
The discrepancy of $\mathcal{H}$ with respect to $\chi$ and the discrepancy of $\mathcal{H}$ are defined by
$$\operatorname{disc}(\mathcal{H}, \chi) := \max_{E \in \mathcal{E}} |\chi(E)|, \qquad \operatorname{disc}(\mathcal{H}) := \min_{\chi \colon V \rightarrow \{-1, +1\}} \operatorname{disc}(\mathcal{H}, \chi).$$
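As an illustration of these definitions, the following Python sketch computes $\chi(E)$, $\operatorname{disc}(\mathcal{H}, \chi)$ and, by brute force over all $2^n$ colorings, $\operatorname{disc}(\mathcal{H})$; the small hypergraph at the end is chosen arbitrarily for the example.

```python
from itertools import product

def chi_of_edge(coloring, edge):
    # chi(E) = sum of the colors chi(v) over the vertices v of E
    return sum(coloring[v] for v in edge)

def disc_of_coloring(edges, coloring):
    # disc(H, chi) = max over all hyperedges E of |chi(E)|
    return max(abs(chi_of_edge(coloring, e)) for e in edges)

def discrepancy(n, edges):
    # disc(H) = min over all 2^n colorings chi: V -> {-1, +1} of disc(H, chi);
    # brute force, so only feasible for small n
    return min(
        disc_of_coloring(edges, dict(enumerate(colors)))
        for colors in product((-1, 1), repeat=n)
    )

# Small, arbitrarily chosen example hypergraph on the vertex set {0, ..., 4}.
edges = [{0, 1, 2}, {1, 2, 3}, {2, 3, 4}, {0, 2, 4}]
print(discrepancy(5, edges))  # prints 1 for this particular example
```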
These notions, as well as the term 'discrepancy', seem to have appeared for the first time in a paper of Beck.[1] Earlier results on this problem include the famous lower bound on the discrepancy of arithmetic progressions by Roth[2] and upper bounds for this problem and other results by Erdős and Spencer[3][4] and Sárközy.[5]: 39 At that time, discrepancy problems were called quasi-Ramsey problems.
To get some intuition for this concept, let's have a look at a few examples.
- If the edges of $\mathcal{H}$ are pairwise disjoint, the discrepancy is zero if every edge has even cardinality, and one if some edge has odd cardinality.
- At the other extreme, the complete hypergraph $(V, 2^V)$ has discrepancy $\lceil n/2 \rceil$: one of the two color classes always has at least $\lceil n/2 \rceil$ vertices and is itself an edge, and a coloring with color classes of sizes $\lceil n/2 \rceil$ and $\lfloor n/2 \rfloor$ shows that this value is attained.
- A hypergraph can, however, have very many complicatedly intersecting edges and still have discrepancy zero: for $V = \{1, \dots, 4k\}$, take as edges all sets containing equally many vertices from $\{1, \dots, 2k\}$ and from $\{2k+1, \dots, 4k\}$; coloring the first half with +1 and the second half with −1 gives $\chi(E) = 0$ for every edge.
The last example shows that we cannot expect to determine the discrepancy by looking at a single parameter like the number of hyperedges. Still, the size of the hypergraph yields first upper bounds.
1. For any hypergraph $\mathcal{H}$ with n vertices and m edges:
$$\operatorname{disc}(\mathcal{H}) \leq \sqrt{2n \ln(2m)}.$$
The proof is a simple application of the probabilistic method. Let $\chi \colon V \rightarrow \{-1, +1\}$ be a random coloring, i.e. we have
$$\Pr(\chi(v) = -1) = \Pr(\chi(v) = +1) = \frac{1}{2}$$
independently for all $v \in V$. Since $\chi(E) = \sum_{v \in E} \chi(v)$ is a sum of independent −1, +1 random variables, a Chernoff bound gives
$$\Pr(|\chi(E)| > \lambda) < 2 \exp\!\left(-\frac{\lambda^2}{2n}\right)$$
for all $E \subseteq V$ and $\lambda \geq 0$. Taking $\lambda = \sqrt{2n \ln(2m)}$ gives
$$\Pr\bigl(\operatorname{disc}(\mathcal{H}, \chi) > \lambda\bigr) \leq \sum_{E \in \mathcal{E}} \Pr(|\chi(E)| > \lambda) < 2m \exp\!\left(-\frac{\lambda^2}{2n}\right) = 1.$$
Since a random coloring with positive probability has discrepancy at most $\lambda$, in particular there are colorings that have discrepancy at most $\lambda$. Hence $\operatorname{disc}(\mathcal{H}) \leq \lambda = \sqrt{2n \ln(2m)}$.
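The argument can also be checked numerically. The sketch below, an illustration on an arbitrarily chosen random hypergraph rather than part of the proof, samples uniform random colorings and compares the resulting discrepancies with the bound $\sqrt{2n \ln(2m)}$.

```python
import math
import random

def random_coloring_discrepancies(n, edges, trials=100, seed=0):
    # Draw uniform random colorings chi: V -> {-1, +1} and record
    # disc(H, chi) = max_E |sum_{v in E} chi(v)| for each of them.
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        chi = [rng.choice((-1, 1)) for _ in range(n)]
        results.append(max(abs(sum(chi[v] for v in e)) for e in edges))
    return results

# Arbitrary test instance (illustration only): m random edges of size n/2.
n, m = 200, 50
rng = random.Random(1)
edges = [set(rng.sample(range(n), n // 2)) for _ in range(m)]

bound = math.sqrt(2 * n * math.log(2 * m))
discs = random_coloring_discrepancies(n, edges)
print(f"sqrt(2n ln 2m)           = {bound:.1f}")
print(f"median random coloring   = {sorted(discs)[len(discs) // 2]}")
print(f"fraction below the bound = {sum(d <= bound for d in discs) / len(discs):.2f}")
```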
2. For any hypergraph $\mathcal{H}$ with n vertices and m edges such that $m \geq n$:
$$\operatorname{disc}(\mathcal{H}) \in O\!\left(\sqrt{n \ln(2m/n)}\right).$$
To prove this, a much more sophisticated approach using the entropy function was necessary. Of course this is particularly interesting for $m = O(n)$. In the case $m = n$, $\operatorname{disc}(\mathcal{H}) \leq 6\sqrt{n}$ can be shown for n large enough. Hence this result is usually known as 'Six Standard Deviations Suffice'. It is considered to be one of the milestones of discrepancy theory. The entropy method has seen numerous other applications, e.g. in the proof of the tight upper bound for arithmetic progressions by Matoušek and Spencer[6] or the upper bound in terms of the primal shatter function due to Matoušek.[7]
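To see in numbers why the case $m = O(n)$ is the interesting one, the short sketch below (illustrative only) compares the bound $\sqrt{2n \ln(2m)}$ above at $m = n$ with $6\sqrt{n}$ for a few values of n; removing the logarithmic factor makes the second bound the smaller one once n is sufficiently large.

```python
import math

# Compare sqrt(2n ln(2n)) (the random-coloring bound with m = n) against
# 6*sqrt(n). The entropy-based bound has no logarithmic factor, so it
# eventually becomes the smaller of the two.
for n in (10**4, 10**6, 10**8, 10**10):
    print(f"n = {n:>12}: sqrt(2n ln 2n) = {math.sqrt(2 * n * math.log(2 * n)):12.0f}, "
          f"6*sqrt(n) = {6 * math.sqrt(n):12.0f}")
```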
Better discrepancy bounds can be attained when the hypergraph has bounded degree, that is, when each vertex of $\mathcal{H}$ is contained in at most t edges for some small t. In particular, the Beck–Fiala theorem states that in this case $\operatorname{disc}(\mathcal{H}) \leq 2t - 1$.
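As a small illustrative check of such degree-based bounds (not taken from the original text), the following sketch brute-forces the discrepancy of the Fano plane, a 3-uniform hypergraph on 7 vertices in which every vertex lies in exactly t = 3 edges, and compares it with 2t − 1.

```python
from itertools import product

# Lines of the Fano plane on the vertex set {0, ..., 6}: every vertex
# lies in exactly t = 3 of the 7 three-element edges.
fano = [{0, 1, 2}, {0, 3, 4}, {0, 5, 6}, {1, 3, 5},
        {1, 4, 6}, {2, 3, 6}, {2, 4, 5}]

t = max(sum(v in e for e in fano) for v in range(7))  # maximum degree

# Brute-force disc(H) over all 2^7 colorings chi: V -> {-1, +1}.
disc = min(
    max(abs(sum(chi[v] for v in e)) for e in fano)
    for chi in product((-1, 1), repeat=7)
)

print(f"max degree t = {t}, disc = {disc}, 2t - 1 = {2 * t - 1}")
```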
Better bounds on the discrepancy are also possible for hypergraphs with a special structure, such as the hypergraph of arithmetic progressions considered by Roth and by Matoušek and Spencer.