In graph theory, an area of mathematics, common graphs belong to a branch of extremal graph theory concerning inequalities in homomorphism densities. Roughly speaking, $F$ is a common graph if it "commonly" appears as a subgraph, in the sense that the total number of copies of $F$ in any graph $G$ and its complement $\overline{G}$ is a large fraction of all possible copies of $F$ on the same vertices. Intuitively, if $G$ contains few copies of $F$, then its complement $\overline{G}$ must contain many copies of $F$ in order to compensate.
Common graphs are closely related to other graph notions dealing with homomorphism density inequalities. For example, every Sidorenko graph is a common graph, so common graphs generalize Sidorenko graphs.
A graph $F$ is common if the inequality
$$t(F, W) + t(F, 1 - W) \ge 2^{1 - e(F)}$$
holds for any graphon $W$, where $e(F)$ is the number of edges of $F$ and $t(F, W)$ is the homomorphism density.[1]
The inequality is tight because the lower bound is always attained when $W$ is the constant graphon $W \equiv 1/2$.
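For step graphons, the quantities in this definition can be computed by brute force. The following minimal Python sketch (the hom_density helper is illustrative, not a library function) checks the tightness claim at the constant graphon $1/2$ and the defining inequality for a random step graphon, using the triangle $K_3$, which is shown below to be common:

```python
import itertools
import numpy as np

def hom_density(edges, num_vertices, W):
    """Homomorphism density t(F, W) of a graph F, given as an edge list on
    vertices 0..num_vertices-1, in a step graphon represented by a symmetric
    k x k matrix W whose blocks all have measure 1/k."""
    k = W.shape[0]
    total = 0.0
    for phi in itertools.product(range(k), repeat=num_vertices):
        weight = 1.0
        for u, v in edges:
            weight *= W[phi[u], phi[v]]
        total += weight
    return total / k ** num_vertices

triangle = [(0, 1), (1, 2), (0, 2)]          # K3, with e(F) = 3

# Tightness: at the constant graphon 1/2 the two densities sum to 2^(1 - e(F)).
W_half = np.array([[0.5]])
s = hom_density(triangle, 3, W_half) + hom_density(triangle, 3, 1 - W_half)
print(s, 2 ** (1 - 3))                       # both equal 0.25

# The defining inequality for a random symmetric step graphon.
rng = np.random.default_rng(42)
A = rng.random((5, 5))
W = (A + A.T) / 2
s = hom_density(triangle, 3, W) + hom_density(triangle, 3, 1 - W)
print(s >= 2 ** (1 - 3))                     # True, since K3 is common (see below)
```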
For a graph $G$, we have $t(F, G) = t(F, W_G)$ and $t(F, \overline{G}) = t(F, 1 - W_G)$ for the associated graphon $W_G$, since the graphon associated to the complement $\overline{G}$ is $W_{\overline{G}} = 1 - W_G$. Hence, this formula provides us with the very informal intuition to take a close enough approximation, whatever that means,[2] $W$ to $W_G$, and view $t(F, W)$ as roughly the fraction of labeled copies of the graph $F$ in the "approximate" graph $G$. Then, we can assume the quantity $t(F, W) + t(F, 1 - W)$ is roughly $t(F, G) + t(F, \overline{G})$ and interpret the latter as the combined number of copies of $F$ in $G$ and $\overline{G}$. Hence, we see that $t(F, G) + t(F, \overline{G}) \gtrsim 2^{1 - e(F)}$ holds. This, in turn, means that a common graph $F$ commonly appears as a subgraph.
In other words, if we think of the edges and non-edges of $G$ as a 2-coloring of the edges of the complete graph on the same vertices, then at least a $2^{1 - e(F)}$ fraction of all possible copies of $F$ must be monochromatic. Note that in an Erdős–Rényi random graph $G = G(n, p)$ with each edge drawn with probability $p = 1/2$, each graph homomorphism from $F$ to $G$ has probability $2 \cdot 2^{-e(F)} = 2^{1 - e(F)}$ of being monochromatic. So, a common graph $F$ is a graph that attains its minimum number of appearances as a monochromatic subgraph of a graph $G$ at the random graph $G = G(n, 1/2)$. The above definition using the generalized homomorphism density can be understood in this way.
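This interpretation can be checked empirically. The following minimal Python sketch (the monochromatic_triangle_fraction helper is illustrative, not part of any standard library) estimates the fraction of monochromatic triangles of $K_n$ under a uniformly random red/blue edge coloring, which should be close to $2^{1 - e(K_3)} = 1/4$:

```python
import itertools
import random

def monochromatic_triangle_fraction(n, trials=200):
    """Average fraction of triangles of K_n that are monochromatic under a
    uniformly random red/blue edge coloring, over several random colorings."""
    triples = list(itertools.combinations(range(n), 3))
    total = 0.0
    for _ in range(trials):
        color = {e: random.randrange(2)
                 for e in itertools.combinations(range(n), 2)}
        mono = sum(
            color[(a, b)] == color[(b, c)] == color[(a, c)]
            for a, b, c in triples
        )
        total += mono / len(triples)
    return total / trials

print(monochromatic_triangle_fraction(12))   # close to 2 ** (1 - 3) = 0.25
```

Commonness of $K_3$ says that, asymptotically, no coloring achieves a substantially smaller fraction of monochromatic triangles than this random one.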
Every Sidorenko graph is common. Indeed, a graph $F$ is a Sidorenko graph if it satisfies $t(F, W) \ge t(K_2, W)^{e(F)}$ for all graphons $W$.
In that case, $t(F, 1 - W) \ge t(K_2, 1 - W)^{e(F)}$. Furthermore, $t(K_2, W) + t(K_2, 1 - W) = 1$, which follows from the definition of homomorphism density. Combining this with Jensen's inequality for the convex function $x \mapsto x^{e(F)}$:
$$t(F, W) + t(F, 1 - W) \ge t(K_2, W)^{e(F)} + t(K_2, 1 - W)^{e(F)} \ge 2\left(\frac{t(K_2, W) + t(K_2, 1 - W)}{2}\right)^{e(F)} = 2^{1 - e(F)}.$$
Thus, the condition for a common graph is met.[8]
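For instance, the two-edge path $K_{1,2}$ is a Sidorenko graph (this follows from the Cauchy–Schwarz argument used for the triangle below), and the chain of inequalities above specializes to
$$t(K_{1,2}, W) + t(K_{1,2}, 1 - W) \ge t(K_2, W)^2 + t(K_2, 1 - W)^2 \ge 2\left(\frac{t(K_2, W) + t(K_2, 1 - W)}{2}\right)^2 = \frac{1}{2} = 2^{1 - e(K_{1,2})},$$
so $K_{1,2}$ is common.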
To see that the triangle graph $K_3$ is also common, expand the integral expression for $t(K_3, 1 - W)$ and take into account the symmetry between the variables:
$$\int_{[0,1]^3} (1 - W(x,y))(1 - W(y,z))(1 - W(z,x)) \,dx\,dy\,dz = 1 - 3\int_{[0,1]^2} W(x,y)\,dx\,dy + 3\int_{[0,1]^3} W(x,y)\,W(x,z)\,dx\,dy\,dz - \int_{[0,1]^3} W(x,y)\,W(y,z)\,W(z,x)\,dx\,dy\,dz.$$
Each term in the expression can be written in terms of homomorphism densities of smaller graphs. By the definition of homomorphism densities:
$$\int_{[0,1]^2} W(x,y)\,dx\,dy = t(K_2, W),$$
$$\int_{[0,1]^3} W(x,y)\,W(x,z)\,dx\,dy\,dz = t(K_{1,2}, W),$$
$$\int_{[0,1]^3} W(x,y)\,W(y,z)\,W(z,x)\,dx\,dy\,dz = t(K_3, W),$$
where $K_{1,2}$ denotes the complete bipartite graph with $1$ vertex on one part and $2$ vertices on the other. It follows that:
$$t(K_3, W) + t(K_3, 1 - W) = 1 - 3\,t(K_2, W) + 3\,t(K_{1,2}, W).$$
$t(K_{1,2}, W)$ can be related to $t(K_2, W)$ thanks to the symmetry between the variables $y$ and $z$:
$$t(K_{1,2}, W) = \int_{[0,1]^3} W(x,y)\,W(x,z)\,dx\,dy\,dz = \int_{[0,1]} \left( \int_{[0,1]} W(x,y)\,dy \right)\left( \int_{[0,1]} W(x,z)\,dz \right) dx = \int_{[0,1]} \left( \int_{[0,1]} W(x,y)\,dy \right)^2 dx \ge \left( \int_{[0,1]^2} W(x,y)\,dx\,dy \right)^2 = t(K_2, W)^2,$$
where the last step follows from the integral Cauchy–Schwarz inequality. Finally:
$$t(K_3, W) + t(K_3, 1 - W) \ge 1 - 3\,t(K_2, W) + 3\,t(K_2, W)^2 = \frac{1}{4} + 3\left(t(K_2, W) - \frac{1}{2}\right)^2 \ge \frac{1}{4} = 2^{1 - e(K_3)}.$$
This proof can be obtained by taking the continuous analog of Theorem 1 in "On Sets Of Acquaintances And Strangers At Any Party".[9]
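The identities and bounds used in this proof can also be verified numerically for step graphons. The following minimal Python sketch (the helpers are illustrative, not library functions) checks the expansion and the final bound for a random symmetric step matrix:

```python
import numpy as np

# Random symmetric step graphon on 4 blocks of equal measure (a toy example).
rng = np.random.default_rng(0)
A = rng.random((4, 4))
W = (A + A.T) / 2
k = W.shape[0]

def t_K2(W):      # t(K2, W) = integral of W(x, y)
    return W.mean()

def t_K12(W):     # t(K_{1,2}, W) = integral of W(x, y) W(x, z)
    return (W.mean(axis=1) ** 2).mean()

def t_K3(W):      # t(K3, W) = integral of W(x, y) W(y, z) W(z, x)
    return np.einsum('xy,yz,zx->', W, W, W) / k ** 3

lhs = t_K3(W) + t_K3(1 - W)
rhs = 1 - 3 * t_K2(W) + 3 * t_K12(W)
print(np.isclose(lhs, rhs))                                  # the expansion identity
print(lhs >= 1 - 3 * t_K2(W) + 3 * t_K2(W) ** 2 >= 0.25)     # Cauchy-Schwarz step and final bound
```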