Bidimensionality theory characterizes a broad range of graph problems (called bidimensional problems) that admit efficient approximation algorithms, fixed-parameter algorithms, or kernelizations on a broad range of graph classes. These classes include planar graphs, map graphs, bounded-genus graphs, and graphs excluding any fixed minor. In particular, bidimensionality theory builds on the graph minor theory of Robertson and Seymour by extending its mathematical results and building new algorithmic tools. The theory was introduced in the work of Demaine, Fomin, Hajiaghayi, and Thilikos, [1] for which the authors received the Nerode Prize in 2015.
A parameterized problem $\Pi$ is a subset of $\Sigma^{*} \times \mathbb{N}$ for some finite alphabet $\Sigma$. An instance of a parameterized problem is a pair $(x, k)$, where $k$ is called the parameter.
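For concreteness, the parameterized version of vertex cover can be written in this form (a standard formulation, stated here only for illustration) as

\[ \textsc{Vertex Cover} = \{ (G, k) : G \text{ has a vertex cover of size at most } k \}, \]

where the graph $G$ is encoded as a string over $\Sigma$.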
A parameterized problem $\Pi$ is minor-bidimensional if

1. For any pair of graphs $H$ and $G$ such that $H$ is a minor of $G$, and for every integer $k$, $(G, k) \in \Pi$ implies $(H, k) \in \Pi$. In other words, deleting or contracting an edge of a graph cannot increase the parameter.
2. There is $\delta > 0$ such that for every $(r \times r)$-grid $R$, $(R, k) \notin \Pi$ for every $k \le \delta r^{2}$. In other words, the value of the solution on the grid must grow at least quadratically in $r$.
Examples of minor-bidimensional problems are the parameterized versions of vertex cover, feedback vertex set, minimum maximal matching, and longest path.
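As a worked check of the second condition for vertex cover (a standard argument, included for illustration): the $(r \times r)$-grid contains a Hamiltonian ("snake") path and hence a matching of size $\lfloor r^{2}/2 \rfloor$, and every vertex cover must contain an endpoint of each matching edge, so

\[ \tau(R) \ge \lfloor r^{2}/2 \rfloor, \]

and one may take, say, $\delta = 1/3$ for all $r \ge 2$. The first condition holds because deleting or contracting an edge never increases the size of a minimum vertex cover.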
Let $\Gamma_{r}$ be the graph obtained from the $(r \times r)$-grid by triangulating its internal faces so that all internal vertices have degree 6, and then joining one corner vertex of degree two by edges to all vertices of the external face. A parameterized problem $\Pi$ is contraction-bidimensional if

1. For any pair of graphs $H$ and $G$ such that $H$ is a contraction of $G$, and for every integer $k$, $(G, k) \in \Pi$ implies $(H, k) \in \Pi$. In other words, contracting an edge of a graph cannot increase the parameter.
2. There is $\delta > 0$ such that $(\Gamma_{r}, k) \notin \Pi$ for every $k \le \delta r^{2}$.
Examples of contraction-bidimensional problems are dominating set, connected dominating set, max-leaf spanning tree, and edge dominating set.
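As an illustrative lower bound for dominating set (a back-of-the-envelope argument, with $c$ a small constant): every vertex of $\Gamma_{r}$ other than the distinguished corner has bounded degree (internal vertices have degree exactly 6), so each such vertex dominates only constantly many vertices, while the corner dominates only the $O(r)$ vertices of the external face. Hence

\[ \gamma(\Gamma_{r}) \ge \frac{r^{2} - O(r)}{c} = \Omega(r^{2}). \]

Note that dominating set satisfies the contraction condition but is not closed under edge deletions, which is why it is contraction- rather than minor-bidimensional.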
All algorithmic applications of bidimensionality are based on the following combinatorial property: either the treewidth of a graph is small, or the graph contains a large grid as a minor or contraction. More precisely,

1. There is a function $f$ such that every graph $G$ excluding a fixed $h$-vertex graph as a minor and of treewidth at least $f(h) \cdot r$ contains the $(r \times r)$-grid as a minor. [2]
2. There is a function $g$ such that every graph $G$ excluding a fixed $h$-vertex apex graph as a minor and of treewidth at least $g(h) \cdot r$ can be contracted to the partially triangulated grid $\Gamma_{r}$. [3]
Halin's grid theorem is an analogous excluded grid theorem for infinite graphs. [4]
Let $\Pi$ be a minor-bidimensional problem such that for any graph $G$ excluding some fixed graph as a minor and of treewidth at most $t$, deciding whether $(G, k) \in \Pi$ can be done in time $2^{O(t)} \cdot n^{O(1)}$. Then for every graph $G$ excluding some fixed graph as a minor, deciding whether $(G, k) \in \Pi$ can be done in time $2^{O(\sqrt{k})} \cdot n^{O(1)}$. Similarly, for contraction-bidimensional problems, for every graph $G$ excluding some fixed apex graph as a minor, inclusion $(G, k) \in \Pi$ can be decided in time $2^{O(\sqrt{k})} \cdot n^{O(1)}$.
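The proof is a win/win argument: if the treewidth is at least $c\sqrt{k}$ for a suitable constant $c$, the excluded grid theorem yields a large grid minor witnessing $(G, k) \notin \Pi$; otherwise the problem is solved by dynamic programming over a tree decomposition. The Python sketch below shows this shape for vertex cover under several stated simplifications: the constant `c` is a placeholder rather than the true constant from the excluded grid theorem, networkx's `treewidth_min_degree` heuristic gives only an upper bound rather than the approximation with a largeness certificate that the real algorithm needs, and simple $2^{k}$ branching stands in for the $2^{O(t)}$ treewidth dynamic program.

```python
import math
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

def vc_at_most_k(G, k):
    """Stand-in solver: classic 2^k branching for vertex cover.
    (The actual meta-algorithm runs a 2^{O(t)} dynamic program
    over the tree decomposition instead.)"""
    if G.number_of_edges() == 0:
        return True
    if k == 0:
        return False
    u, v = next(iter(G.edges()))
    for w in (u, v):  # some endpoint of edge uv lies in any cover
        H = G.copy()
        H.remove_node(w)
        if vc_at_most_k(H, k - 1):
            return True
    return False

def bidimensional_vc(G, k, c=2.0):
    """Win/win sketch; c is a hypothetical constant."""
    tw_bound, _decomp = treewidth_min_degree(G)  # heuristic upper bound
    if tw_bound <= c * math.sqrt(k):
        # Treewidth certified small: solve on the decomposition.
        return vc_at_most_k(G, k)
    # Here the real algorithm certifies treewidth >= c*sqrt(k), extracts
    # a large grid minor, and answers NO by minor-bidimensionality; with
    # only an upper-bound heuristic this branch is a sketch, not a proof.
    return False
```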
Thus many bidimensional problems, such as Vertex Cover, Dominating Set, and k-Path, are solvable in time $2^{O(\sqrt{k})} \cdot n^{O(1)}$ on graphs excluding some fixed graph as a minor.
Bidimensionality theory has been used to obtain polynomial-time approximation schemes for many bidimensional problems. If a minor- (respectively, contraction-) bidimensional problem satisfies several additional properties, [5] [6] then the problem admits an efficient polynomial-time approximation scheme on minor-free (respectively, apex-minor-free) graphs.
In particular, by making use of bidimensionality, it was shown that feedback vertex set, vertex cover, connected vertex cover, cycle packing, diamond hitting set, maximum induced forest, maximum induced bipartite subgraph and maximum induced planar subgraph admit an EPTAS on H-minor-free graphs. Edge dominating set, dominating set, r-dominating set, connected dominating set, r-scattered set, minimum maximal matching, independent set, maximum full-degree spanning tree, maximum induced at most d-degree subgraph, maximum internal spanning tree, induced matching, triangle packing, partial r-dominating set and partial vertex cover admit an EPTAS on apex-minor-free graphs.
A parameterized problem with a parameter $k$ is said to admit a linear vertex kernel if there is a polynomial-time reduction, called a kernelization algorithm, that maps the input instance to an equivalent instance with at most $O(k)$ vertices.
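For flavor, here is the classic Buss kernelization for vertex cover in Python. It yields a quadratic rather than linear kernel and is shown only to illustrate the reduction-rule style; the linear kernels of the result below rely on heavier machinery (protrusion replacement).

```python
import networkx as nx

def buss_kernel(G, k):
    """Buss's reduction rules for vertex cover: returns None if (G, k)
    is a NO-instance, otherwise an equivalent instance (G', k') with at
    most k'^2 edges and 2k'^2 vertices -- a quadratic kernel."""
    G = G.copy()
    while True:
        if k < 0:
            return None
        # Rule 1: a vertex of degree > k lies in every cover of size <= k.
        high = next((v for v in G if G.degree(v) > k), None)
        if high is not None:
            G.remove_node(high)
            k -= 1
            continue
        # Rule 2: isolated vertices cover nothing; delete them.
        isolated = [v for v in G if G.degree(v) == 0]
        if isolated:
            G.remove_nodes_from(isolated)
            continue
        break
    # Every remaining vertex has degree in [1, k], so a YES-instance
    # can have at most k * k edges.
    if G.number_of_edges() > k * k:
        return None
    return G, k
```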
Every minor-bidimensional problem with additional properties, namely, with the separation property and with finite integer index, has a linear vertex kernel on graphs excluding some fixed graph as a minor. Similarly, every contraction-bidimensional problem with the separation property and with finite integer index has a linear vertex kernel on graphs excluding some fixed apex graph as a minor. [7]
In graph theory, an undirected graph H is called a minor of the graph G if H can be formed from G by deleting edges and vertices and by contracting edges.
In graph theory, a vertex cover of a graph is a set of vertices that includes at least one endpoint of every edge of the graph.
In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is only measured as a function of the number of bits in the input. The first systematic work on parameterized complexity was done by Downey & Fellows (1999).
In graph theory, the Hadwiger number of an undirected graph G is the size of the largest complete graph that can be obtained by contracting edges of G. Equivalently, the Hadwiger number h(G) is the largest number n for which the complete graph Kn is a minor of G, a smaller graph obtained from G by edge contractions and vertex and edge deletions. The Hadwiger number is also known as the contraction clique number of G or the homomorphism degree of G. It is named after Hugo Hadwiger, who introduced it in 1943 in conjunction with the Hadwiger conjecture, which states that the Hadwiger number is always at least as large as the chromatic number of G.
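Two standard examples, stated for illustration: the complete graph $K_{n}$ has Hadwiger number $n$, and by Wagner's theorem every planar graph excludes $K_{5}$ as a minor, so

\[ h(K_{n}) = n, \qquad h(G) \le 4 \text{ for every planar graph } G. \]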
In graph theory, the metric dimension of a graph G is the minimum cardinality of a subset S of vertices such that all other vertices are uniquely determined by their distances to the vertices in S. Finding the metric dimension of a graph is an NP-hard problem; the decision version, determining whether the metric dimension is less than a given value, is NP-complete.
In graph theory, the treewidth of an undirected graph is an integer number which specifies, informally, how far the graph is from being a tree. The smallest treewidth is 1; the graphs with treewidth 1 are exactly the trees and the forests. The graphs with treewidth at most 2 are the series–parallel graphs. The maximal graphs with treewidth exactly k are called k-trees, and the graphs with treewidth at most k are called partial k-trees. Many other well-studied graph families also have bounded treewidth.
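Computing treewidth exactly is NP-hard, but upper-bound heuristics are readily available in practice. As a small sketch, networkx provides a min-degree heuristic; the returned width is only an upper bound on the true treewidth:

```python
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

G = nx.grid_2d_graph(5, 5)              # the (5 x 5)-grid, treewidth 5
width, decomposition = treewidth_min_degree(G)
print(width)                            # an upper bound on the treewidth
print(decomposition.number_of_nodes())  # bags of the tree decomposition
```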
In graph theory, a path decomposition of a graph G is, informally, a representation of G as a "thickened" path graph, and the pathwidth of G is a number that measures how much the path was thickened to form G. More formally, a path-decomposition is a sequence of subsets of vertices of G such that the endpoints of each edge appear in one of the subsets and such that each vertex appears in a contiguous subsequence of the subsets, and the pathwidth is one less than the size of the largest set in such a decomposition. Pathwidth is also known as interval thickness, vertex separation number, or node searching number.
In computer science, a kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which inputs to the algorithm are replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel to the desired output for the original problem.
In graph theory, a branch-decomposition of an undirected graph G is a hierarchical clustering of the edges of G, represented by an unrooted binary tree T with the edges of G as its leaves. Removing any edge from T partitions the edges of G into two subgraphs, and the width of the decomposition is the maximum number of shared vertices of any pair of subgraphs formed in this way. The branchwidth of G is the minimum width of any branch-decomposition of G.
In graph theory, a branch of mathematics, a clique-sum is a way of combining two graphs by gluing them together at a clique, analogous to the connected sum operation in topology. If two graphs G and H each contain cliques of equal size, the clique-sum of G and H is formed from their disjoint union by identifying pairs of vertices in these two cliques to form a single shared clique, and then possibly deleting some of the clique edges. A k-clique-sum is a clique-sum in which both cliques have at most k vertices. One may also form clique-sums and k-clique-sums of more than two graphs, by repeated application of the two-graph clique-sum operation.
In graph theory, a branch of mathematics, an apex graph is a graph that can be made planar by the removal of a single vertex. The deleted vertex is called an apex of the graph. It is an apex, not the apex, because an apex graph may have more than one apex; for example, in the minimal nonplanar graphs K5 or K3,3, every vertex is an apex. The apex graphs include graphs that are themselves planar, in which case again every vertex is an apex. The null graph is also counted as an apex graph even though it has no vertex to remove.
In theoretical computer science, Baker's technique is a method for designing polynomial-time approximation schemes (PTASs) for problems on planar graphs. It is named after Brenda Baker, who announced it in a 1983 conference and published it in the Journal of the ACM in 1994.
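A minimal Python sketch of the layering idea, for maximum independent set on a connected planar graph: delete every $t$-th BFS layer, solve each remaining slab exactly, and keep the best of the $t$ shifts, which loses at most a $1/t$ fraction of the optimum. The per-slab solver here is a stand-in (brute-force maximal-clique enumeration in the complement graph) where the real technique uses a bounded-treewidth dynamic program; `root` and `t` are parameters of the sketch.

```python
import networkx as nx

def exact_mis(H):
    """Exact maximum independent set: a maximum independent set of H is
    a maximum clique of the complement, found among its maximal cliques.
    Exponential in general; Baker's slabs would use treewidth DP instead."""
    if H.number_of_nodes() == 0:
        return set()
    return set(max(nx.find_cliques(nx.complement(H)), key=len))

def baker_independent_set(G, root, t):
    """Baker-style (1 - 1/t)-approximation sketch for a connected planar
    graph G: for each residue class, delete every t-th BFS layer; the
    remaining slabs span < t consecutive layers and have treewidth O(t)."""
    level = nx.single_source_shortest_path_length(G, root)
    best = set()
    for shift in range(t):
        kept = [v for v in G if level[v] % t != shift]
        solution = set()
        # Slabs are separated by the deleted layers, so it suffices to
        # solve each connected component of the kept subgraph exactly.
        for comp in nx.connected_components(G.subgraph(kept)):
            solution |= exact_mis(G.subgraph(comp))
        if len(solution) > len(best):
            best = solution
    return best
```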
In the study of graph algorithms, Courcelle's theorem is the statement that every graph property definable in the monadic second-order logic of graphs can be decided in linear time on graphs of bounded treewidth. The result was first proved by Bruno Courcelle in 1990 and independently rediscovered by Borie, Parker & Tovey (1992). It is considered the archetype of algorithmic meta-theorems.
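For example (a standard illustration), 3-colorability is expressible in the monadic second-order logic of graphs, and hence by Courcelle's theorem is decidable in linear time on graphs of bounded treewidth:

\[ \exists R\, \exists G\, \exists B\; \forall u\, \forall v\; \big[ (u \in R \lor u \in G \lor u \in B) \land \big( E(u,v) \rightarrow \neg ((u \in R \land v \in R) \lor (u \in G \land v \in G) \lor (u \in B \land v \in B)) \big) \big]. \]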
In graph theory, a branch of mathematics, Halin's grid theorem states that the infinite graphs with thick ends are exactly the graphs containing subdivisions of the hexagonal tiling of the plane. It was published by Rudolf Halin (1965), and is a precursor to the work of Robertson and Seymour linking treewidth to large grid minors, which became an important component of the algorithmic theory of bidimensionality.
The EATCS–IPEC Nerode Prize is a theoretical computer science prize awarded for outstanding research in the area of multivariate algorithmics. It is awarded by the European Association for Theoretical Computer Science and the International Symposium on Parameterized and Exact Computation. The prize was offered for the first time in 2013.
In computer science, iterative compression is an algorithmic technique for the design of fixed-parameter tractable algorithms, in which one element is added to the problem in each step, and a small solution for the problem prior to the addition is used to help find a small solution to the problem after the step.
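A minimal Python sketch of the pattern for vertex cover (an illustrative instantiation, not drawn from a particular source): vertices are added one at a time, and a compression routine tries all ways to retain part of the current size-$(k+1)$ cover, giving a $2^{O(k)} \cdot n^{O(1)}$ algorithm.

```python
from itertools import combinations
import networkx as nx

def compress(G, cover, k):
    """Given a vertex cover of G of size k + 1, try to find one of size
    <= k: guess the kept subset of the old cover; every neighbor of a
    discarded vertex must then be taken."""
    C = list(cover)
    for r in range(len(C) + 1):
        for kept in combinations(C, r):
            kept = set(kept)
            out = set(C) - kept
            # Discarded vertices must form an independent set.
            if any(G.has_edge(u, v) for u in out for v in out if u != v):
                continue
            forced = {w for v in out for w in G.neighbors(v)} - out
            new_cover = kept | forced
            if len(new_cover) <= k:
                return new_cover
    return None

def vc_iterative_compression(G, k):
    """Build the graph vertex by vertex, maintaining a cover of size <= k."""
    vertices = list(G.nodes())
    cover = set()
    for i, v in enumerate(vertices):
        H = G.subgraph(vertices[: i + 1])
        cover = cover | {v}      # still a cover after adding v
        if len(cover) > k:
            cover = compress(H, cover, k)
            if cover is None:
                return None      # no cover of size <= k exists
    return cover
```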
Mohammad Taghi Hajiaghayi is a computer scientist known for his work in algorithms, game theory, social networks, network design, graph theory, and big data. He has over 200 publications with over 185 collaborators and 10 issued patents.
In graph theory, a branch of mathematics, a map graph is an undirected graph formed as the intersection graph of finitely many simply connected and internally disjoint regions of the Euclidean plane. The map graphs include the planar graphs, but are more general. Any number of regions can meet at a common corner, and when they do the map graph will contain a clique connecting the corresponding vertices, unlike planar graphs in which the largest cliques have only four vertices. Another example of a map graph is the king's graph, a map graph of the squares of the chessboard connecting pairs of squares between which the chess king can move.
Fedor V. Fomin is a professor of Computer Science at the University of Bergen. He is known for his work in algorithms and graph theory. He received his PhD in 1997 at St. Petersburg State University under Nikolai Nikolaevich Petrov.
The twin-width of an undirected graph is a natural number associated with the graph, used to study the parameterized complexity of graph algorithms. Intuitively, it measures how similar the graph is to a cograph, a type of graph that can be reduced to a single vertex by repeatedly merging together twins, vertices that have the same neighbors. The twin-width is defined from a sequence of repeated mergers where the vertices are not required to be twins, but have nearly equal sets of neighbors.