| Fedor V. Fomin | |
| --- | --- |
| Фёдор Владимирович Фомин | |
| Born | Fedor Vladimirovič Fomin, March 16, 1968, Leningrad, USSR |
| Alma mater | St. Petersburg State University |
| Scientific career | |
| Fields | Algorithms |
| Institutions | University of Bergen |
| Doctoral advisor | Nikolai Nikolaevich Petrov (Николай Николаевич Петров) |
Fedor V. Fomin (born March 16, 1968) is a professor of computer science at the University of Bergen, known for his work in algorithms and graph theory. He received his PhD in 1997 from St. Petersburg State University under the supervision of Nikolai Nikolaevich Petrov. [1]
Fomin is the co-author of three books:
- Exact Exponential Algorithms (with Dieter Kratsch, Springer, 2010)
- Parameterized Algorithms (with Marek Cygan, Łukasz Kowalik, Daniel Lokshtanov, Dániel Marx, Marcin Pilipczuk, Michał Pilipczuk, and Saket Saurabh, Springer, 2015)
- Kernelization: Theory of Parameterized Preprocessing (with Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi, Cambridge University Press, 2019)
With his co-authors Erik Demaine, Mohammad Hajiaghayi, and Dimitrios Thilikos, he received the 2015 EATCS-IPEC Nerode Prize for his work on bidimensionality. [2] Together with Fabrizio Grandoni and Dieter Kratsch, he received the 2017 Nerode Prize for his work on Measure & Conquer. Fomin won the Nerode Prize a third time in 2024 for the paper "(Meta)Kernelization", coauthored with Hans L. Bodlaender, Daniel Lokshtanov, Eelko Penninkx, Saket Saurabh, and Dimitrios M. Thilikos. [3]
In 2019, Fomin was named an EATCS Fellow for "his fundamental contributions in the fields of parametrized complexity and exponential algorithms". [4] Fomin is an elected member of the Norwegian Academy of Science and Letters, the Norwegian Academy of Technological Sciences, and the Academia Europaea. In 2023, he was named an ACM Fellow. [5]
In combinatorial mathematics, the Steiner tree problem, or minimum Steiner tree problem, named after Jakob Steiner, is an umbrella term for a class of problems in combinatorial optimization. While Steiner tree problems may be formulated in a number of settings, they all require an optimal interconnect for a given set of objects under a predefined objective function. One well-known variant, often used synonymously with the term Steiner tree problem, is the Steiner tree problem in graphs: given an undirected graph with non-negative edge weights and a subset of vertices, usually referred to as terminals, it asks for a tree of minimum total edge weight that contains all terminals (and possibly additional vertices). Further well-known variants are the Euclidean Steiner tree problem and the rectilinear minimum Steiner tree problem.
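As a concrete illustration of the graph variant, the following Python sketch (illustrative only; the function names and toy instance are invented for this example) exploits the fact that a minimum Steiner tree is a minimum spanning tree of the subgraph induced by the terminals together with some set of extra "Steiner" vertices. It brute-forces all such sets, so it runs in exponential time, as one expects for an NP-hard problem.

```python
from itertools import combinations

def mst_weight(vertices, edges):
    # Kruskal's algorithm on the subgraph induced by `vertices`; returns the
    # MST weight, or None if that induced subgraph is disconnected.
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    weight, merged = 0, 0
    for w, u, v in sorted((w, u, v) for u, v, w in edges
                          if u in vertices and v in vertices):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            weight += w
            merged += 1
    return weight if merged == len(vertices) - 1 else None

def steiner_weight(vertices, edges, terminals):
    # Try every subset of non-terminals as the Steiner points and keep the
    # cheapest spanning tree that connects all terminals.
    optional = [v for v in vertices if v not in terminals]
    best = None
    for r in range(len(optional) + 1):
        for extra in combinations(optional, r):
            w = mst_weight(set(terminals) | set(extra), edges)
            if w is not None and (best is None or w < best):
                best = w
    return best

# Toy instance: routing terminals a, c, d through the non-terminal
# "Steiner point" b (total weight 3) beats using the direct edge a-c.
edges = [("a", "b", 1), ("b", "c", 1), ("b", "d", 1), ("a", "c", 3)]
print(steiner_weight({"a", "b", "c", "d"}, edges, {"a", "c", "d"}))  # 3
```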
In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is measured only as a function of the number of bits in the input. This finer classification appears to have been first demonstrated in Gurevich, Stockmeyer & Vishkin (1984). The first systematic work on parameterized complexity was done by Downey & Fellows (1999).
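The textbook illustration of this viewpoint is the k-Vertex Cover problem, sketched below in Python (illustrative code, not tied to any work cited here). The problem is NP-hard, yet the branching algorithm runs in O(2^k · |E|) time: exponential only in the parameter k, and therefore fast whenever k is small, regardless of the size of the graph.

```python
def has_vertex_cover(edges, k):
    # Bounded search tree: any cover must contain u or v for the edge
    # (u, v), so branch on both choices. Depth <= k, hence <= 2^k leaves.
    if not edges:
        return True   # nothing left to cover
    if k == 0:
        return False  # edges remain but the budget is spent
    u, v = edges[0]
    return (has_vertex_cover([e for e in edges if u not in e], k - 1) or
            has_vertex_cover([e for e in edges if v not in e], k - 1))

# A 5-cycle needs 3 vertices to cover all of its edges:
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(has_vertex_cover(cycle, 2), has_vertex_cover(cycle, 3))  # False True
```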
Erik D. Demaine is a Canadian-American professor of computer science at the Massachusetts Institute of Technology and a former child prodigy.
In graph theory, a connected dominating set and a maximum leaf spanning tree are two closely related structures defined on an undirected graph.
Michael Allen Langston is a professor of electrical engineering and computer science at the University of Tennessee. In several publications with Michael Fellows in the late 1980s, he showed that the Robertson–Seymour theorem could be used to prove the existence of a polynomial-time algorithm for problems such as linkless embedding without allowing the algorithm itself to be explicitly constructed; this work was foundational to the field of parameterized complexity. He has also collaborated with scientists at Oak Ridge National Laboratory on the computational analysis of genomics data and reconstruction of gene regulatory networks.
In computer science, a kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which inputs to the algorithm are replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel to the desired output for the original problem.
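The standard first example is Buss's kernel for k-Vertex Cover, sketched below in Python (an illustrative implementation with invented names, not code from any source cited here). A vertex of degree greater than k must belong to every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k² edges.

```python
def buss_kernel(edges, k):
    # Returns (kernel_edges, k') for an equivalent smaller instance, or None
    # if the instance is already known to have no cover of size <= k.
    edges = list(edges)
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for u, v in edges:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:
                # v is forced into the cover: delete it and its edges.
                edges = [e for e in edges if v not in e]
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None   # more edges than k vertices could ever cover
    return edges, k   # kernel: at most k^2 edges remain

# A star with 5 leaves plus the edge (1, 2), with budget k = 2: the center 0
# has degree > 2 and is forced in, leaving the kernel ([(1, 2)], 1).
star = [(0, i) for i in range(1, 6)] + [(1, 2)]
print(buss_kernel(star, 2))
```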
Michael Ralph Fellows AC HFRSNZ MAE is a computer scientist and the Elite Professor of Computer Science in the Department of Informatics at the University of Bergen, Norway as of January 2016.
In computational complexity theory, the exponential time hypothesis is an unproven computational hardness assumption that was formulated by Impagliazzo & Paturi (1999). It states that satisfiability of 3-CNF Boolean formulas cannot be solved in subexponential time, $2^{o(n)}$, where $n$ is the number of variables. More precisely, the usual form of the hypothesis asserts the existence of a number $s_3 > 0$ such that all algorithms that correctly solve this problem require time at least $2^{s_3 n}$. The exponential time hypothesis, if true, would imply that P ≠ NP, but it is a stronger statement. It implies that many computational problems are equivalent in complexity, in the sense that if one of them has a subexponential time algorithm then they all do, and that many known algorithms for these problems have optimal or near-optimal time complexity.
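One standard symbolic formulation, following Impagliazzo and Paturi with $n$ denoting the number of variables, is:

$$\delta_3 := \inf\bigl\{\delta > 0 : 3\text{-SAT is solvable in time } O(2^{\delta n})\bigr\}, \qquad \text{ETH: } \delta_3 > 0.$$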
Bidimensionality theory characterizes a broad range of graph problems (bidimensional) that admit efficient approximate, fixed-parameter or kernel solutions in a broad range of graphs. These graph classes include planar graphs, map graphs, bounded-genus graphs and graphs excluding any fixed minor. In particular, bidimensionality theory builds on the graph minor theory of Robertson and Seymour by extending the mathematical results and building new algorithmic tools. The theory was introduced in the work of Demaine, Fomin, Hajiaghayi, and Thilikos, for which the authors received the Nerode Prize in 2015.
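The mechanism behind these results can be summarized schematically (constants suppressed; this is a sketch of the general argument, not a claim from one specific paper). For a minor-bidimensional parameter $k$ on, say, planar graphs, large treewidth forces a large grid minor, and a large grid forces $k$ to be large, giving

$$\mathrm{tw}(G) = O(\sqrt{k}).$$

Combining this bound with a dynamic program over a tree decomposition running in $2^{O(\mathrm{tw}(G))} \cdot n^{O(1)}$ time yields a subexponential parameterized algorithm with running time $2^{O(\sqrt{k})} \cdot n^{O(1)}$.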
Hans Leo Bodlaender is a Dutch computer scientist, a professor of computer science at Utrecht University. Bodlaender is known for his work on graph algorithms and parameterized complexity and in particular for algorithms relating to tree decomposition of graphs.
The EATCS–IPEC Nerode Prize is a theoretical computer science prize awarded for outstanding research in the area of multivariate algorithmics. It is awarded by the European Association for Theoretical Computer Science and the International Symposium on Parameterized and Exact Computation. The prize was offered for the first time in 2013.
In computer science, iterative compression is an algorithmic technique for the design of fixed-parameter tractable algorithms, in which one element is added to the problem in each step, and a small solution for the problem prior to the addition is used to help find a small solution to the problem after the step.
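The technique is most easily seen on k-Vertex Cover. In the Python sketch below (illustrative code with invented names, not from any source cited here), the driver adds one vertex at a time while maintaining a cover of size at most k for the subgraph seen so far; adding a vertex can push the size to k + 1, after which the compression step either shrinks the cover back or certifies that no cover of size k exists.

```python
from itertools import combinations

def compress(edges, cover, k):
    # Given a cover of size k + 1, search for one of size <= k: guess the
    # subset S of `cover` to keep; the discarded vertices D must have every
    # incident edge covered from the other side.
    cover = set(cover)
    for r in range(len(cover) + 1):
        for S in map(set, combinations(cover, r)):
            D = cover - S
            if any(u in D and v in D for u, v in edges):
                continue  # an edge inside D could never be covered
            new_cover = set(S)
            for u, v in edges:
                if u in D:
                    new_cover.add(v)
                elif v in D:
                    new_cover.add(u)
            if len(new_cover) <= k:
                return new_cover
    return None

def vertex_cover_ic(vertices, edges, k):
    # Iterative compression driver: maintain a size-<=k cover of the
    # subgraph induced by the vertices processed so far.
    seen, cover = set(), set()
    for v in vertices:
        seen.add(v)
        sub = [(a, b) for a, b in edges if a in seen and b in seen]
        cover = cover | {v}          # still a cover, size <= k + 1
        if len(cover) > k:
            cover = compress(sub, cover, k)
            if cover is None:
                return None          # even this subgraph has no size-k cover
    return cover

cycle = [(i, (i + 1) % 5) for i in range(5)]
print(vertex_cover_ic(range(5), cycle, 3))  # a cover of size 3
print(vertex_cover_ic(range(5), cycle, 2))  # None
```

Each compression step tries at most 2^(k+1) subsets and there are at most n steps, so the whole algorithm is fixed-parameter tractable in k.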
Mohammad Taghi Hajiaghayi is a computer scientist known for his work in algorithms, game theory, social networks, network design, graph theory, and big data. He has over 200 publications with over 185 collaborators and 10 issued patents.
In computer science and operations research, exact algorithms are algorithms that always solve an optimization problem to optimality.
In graph theory, a branch of mathematics, a map graph is an undirected graph formed as the intersection graph of finitely many simply connected and internally disjoint regions of the Euclidean plane. The map graphs include the planar graphs, but are more general. Any number of regions can meet at a common corner, and when they do the map graph will contain a clique connecting the corresponding vertices, unlike planar graphs in which the largest cliques have only four vertices. Another example of a map graph is the king's graph, a map graph of the squares of the chessboard connecting pairs of squares between which the chess king can move.
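For illustration, a few lines of Python (invented for this example) build the king's graph as an adjacency map; the four squares meeting at any interior corner are pairwise adjacent and thus form a 4-clique.

```python
def king_graph(n):
    # n x n chessboard squares; two squares are adjacent if a king can move
    # between them, i.e. they share an edge or a corner.
    return {(r, c): [(r + dr, c + dc)
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0)
                     and 0 <= r + dr < n and 0 <= c + dc < n]
            for r in range(n) for c in range(n)}

adj = king_graph(8)
print(len(adj[(4, 4)]))  # an interior square has 8 neighbors
# (0,0), (0,1), (1,0), (1,1) meet at one corner and are pairwise adjacent.
```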
Uri Zwick is an Israeli computer scientist and mathematician known for his work on graph algorithms, in particular on distances in graphs and on the color-coding technique for subgraph isomorphism. With Howard Karloff, he is the namesake of the Karloff–Zwick algorithm for approximating the MAX-3SAT problem of Boolean satisfiability. He and his coauthors won the David P. Robbins Prize in 2011 for their work on the block-stacking problem.
In graph theory, an odd cycle transversal of an undirected graph is a set of vertices of the graph that has a nonempty intersection with every odd cycle in the graph. Removing the vertices of an odd cycle transversal from a graph leaves a bipartite graph as the remaining induced subgraph.
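Since a graph is bipartite exactly when it has no odd cycle, a candidate transversal can be verified by deleting it and attempting a BFS 2-coloring, as in this short Python checker (illustrative code with invented names):

```python
from collections import deque

def is_odd_cycle_transversal(vertices, edges, removed):
    # Delete `removed`, then 2-color the rest; a coloring conflict means an
    # odd cycle survived, so `removed` is not a transversal.
    removed = set(removed)
    adj = {v: [] for v in vertices if v not in removed}
    for u, v in edges:
        if u not in removed and v not in removed:
            adj[u].append(v)
            adj[v].append(u)
    color = {}
    for s in adj:
        if s in color:
            continue
        color[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in color:
                    color[w] = 1 - color[u]
                    queue.append(w)
                elif color[w] == color[u]:
                    return False
    return True

# Removing one vertex of a triangle destroys its only odd cycle:
tri = [(0, 1), (1, 2), (2, 0)]
print(is_odd_cycle_transversal(range(3), tri, {0}),    # True
      is_odd_cycle_transversal(range(3), tri, set()))  # False
```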
Saket Saurabh is an Indian computer scientist who is a professor of theoretical computer science at the Institute of Mathematical Sciences, Chennai (IMSc), India, and an adjunct faculty member at the University of Bergen, Norway. He specializes in parameterized complexity, exact algorithms, graph algorithms, and game theory. His fundamental contributions to the area of parameterized complexity include procedures for obtaining algorithmic lower bounds and meta-theorems on preprocessing. Saurabh was awarded the Shanti Swarup Bhatnagar Prize for Science and Technology in Mathematical Sciences in 2021.
In graph theory, the cutwidth of an undirected graph is the smallest integer $k$ with the following property: there is an ordering of the vertices of the graph such that every cut obtained by partitioning the vertices into earlier and later subsets of the ordering is crossed by at most $k$ edges. That is, if the vertices are numbered $v_1, v_2, \dots, v_n$, then for every $i = 1, 2, \dots, n-1$, the number of edges $v_j v_\ell$ with $j \le i < \ell$ is at most $k$.
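A direct exponential-time Python sketch (for illustration only; practical algorithms are far more refined) computes the cutwidth straight from this definition by trying every ordering:

```python
from itertools import permutations

def cutwidth(vertices, edges):
    # For each ordering, the width is the maximum number of edges crossing
    # any prefix cut; the cutwidth is the minimum over all orderings.
    best = None
    for order in permutations(vertices):
        pos = {v: i for i, v in enumerate(order)}
        width = max((sum(1 for u, v in edges
                         if min(pos[u], pos[v]) <= i < max(pos[u], pos[v]))
                     for i in range(len(order) - 1)), default=0)
        if best is None or width < best:
            best = width
    return best

# Any ordering of a 4-cycle cuts at least two of its edges:
print(cutwidth(range(4), [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 2
```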
A parameterized approximation algorithm is a type of algorithm that aims to find approximate solutions to NP-hard optimization problems in time that is polynomial in the input size times an arbitrary function of a specific parameter. These algorithms are designed to combine the best aspects of traditional approximation algorithms and fixed-parameter tractability.
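In symbols, for a minimization problem with optimum $\mathrm{OPT}(I)$ on instance $I$ with parameter $k$ (a standard formulation, not tied to any single paper), a parameterized $\alpha$-approximation algorithm $A$ satisfies

$$\text{running time: } f(k) \cdot |I|^{O(1)}, \qquad \text{guarantee: } \mathrm{cost}(A(I)) \le \alpha \cdot \mathrm{OPT}(I)$$

for some computable function $f$.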