Fedor Fomin

Fedor V. Fomin
Russian: Фёдор Владимирович Фомин (Fedor Vladimirovič Fomin)
Born: March 16, 1968, Leningrad, USSR
Alma mater: St. Petersburg State University
Fields: Algorithms
Institutions: University of Bergen
Doctoral advisor: Nikolai Nikolaevich Petrov

Fedor V. Fomin (born March 16, 1968) is a professor of computer science at the University of Bergen. He is known for his work in algorithms and graph theory. He received his PhD in 1997 from St. Petersburg State University under Nikolai Nikolaevich Petrov.[1]

Books

Fomin is the co-author of three books:

  1. Exact Exponential Algorithms (with Dieter Kratsch), Springer, 2010.
  2. Parameterized Algorithms (with Marek Cygan, Łukasz Kowalik, Daniel Lokshtanov, Dániel Marx, Marcin Pilipczuk, Michał Pilipczuk, and Saket Saurabh), Springer, 2015.
  3. Kernelization: Theory of Parameterized Preprocessing (with Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi), Cambridge University Press, 2019.

Awards and honours

With his co-authors Erik Demaine, Mohammad Hajiaghayi, and Dimitrios Thilikos, he received the 2015 European Association for Theoretical Computer Science Nerode Prize for their work on bidimensionality.[2] Together with Fabrizio Grandoni and Dieter Kratsch, he received the 2017 Nerode Prize for their work on Measure & Conquer. In 2019 Fomin was named an EATCS Fellow for "his fundamental contributions in the fields of parametrized complexity and exponential algorithms".[3] Fomin is an elected member of the Norwegian Academy of Science and Letters, the Norwegian Academy of Technological Sciences, and the Academia Europaea. In 2023, he was named an ACM Fellow.[4]

Related Research Articles

Steiner tree problem

In combinatorial mathematics, the Steiner tree problem, or minimum Steiner tree problem, named after Jakob Steiner, is an umbrella term for a class of problems in combinatorial optimization. While Steiner tree problems may be formulated in a number of settings, they all require an optimal interconnect for a given set of objects and a predefined objective function. One well-known variant, which is often used synonymously with the term Steiner tree problem, is the Steiner tree problem in graphs: given an undirected graph with non-negative edge weights and a subset of vertices, usually referred to as terminals, it asks for a tree of minimum total edge weight that contains all terminals (and may contain additional vertices). Further well-known variants are the Euclidean Steiner tree problem and the rectilinear minimum Steiner tree problem.
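
Because the graph variant has such a compact statement, an exact solver is easy to sketch by exhaustive search. The following Python snippet is a minimal illustration (the function names are ad hoc, not from any library): it relies on the fact that a minimum Steiner tree is a minimum spanning tree of its own vertex set, so trying an MST for every vertex subset containing the terminals is exhaustive. The running time is exponential, so this is only usable on tiny graphs.

    from itertools import combinations

    def mst_weight(s, edges):
        """Kruskal's algorithm on the subgraph induced by vertex set s."""
        parent = {v: v for v in s}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        total, used = 0, 0
        induced = sorted(((w, tuple(e)) for e, w in edges.items() if e <= s),
                         key=lambda t: t[0])
        for w, (u, v) in induced:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                total += w
                used += 1
        # A spanning tree of s needs exactly |s| - 1 edges.
        return total if used == len(s) - 1 else None

    def steiner_tree_weight(vertices, edges, terminals):
        """Exact minimum Steiner tree weight; edges maps
        frozenset({u, v}) to a non-negative weight."""
        others = [v for v in vertices if v not in terminals]
        best = None
        for r in range(len(others) + 1):
            for extra in combinations(others, r):
                w = mst_weight(set(terminals) | set(extra), edges)
                if w is not None and (best is None or w < best):
                    best = w
        return best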

In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is only measured as a function of the number of bits in the input. This appears to have been first demonstrated in Gurevich, Stockmeyer & Vishkin (1984). The first systematic work on parameterized complexity was done by Downey & Fellows (1999).
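
The standard introductory example is Vertex Cover parameterized by the solution size k: a bounded search tree decides it in O(2^k · m) time, so the combinatorial explosion is confined to the parameter rather than the input size. A minimal Python sketch of that idea (the function name is invented for this illustration):

    def vertex_cover_fpt(edges, k):
        """Decide whether the graph given as a list of (u, v) edges
        has a vertex cover of size <= k.

        Classic bounded search tree: pick any uncovered edge; one of
        its endpoints must be in the cover, so branch on both choices.
        Depth <= k and branching factor 2 give O(2^k * m) time."""
        if not edges:
            return True
        if k == 0:
            return False
        u, v = edges[0]
        rest_u = [(a, b) for (a, b) in edges if u not in (a, b)]
        rest_v = [(a, b) for (a, b) in edges if v not in (a, b)]
        return vertex_cover_fpt(rest_u, k - 1) or vertex_cover_fpt(rest_v, k - 1)

    # A 4-cycle can be covered with 2 vertices but not with 1.
    assert vertex_cover_fpt([(1, 2), (2, 3), (3, 4), (4, 1)], 2)
    assert not vertex_cover_fpt([(1, 2), (2, 3), (3, 4), (4, 1)], 1)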

Russell Impagliazzo

Russell Graham Impagliazzo is a professor of computer science at the University of California, San Diego, specializing in computational complexity theory.

In graph theory, a connected dominating set and a maximum leaf spanning tree are two closely related structures defined on an undirected graph.
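
The connection is that the non-leaf vertices of a spanning tree form a connected dominating set, so a spanning tree with many leaves corresponds to a small connected dominating set and vice versa. A brief Python check of the defining conditions (a hedged sketch; the function name is made up):

    def is_connected_dominating_set(n, edges, d):
        """Check that d is a connected dominating set of a graph on
        vertices 0..n-1: every vertex is in d or adjacent to it, and
        the subgraph induced by d is connected."""
        adj = {v: set() for v in range(n)}
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        d = set(d)
        # Domination: every vertex outside d has a neighbour in d.
        if any(v not in d and not (adj[v] & d) for v in range(n)):
            return False
        if not d:
            return n == 0
        # Connectivity: depth-first search restricted to d.
        seen, stack = set(), [next(iter(d))]
        while stack:
            v = stack.pop()
            if v not in seen:
                seen.add(v)
                stack.extend(adj[v] & d)
        return seen == d

    # In the star K_{1,4}, the centre alone is a connected dominating set.
    assert is_connected_dominating_set(5, [(0, i) for i in range(1, 5)], [0])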

Michael Allen Langston is a professor of electrical engineering and computer science at the University of Tennessee. In several publications with Michael Fellows in the late 1980s, he showed that the Robertson–Seymour theorem could be used to prove the existence of a polynomial-time algorithm for problems such as linkless embedding without allowing the algorithm itself to be explicitly constructed; this work was foundational to the field of parameterized complexity. He has also collaborated with scientists at Oak Ridge National Laboratory on the computational analysis of genomics data and reconstruction of gene regulatory networks.

In computer science, a kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which inputs to the algorithm are replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel to the desired output for the original problem.
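
The textbook example is the Buss kernel for Vertex Cover: a vertex of degree greater than k must be in every cover of size at most k, and once such vertices are removed, a yes-instance can retain at most k² edges. A hedged Python sketch (the function name is invented here):

    def buss_kernel(edges, k):
        """Buss kernelization for Vertex Cover.

        Returns (kernel_edges, k') describing an equivalent smaller
        instance, or None if the input is already a no-instance."""
        edges = set(map(frozenset, edges))
        reduced = True
        while reduced and k >= 0:
            reduced = False
            degree = {}
            for e in edges:
                for v in e:
                    degree[v] = degree.get(v, 0) + 1
            for v, deg in degree.items():
                if deg > k:
                    # v is forced into the cover: remove it and its edges.
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    reduced = True
                    break
        # Each of the <= k cover vertices now covers <= k edges,
        # so more than k^2 remaining edges means "no".
        if k < 0 or len(edges) > k * k:
            return None
        return edges, k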

Clique-width

In graph theory, the clique-width of a graph G is a parameter that describes the structural complexity of the graph; it is closely related to treewidth, but unlike treewidth it can be small for dense graphs. It is defined as the minimum number of labels needed to construct G by means of the following four operations (a small construction sketch follows the list):

  1. Creation of a new vertex v with label i (denoted by i(v))
  2. Disjoint union of two labeled graphs G and H (denoted by G ⊕ H)
  3. Joining by an edge every vertex labeled i to every vertex labeled j (denoted by η(i,j)), where i ≠ j
  4. Renaming label i to label j (denoted by ρ(i,j))
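
These operations are concrete enough to execute directly. The sketch below (an illustrative, ad hoc encoding, not a standard API) uses them to build the complete graph K_4 with only two labels, which is why cliques have clique-width 2:

    def new_vertex(v, i):        # the expression i(v)
        return {"labels": {v: i}, "edges": set()}

    def union(g, h):             # disjoint union, G ⊕ H
        return {"labels": {**g["labels"], **h["labels"]},
                "edges": g["edges"] | h["edges"]}

    def join(g, i, j):           # η(i, j): connect label i to label j
        edges = set(g["edges"])
        for u, a in g["labels"].items():
            for v, b in g["labels"].items():
                if a == i and b == j:
                    edges.add(frozenset((u, v)))
        return {"labels": dict(g["labels"]), "edges": edges}

    def rename(g, i, j):         # ρ(i, j): relabel i as j
        return {"labels": {v: (j if a == i else a)
                           for v, a in g["labels"].items()},
                "edges": set(g["edges"])}

    g = new_vertex("v1", 1)
    for name in ("v2", "v3", "v4"):
        g = union(g, new_vertex(name, 2))  # one fresh vertex, label 2
        g = join(g, 1, 2)                  # attach it to everything so far
        g = rename(g, 2, 1)                # collapse back to one label
    assert len(g["edges"]) == 6            # K_4 has 6 edges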

Michael Ralph Fellows AC HFRSNZ MAE is a computer scientist and the Elite Professor of Computer Science in the Department of Informatics at the University of Bergen, Norway as of January 2016.

In computational complexity theory, the exponential time hypothesis is an unproven computational hardness assumption that was formulated by Impagliazzo & Paturi (1999). It states that satisfiability of 3-CNF Boolean formulas cannot be solved in subexponential time, 2^{o(n)}, where n is the number of variables. More precisely, the usual form of the hypothesis asserts the existence of a number s_3 > 0 such that all algorithms that correctly solve this problem require time at least 2^{s_3 n}. The exponential time hypothesis, if true, would imply that P ≠ NP, but it is a stronger statement. It implies that many computational problems are equivalent in complexity, in the sense that if one of them has a subexponential time algorithm then they all do, and that many known algorithms for these problems have optimal or near-optimal time complexity.

Bidimensionality theory characterizes a broad class of graph problems (called bidimensional) that admit efficient approximation, fixed-parameter, or kernelization algorithms on a wide range of graph classes, including planar graphs, map graphs, bounded-genus graphs, and graphs excluding any fixed minor. In particular, bidimensionality theory builds on the graph minor theory of Robertson and Seymour by extending the mathematical results and building new algorithmic tools. The theory was introduced in the work of Demaine, Fomin, Hajiaghayi, and Thilikos, for which the authors received the Nerode Prize in 2015.

Hans Leo Bodlaender is a Dutch computer scientist, a professor of computer science at Utrecht University. Bodlaender is known for his work on graph algorithms and parameterized complexity and in particular for algorithms relating to tree decomposition of graphs.

The EATCS–IPEC Nerode Prize is a theoretical computer science prize awarded for outstanding research in the area of multivariate algorithmics. It is awarded by the European Association for Theoretical Computer Science and the International Symposium on Parameterized and Exact Computation. The prize was offered for the first time in 2013.

In computer science, iterative compression is an algorithmic technique for the design of fixed-parameter tractable algorithms, in which one element is added to the problem in each step, and a small solution for the problem prior to the addition is used to help find a small solution to the problem after the step.
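
Vertex Cover is the usual demonstration: insert vertices one at a time, carry along a cover of size at most k, and after each insertion compress a cover of size k + 1 back down to k by guessing which of its vertices stay. A hedged Python sketch of the compression routine and driver (names invented for this example):

    from itertools import combinations

    def compress(edges, cover, k):
        """Try to shrink a vertex cover of size k + 1 to size <= k."""
        cover = set(cover)
        for r in range(len(cover) + 1):
            for kept in map(set, combinations(cover, r)):
                discarded = cover - kept
                forced, ok = set(), True
                for u, v in edges:
                    if u in kept or v in kept:
                        continue
                    if u in discarded and v in discarded:
                        ok = False   # this edge can no longer be covered
                        break
                    # Exactly one endpoint was discarded; the other
                    # endpoint is forced into the new cover.
                    forced.add(u if v in discarded else v)
                if ok and len(kept | forced) <= k:
                    return kept | forced
        return None

    def vertex_cover_ic(vertices, edges, k):
        """Iterative compression for Vertex Cover; runs the 2^(k+1)
        compression step at most once per inserted vertex."""
        cover, present = set(), set()
        for v in vertices:
            present.add(v)
            cover.add(v)   # still a cover of the grown subgraph
            if len(cover) > k:
                sub = [(a, b) for (a, b) in edges
                       if a in present and b in present]
                cover = compress(sub, cover, k)
                if cover is None:
                    return None
        return cover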

Mohammad Hajiaghayi

Mohammad Taghi Hajiaghayi is a computer scientist known for his work in algorithms, game theory, social networks, network design, graph theory, and big data. He has over 200 publications with over 185 collaborators and 10 issued patents.

In computer science and operations research, exact algorithms are algorithms that always solve an optimization problem to optimality.
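
Maximum independent set illustrates the baseline: exhaustive search over all vertex subsets is an exact algorithm running in O(2^n) time, and the field asks how far below such trivial bounds one can go. A toy Python sketch (illustrative naming):

    from itertools import combinations

    def max_independent_set(vertices, edges):
        """Brute-force exact maximum independent set: try subsets from
        largest to smallest and return the first with no internal edge."""
        for size in range(len(vertices), -1, -1):
            for subset in map(set, combinations(vertices, size)):
                if all(u not in subset or v not in subset for u, v in edges):
                    return subset
        return set()

    # In a triangle, any single vertex is a maximum independent set.
    assert len(max_independent_set([0, 1, 2], [(0, 1), (1, 2), (0, 2)])) == 1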

Uri Zwick is an Israeli computer scientist and mathematician known for his work on graph algorithms, in particular on distances in graphs and on the color-coding technique for subgraph isomorphism. With Howard Karloff, he is the namesake of the Karloff–Zwick algorithm for approximating the MAX-3SAT problem of Boolean satisfiability. He and his coauthors won the David P. Robbins Prize in 2011 for their work on the block-stacking problem.

Odd cycle transversal

In graph theory, an odd cycle transversal of an undirected graph is a set of vertices of the graph that has a nonempty intersection with every odd cycle in the graph. Removing the vertices of an odd cycle transversal from a graph leaves a bipartite graph as the remaining induced subgraph.
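
This gives a direct way to test a candidate transversal: delete it and try to 2-colour what remains. A short hedged Python check (the function name is invented):

    def is_odd_cycle_transversal(n, edges, t):
        """True iff deleting vertex set t from the graph on 0..n-1
        leaves a bipartite graph, i.e. t meets every odd cycle."""
        t = set(t)
        adj = {v: [] for v in range(n) if v not in t}
        for u, v in edges:
            if u not in t and v not in t:
                adj[u].append(v)
                adj[v].append(u)
        colour = {}
        for start in adj:
            if start in colour:
                continue
            colour[start] = 0
            stack = [start]
            while stack:                     # 2-colour each component
                u = stack.pop()
                for w in adj[u]:
                    if w not in colour:
                        colour[w] = 1 - colour[u]
                        stack.append(w)
                    elif colour[w] == colour[u]:
                        return False         # an odd cycle survived
        return True

    cycle5 = [(i, (i + 1) % 5) for i in range(5)]
    assert not is_odd_cycle_transversal(5, cycle5, [])  # C5 itself is odd
    assert is_odd_cycle_transversal(5, cycle5, [0])     # one deletion suffices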

Saket Saurabh is an Indian computer scientist who is currently the Professor of Theoretical Computer Science at the Institute of Mathematical Sciences, Chennai (IMSc), India and an adjunct faculty at University of Bergen, Norway. He specializes in parameterized complexity, exact algorithms, graph algorithms and game theory. His fundamental contributions to the area of parameterized complexity include procedures for obtaining algorithmic lower bounds, and meta-theorems on preprocessing. Saket Saurabh was awarded the Shanti Swarup Bhatnagar Prize for Science and Technology in Mathematical Sciences in the year 2021.

Cutwidth

In graph theory, the cutwidth of an undirected graph is the smallest integer k with the following property: there is an ordering of the vertices of the graph such that every cut obtained by partitioning the vertices into earlier and later subsets of the ordering is crossed by at most k edges. That is, if the vertices are numbered v_1, v_2, ..., v_n, then for every i = 1, 2, ..., n − 1, the number of edges v_j v_l with j ≤ i and l > i is at most k.
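
The definition translates directly into code: the width of a fixed ordering is the largest number of edges crossing any prefix cut, and the cutwidth minimises that over all orderings. A brute-force Python sketch (factorial time, illustrative only):

    from itertools import permutations

    def ordering_width(order, edges):
        """Maximum number of edges crossing a prefix cut of the ordering."""
        pos = {v: i for i, v in enumerate(order)}
        return max(
            sum(1 for u, v in edges
                if min(pos[u], pos[v]) <= i < max(pos[u], pos[v]))
            for i in range(len(order) - 1))

    def cutwidth(vertices, edges):
        """Smallest width over all n! vertex orderings (brute force)."""
        return min(ordering_width(p, edges) for p in permutations(vertices))

    # Every prefix cut of a cycle is crossed by at least two edges,
    # and the natural ordering of C4 achieves exactly two.
    assert cutwidth([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)]) == 2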

A parameterized approximation algorithm is a type of algorithm that aims to find approximate solutions to NP-hard optimization problems in time bounded by a polynomial in the input size multiplied by a function of a specific parameter. These algorithms are designed to combine the best aspects of traditional approximation algorithms and fixed-parameter tractability.

References

  1. "Fedor Fomin". The Mathematics Genealogy Project. Retrieved 23 June 2022.
  2. "Nerode Prize" . Retrieved June 25, 2018.
  3. "EATCS Fellows" . Retrieved March 28, 2021. European Association for Theoretical Computer Science
  4. "Fedor Fomin". awards.acm.org. Retrieved 2024-01-26.