Martin Grohe (born 1967) [1] is a German mathematician and computer scientist known for his research on parameterized complexity, mathematical logic, finite model theory, the logic of graphs, database theory, descriptive complexity theory, and graph neural networks. He is a University Professor of Computer Science at RWTH Aachen University, where he holds the Chair for Logic and Theory of Discrete Systems. [2]
Grohe earned his doctorate (Dr. rer. nat.) at the University of Freiburg in 1994. His dissertation, The Structure of Fixed-Point Logics, was supervised by Heinz-Dieter Ebbinghaus. [3] After postdoctoral research at the University of California, Santa Cruz and Stanford University, he earned his habilitation at the University of Freiburg in 1998. [4] He became professor at the University of Illinois Chicago in 2000, reader at the University of Edinburgh in 2001, and professor at the Humboldt University of Berlin in 2003, before becoming professor at RWTH Aachen University in 2012. [5]
Grohe is the author of Descriptive Complexity, Canonisation, and Definable Graph Structure Theory (Lecture Notes in Logic 47, Cambridge University Press, 2017). [6] In 2011, Grohe and Johann A. Makowsky edited the proceedings (volume 558) of the AMS-ASL Special Session on Model Theoretic Methods in Finite Combinatorics, held on January 5–8, 2009, in Washington, DC. With Jörg Flum, he is the co-author of Parameterized Complexity Theory (Springer, 2006). [7]
Grohe won the Heinz Maier-Leibnitz Prize, awarded by the German Research Foundation, in 1999, [4] and he was elected as an ACM Fellow in 2017 for "contributions to logic in computer science, database theory, algorithms, and computational complexity". [8] In 2022, he was awarded an ERC Advanced Grant "Symmetry and Similarity". [9]
Combinatorics is an area of mathematics primarily concerned with counting, both as a means and as an end to obtaining results, and certain properties of finite structures. It is closely related to many other areas of mathematics and has many applications ranging from logic to statistical physics and from evolutionary biology to computer science.
Discrete mathematics is the study of mathematical structures that can be considered "discrete" rather than "continuous". Objects studied in discrete mathematics include integers, graphs, and statements in logic. By contrast, discrete mathematics excludes topics in "continuous mathematics" such as real numbers, calculus or Euclidean geometry. Discrete objects can often be enumerated by integers; more formally, discrete mathematics has been characterized as the branch of mathematics dealing with countable sets. However, there is no exact definition of the term "discrete mathematics".
Ronald Lewis Graham was an American mathematician credited by the American Mathematical Society as "one of the principal architects of the rapid development worldwide of discrete mathematics in recent years". He was president of both the American Mathematical Society and the Mathematical Association of America, and his honors included the Leroy P. Steele Prize for lifetime achievement and election to the National Academy of Sciences.
In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is only measured as a function of the number of bits in the input. This appears to have been first demonstrated in Gurevich, Stockmeyer & Vishkin (1984). The first systematic work on parameterized complexity was done by Downey & Fellows (1999).
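As an illustration (a minimal Python sketch; the function name and edge-list representation are chosen for this example and are not taken from the article's sources), Vertex Cover parameterized by the solution size k admits a bounded-search-tree algorithm whose running time is roughly O(2^k · m), so the exponential blow-up is confined to the parameter rather than the input size:

# Minimal sketch: fixed-parameter tractable Vertex Cover via a bounded search tree.
# The input is a simple graph given as a list of edges; k is the parameter.
def vertex_cover_at_most_k(edges, k):
    if not edges:
        return True                      # nothing left to cover
    if k == 0:
        return False                     # edges remain but the budget is spent
    u, v = edges[0]                      # this edge must be covered by u or by v
    return (vertex_cover_at_most_k([e for e in edges if u not in e], k - 1)
            or vertex_cover_at_most_k([e for e in edges if v not in e], k - 1))

# A path on four vertices has a vertex cover of size 2 but not of size 1.
path = [(1, 2), (2, 3), (3, 4)]
assert vertex_cover_at_most_k(path, 2) and not vertex_cover_at_most_k(path, 1)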
Neil Immerman is an American theoretical computer scientist, a professor of computer science at the University of Massachusetts Amherst. He is one of the key developers of descriptive complexity, an approach he is currently applying to research in model checking, database theory, and computational complexity theory.
Finite model theory is a subarea of model theory. Model theory is the branch of logic which deals with the relation between a formal language (syntax) and its interpretations (semantics). Finite model theory is a restriction of model theory to interpretations on finite structures, which have a finite universe.
In computational complexity, problems that are in the complexity class NP but are neither in the class P nor NP-complete are called NP-intermediate, and the class of such problems is called NPI. Ladner's theorem, shown in 1975 by Richard E. Ladner, is a result asserting that, if P ≠ NP, then NPI is not empty; that is, NP contains problems that are neither in P nor NP-complete. Since it is also true that if NPI problems exist, then P ≠ NP, it follows that P = NP if and only if NPI is empty.
In computer science, a kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which inputs to the algorithm are replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel to the desired output for the original problem.
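A classical example is Buss's kernelization for Vertex Cover with parameter k. The sketch below (illustrative Python, with names chosen for this example) applies the rule that a vertex of degree greater than k must belong to every cover of size at most k; once the rule is exhausted, a yes-instance can retain at most k² edges, so the kernel size depends only on the parameter:

# Minimal sketch of a kernelization for Vertex Cover (parameter k), after Buss.
from collections import defaultdict

def vertex_cover_kernel(edges, k):
    """Return a reduced (kernel, remaining budget) pair, or None for a detected no-instance."""
    edges = {frozenset(e) for e in edges}
    while True:
        degree = defaultdict(int)
        for e in edges:
            for v in e:
                degree[v] += 1
        high = next((v for v, d in degree.items() if d > k), None)
        if high is None:
            break
        # A vertex of degree > k must belong to every cover of size <= k,
        # since otherwise all of its more than k neighbours would have to.
        edges = {e for e in edges if high not in e}
        k -= 1
        if k < 0:
            return None
    # Every remaining vertex has degree <= k, so a yes-instance has at most k*k edges.
    if len(edges) > k * k:
        return None
    return edges, k

# Example: vertex 1 has degree 3 > 2, so it is forced into the cover and the kernel is empty.
print(vertex_cover_kernel([(1, 2), (1, 3), (1, 4)], 2))    # (set(), 1)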
In the study of graph algorithms, Courcelle's theorem is the statement that every graph property definable in the monadic second-order logic of graphs can be decided in linear time on graphs of bounded treewidth. The result was first proved by Bruno Courcelle in 1990 and independently rediscovered by Borie, Parker & Tovey (1992). It is considered the archetype of algorithmic meta-theorems.
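As an illustration (the formula is a standard one, shown here for exposition rather than taken from the article's sources), 3-colourability is definable in monadic second-order logic by quantifying over three sets of vertices, so by Courcelle's theorem it can be decided in linear time on graphs of bounded treewidth:

$$\exists R\,\exists G\,\exists B\;\Big[\forall v\,\big(R(v)\vee G(v)\vee B(v)\big)\;\wedge\;\forall u\,\forall v\,\Big(E(u,v)\rightarrow \neg\big(R(u)\wedge R(v)\big)\wedge\neg\big(G(u)\wedge G(v)\big)\wedge\neg\big(B(u)\wedge B(v)\big)\Big)\Big]$$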
Leif Kobbelt is a German university professor for computer science with a specialization in computer graphics. Since 2001 he has been the head of the Institute for Computer Graphics and Multimedia at RWTH Aachen University.
Johann (János) A. Makowsky is a Hungarian-born naturalised Swiss mathematician who works in mathematical logic and the logical foundations of computer science and combinatorics. He studied at ETH Zurich from 1967 to 1973. He was a student in Zürich of Ernst Specker and Hans Läuchli in mathematical logic, and of Beno Eckmann and Volker Strassen in algorithmics, and in Warsaw of Andrzej Mostowski and Witek Marek, where he spent 1972 as an exchange student. Makowsky held visiting positions at the Banach Center in Warsaw (Poland), Stanford University (USA), Simon Fraser University (Canada), University of Florence (Italy), MIT (USA), Lausanne University and ETH Zurich (Switzerland). He held regular positions at the Free University of Berlin and the Technion – Israel Institute of Technology, where he was a full professor.
In mathematical logic and computer science, two-variable logic is the fragment of first-order logic where formulae can be written using only two different variables. This fragment is usually studied without function symbols.
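For example (an illustrative formula, not from the article's sources), the sentence "every element is related to something that is in turn related to something" can be written with only two variables by requantifying $x$ inside its own scope:

$$\forall x\,\exists y\,\big(R(x,y)\wedge\exists x\,R(y,x)\big)$$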
In mathematical logic, a fragment of a logical language or theory is a subset of this logical language obtained by imposing syntactical restrictions on the language. Hence, the well-formed formulae of the fragment are a subset of those in the original logic. However, the semantics of the formulae in the fragment and in the logic coincide, and any formula of the fragment can be expressed in the original logic.
Gabriele Nebe is a German mathematician with contributions in the theory of lattices, modular forms, spherical designs, and error-correcting codes. With Neil Sloane, she maintains the Online Catalogue of Lattices. She is a professor in the department of mathematics at RWTH Aachen University.
Ludwig Staiger is a German mathematician and computer scientist at the Martin Luther University of Halle-Wittenberg.
In the mathematical fields of graph theory and finite model theory, the logic of graphs deals with formal specifications of graph properties using sentences of mathematical logic. There are several variations in the types of logical operation that can be used in these sentences. The first-order logic of graphs concerns sentences in which the variables and predicates concern individual vertices and edges of a graph, while monadic second-order graph logic allows quantification over sets of vertices or edges. Logics based on least fixed point operators allow more general predicates over tuples of vertices, but these predicates can only be constructed through fixed-point operators, restricting their power.
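For instance (an illustrative sentence, not from the article's sources), triangle-freeness is a first-order property of graphs:

$$\neg\,\exists x\,\exists y\,\exists z\,\big(E(x,y)\wedge E(y,z)\wedge E(x,z)\big)$$

whereas a global property such as connectivity requires quantification over sets of vertices and is expressible in monadic second-order logic but not in first-order logic.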
In graph theory, a branch of mathematics, two graphs G and H are called homomorphically equivalent if there exists a graph homomorphism f : G → H and a graph homomorphism g : H → G. An example usage of this notion is that any two cores of a graph are homomorphically equivalent.
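A brute-force check of this definition for small graphs (an illustrative Python sketch; the representation as vertex and edge lists is an assumption of the example, not from the article's sources):

# Hypothetical brute-force check of homomorphic equivalence for small graphs.
# A homomorphism G -> H is a map on vertices that sends every edge of G to an edge of H.
from itertools import product

def has_homomorphism(g_vertices, g_edges, h_vertices, h_edges):
    h_edge_set = {frozenset(e) for e in h_edges}
    for image in product(h_vertices, repeat=len(g_vertices)):
        f = dict(zip(g_vertices, image))
        if all(frozenset((f[u], f[v])) in h_edge_set for u, v in g_edges):
            return True
    return False

def homomorphically_equivalent(g, h):
    # g and h are pairs (vertices, edges)
    return has_homomorphism(*g, *h) and has_homomorphism(*h, *g)

# Example: the 4-cycle and a single edge are homomorphically equivalent
# (both have the single edge K2 as their core).
c4 = ([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1)])
k2 = (['a', 'b'], [('a', 'b')])
assert homomorphically_equivalent(c4, k2)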
Algorithms and Combinatorics is a book series in mathematics, and particularly in combinatorics and the design and analysis of algorithms. It is published by Springer Science+Business Media, and was founded in 1987.
In graph theory and theoretical computer science, the colour refinement algorithm, also known as naive vertex classification or as the 1-dimensional version of the Weisfeiler-Leman algorithm, is a routine used for testing whether two graphs are isomorphic. While it distinguishes almost all pairs of non-isomorphic graphs, it fails on others; for example, it cannot distinguish any two regular graphs that have the same degree and the same number of vertices.
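The refinement loop is simple to sketch (illustrative Python; the adjacency-dictionary representation and function names are assumptions of this example): each vertex's colour is repeatedly replaced by its old colour together with the multiset of its neighbours' colours until the partition stabilises, and two graphs can be isomorphic only if refining their disjoint union assigns both the same multiset of stable colours.

# Hypothetical sketch of colour refinement (1-dimensional Weisfeiler-Leman).
# adj maps each vertex to the list of its neighbours.
def colour_refinement(adj):
    colour = {v: 0 for v in adj}                      # start with a uniform colouring
    while True:
        # New colour = old colour together with the sorted multiset of neighbours' colours.
        signature = {v: (colour[v], tuple(sorted(colour[u] for u in adj[v]))) for v in adj}
        # Rename signatures to small integers so colours stay comparable across rounds.
        names = {sig: i for i, sig in enumerate(sorted(set(signature.values())))}
        new_colour = {v: names[signature[v]] for v in adj}
        if new_colour == colour:                      # partition is stable
            return colour
        colour = new_colour

def possibly_isomorphic(adj_g, adj_h):
    # Refine the disjoint union so colour names are comparable across the two graphs,
    # then compare the multisets of stable colours.  This is a necessary condition only.
    union = {('g', v): [('g', u) for u in nbrs] for v, nbrs in adj_g.items()}
    union.update({('h', v): [('h', u) for u in nbrs] for v, nbrs in adj_h.items()})
    stable = colour_refinement(union)
    mult_g = sorted(c for (side, _), c in stable.items() if side == 'g')
    mult_h = sorted(c for (side, _), c in stable.items() if side == 'h')
    return mult_g == mult_h

On two non-isomorphic regular graphs of the same degree and the same number of vertices, every vertex keeps the same colour throughout, so a test of this kind cannot tell them apart, which is the limitation noted above.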