Alan M. Frieze

Alan M. Frieze (born 25 October 1945 in London, England) is a professor in the Department of Mathematical Sciences at Carnegie Mellon University, Pittsburgh, United States. He graduated from the University of Oxford in 1966 and obtained his PhD from the University of London in 1975. His research interests lie in combinatorics, discrete optimisation and theoretical computer science. Currently, he focuses on the probabilistic aspects of these areas, in particular the study of the asymptotic properties of random graphs, the average-case analysis of algorithms, and randomised algorithms. His recent work has included approximate counting and volume computation via random walks, finding edge-disjoint paths in expander graphs, and exploring anti-Ramsey theory and the stability of routing algorithms.

Key contributions

Two key contributions made by Alan Frieze are:

(1) a polynomial-time algorithm for approximating the volume of convex bodies;

(2) an algorithmic version of the Szemerédi regularity lemma.

Both algorithms are described briefly below.

Polynomial-time algorithm for approximating the volume of convex bodies

The paper [1] is a joint work by Martin Dyer, Alan Frieze and Ravindran Kannan.

The main result of the paper is a randomised algorithm for finding an approximation to the volume of a convex body $K$ in $n$-dimensional Euclidean space, assuming the existence of a membership oracle. The algorithm takes time bounded by a polynomial in $n$, the dimension of $K$, and $1/\varepsilon$, where $\varepsilon$ is the desired relative error.
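The reduction from volume computation to sampling goes through a telescoping product. In a standard presentation of the scheme (the precise sequence of bodies used in [1] differs in its details), one fixes convex bodies $K_0 \subseteq K_1 \subseteq \cdots \subseteq K_m = K$ with $K_0$ a ball of known volume, and writes

$$\operatorname{vol}(K) = \operatorname{vol}(K_0) \prod_{i=1}^{m} \frac{\operatorname{vol}(K_i)}{\operatorname{vol}(K_{i-1})}.$$

Each ratio $\operatorname{vol}(K_{i-1})/\operatorname{vol}(K_i)$ is then estimated as the fraction of nearly uniform samples from $K_i$ that land in $K_{i-1}$; choosing the $K_i$ so that consecutive volumes differ by at most a constant factor keeps every estimate well conditioned.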

The algorithm is a sophisticated use of the so-called Markov chain Monte Carlo (MCMC) method. The basic scheme of the algorithm is nearly uniform sampling from within $K$ by placing a grid of $n$-dimensional cubes and doing a random walk over these cubes. Using the theory of rapidly mixing Markov chains, they show that the random walk settles down to a nearly uniform distribution within a polynomial number of steps.
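The following is a minimal sketch of such a grid walk (illustrative Python, not the algorithm of [1], which additionally smooths $K$ and controls the error carefully; `in_body` stands in for the membership oracle, and the step count and grid side `delta` are placeholder choices):

```python
import random

def grid_walk_sample(in_body, dim, delta, steps, start):
    """Return a nearly uniform sample from the grid cubes inside a body.

    in_body: membership oracle mapping a point (tuple of floats) to bool.
    delta:   side length of the grid cubes.
    start:   lattice point (tuple of ints) whose cube lies inside the body.

    The walk lives on cube centres: propose a uniformly random
    axis-neighbour and move only if its centre stays inside the body,
    otherwise hold.  Holding keeps the uniform distribution over the
    feasible cubes stationary; showing that the chain mixes rapidly for
    a convex body is the heart of the analysis in [1].
    """
    x = list(start)
    for _ in range(steps):
        i = random.randrange(dim)            # pick a coordinate
        s = random.choice((-1, 1))           # and a direction
        x[i] += s
        centre = tuple((c + 0.5) * delta for c in x)
        if not in_body(centre):              # infeasible: undo the move
            x[i] -= s
    return tuple((c + 0.5) * delta for c in x)

# Example: sample a point from the unit ball in 3 dimensions.
if __name__ == "__main__":
    ball = lambda p: sum(t * t for t in p) <= 1.0
    print(grid_walk_sample(ball, dim=3, delta=0.1, steps=20000, start=(0, 0, 0)))
```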

Algorithmic version of the Szemerédi regularity partition

This paper [2] is a joint work by Alan Frieze and Ravindran Kannan. They use two lemmas to derive the algorithmic version of the Szemerédi regularity lemma to find an $\varepsilon$-regular partition.


Lemma 1:
Fix $k$ and $\gamma$, and let $G = (V, E)$ be a graph with $n$ vertices. Let $P$ be an equitable partition of $V$ into classes $V_0, V_1, \ldots, V_k$. Assume $|V_1| > 4^{2k}$ and $4^k > 600\gamma^{-5}$. Given proofs that more than $\gamma k^2$ pairs $(V_r, V_s)$ are not $\gamma$-regular, it is possible to find in $O(n)$ time an equitable partition $P'$ (which is a refinement of $P$) into $1 + k4^k$ classes, with an exceptional class of cardinality at most $|V_0| + n/4^k$, and such that $\operatorname{ind}(P') \ge \operatorname{ind}(P) + \gamma^5/20$. Here $\operatorname{ind}(P)$ denotes the index (mean-square density) of the partition; since $\operatorname{ind}(P) \le 1/2$ always, the lemma can be invoked only $O(\gamma^{-5})$ times.


Lemma 2:
Let $W$ be an $R \times C$ matrix with $|R| = p$, $|C| = q$ and $\|W\|_\infty \le 1$, and let $\gamma$ be a positive real. Write $\sigma_1(W)$ for the largest singular value of $W$ and $W(S, T) = \sum_{i \in S} \sum_{j \in T} W_{ij}$.
(a) If there exist $S \subseteq R$, $T \subseteq C$ such that $|S| \ge \gamma p$, $|T| \ge \gamma q$ and $|W(S, T)| \ge \gamma |S||T|$, then $\sigma_1(W) \ge \gamma^3 \sqrt{pq}$.
(b) If $\sigma_1(W) \ge \gamma \sqrt{pq}$, then there exist $S \subseteq R$, $T \subseteq C$ such that $|S| \ge \gamma' p$, $|T| \ge \gamma' q$ and $W(S, T) \ge \gamma' |S||T|$, where $\gamma' = \gamma^3/108$. Furthermore, $S$ and $T$ can be constructed in polynomial time.
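Lemma 2 is what makes regularity algorithmically checkable: part (a) says that any witness of irregularity forces a large first singular value, and part (b) converts a large first singular value back into an explicit witness. A minimal sketch of the singular-value test in Python follows; it assumes the usual construction of $W$ as the adjacency block of the pair minus its edge density, a notational assumption rather than a quotation from [2].

```python
import numpy as np

def sigma1_test(adj, Vr, Vs, gamma):
    """First-singular-value test for a pair of vertex classes.

    adj:    n x n 0/1 adjacency matrix of the graph.
    Vr, Vs: index lists for the two classes.

    Builds W = A[Vr, Vs] - d, where d is the edge density of the pair,
    and compares sigma_1(W) with gamma * sqrt(|Vr| * |Vs|).  Returns
    True when the test finds no evidence of irregularity at this
    threshold (cf. the contrapositive of Lemma 2(a)).
    """
    block = adj[np.ix_(Vr, Vs)].astype(float)
    d = block.mean()                      # edge density of the pair
    W = block - d                         # deviation from "random" behaviour
    sigma1 = np.linalg.svd(W, compute_uv=False)[0]
    return sigma1 < gamma * np.sqrt(len(Vr) * len(Vs))
```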


These two lemmas are combined in the following algorithmic construction of the Szemerédi regularity lemma; a code sketch of the resulting loop follows the steps.


[Step 1] Arbitrarily divide the vertices of $G$ into an equitable partition $P_1$ with classes $V_0, V_1, \ldots, V_b$, where $|V_i| = \lfloor n/b \rfloor$ for $i \ge 1$ and hence $|V_0| < b$. Denote $k_1 = b$.
[Step 2] For every pair $(V_r, V_s)$ of $P_i$, compute $\sigma_1(W_{r,s})$, where $W_{r,s}$ is the $|V_r| \times |V_s|$ matrix whose $(i, j)$ entry is $A_{ij} - d(V_r, V_s)$, $A$ being the adjacency matrix and $d(V_r, V_s)$ the edge density of the pair. If $\sigma_1(W_{r,s}) < \varepsilon^3 \sqrt{|V_r||V_s|}$, the pair is $\varepsilon$-regular by Lemma 2(a); otherwise, Lemma 2(b) yields a proof that the pair is not $\gamma$-regular for $\gamma = \varepsilon^9/108$.
[Step 3] If at most $\varepsilon \binom{k_i}{2}$ pairs produce proofs of non-regularity, then halt: $P_i$ is $\varepsilon$-regular.
[Step 4] Apply Lemma 1 with $P = P_i$, $k = k_i$, $\gamma = \varepsilon^9/108$, and obtain $P'$ with $1 + k_i 4^{k_i}$ classes.
[Step 5] Let $k_{i+1} = k_i 4^{k_i}$, $P_{i+1} = P'$, $i = i + 1$, and go to Step 2.
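As a minimal sketch of this loop (Python, reusing `sigma1_test` from above; `refine` stands for the refinement procedure of Lemma 1, which is the substantial construction in [2] and is taken as given here):

```python
import math
from itertools import combinations

def regularity_partition(adj, classes, eps, sigma1_test, refine):
    """Iterate Steps 2-5 until an eps-regular partition is reached.

    classes: list of index lists V_1, ..., V_k (the exceptional class
             V_0 is left out of the pairwise tests).
    sigma1_test: the singular-value check of Lemma 2.
    refine:  callable implementing the refinement of Lemma 1.
    """
    thresh = eps ** 3                  # sigma_1 threshold from Step 2
    while True:
        k = len(classes)
        bad = [(r, s) for r, s in combinations(range(k), 2)
               if not sigma1_test(adj, classes[r], classes[s], thresh)]
        if len(bad) <= eps * math.comb(k, 2):    # Step 3: few bad pairs
            return classes                       # partition is eps-regular
        # Lemma 2(b) turns each failure into a witness of
        # non-(eps^9/108)-regularity; Lemma 1 refines on those witnesses.
        classes = refine(adj, classes, bad, eps ** 9 / 108)
```

Because Lemma 1 raises $\operatorname{ind}(P)$ by at least $\gamma^5/20$ per iteration and the index never exceeds $1/2$, the loop terminates after a bounded number of refinements.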

Awards and honours

Frieze received the 1991 Fulkerson Prize in discrete mathematics (jointly with Martin Dyer and Ravi Kannan) for the polynomial-time volume-approximation algorithm described above. He was named a SIAM Fellow in the class of 2011 [3] and a Fellow of the American Mathematical Society in 2012. [4]

Personal life

Frieze is married to Carol Frieze, who directs two outreach efforts for the computer science department at Carnegie Mellon University. [5]

References

  1. Dyer, Martin; Frieze, Alan; Kannan, Ravi (1991). "A random polynomial-time algorithm for approximating the volume of convex bodies". Journal of the ACM. Vol. 38, no. 1. pp. 1–17.
  2. Frieze, Alan; Kannan, Ravi (1999). "A Simple Algorithm for Constructing Szemerédi's Regularity Partition" (PDF). Electronic Journal of Combinatorics. Vol. 6.
  3. SIAM Fellows: Class of 2011.
  4. List of Fellows of the American Mathematical Society. Retrieved 29 December 2012.
  5. Frieze, Carol. "Personal". Carnegie Mellon University. Retrieved 20 January 2019.