First passage percolation is a mathematical method used to describe the paths reachable in a random medium within a given amount of time.
First passage percolation is one of the most classical areas of probability theory. It was first introduced by John Hammersley and Dominic Welsh in 1965 as a model of fluid flow in a porous medium. [1] It is part of percolation theory, and classical Bernoulli percolation can be viewed as a special case of first passage percolation.
Much of the beauty of the model lies in its simple definition (as a random metric space) and in the fact that several of its most fascinating conjectures require little effort to state. Typically, the goal of first passage percolation is to understand a random distance on a graph whose edges are assigned random weights. Most questions concern either finding the path of least total weight between two points, known as a geodesic, or understanding how the random geometry behaves at large scales.
As is the case in percolation theory in general, many of the problems related to first passage percolation involve finding optimal routes or optimal times. The model is defined as follows. [2] Let $G = (V, E)$ be a graph. We place a non-negative random variable $\tau_e$, called the passage time of the edge $e$, at each nearest-neighbor edge $e$ of the graph $G$. The collection $(\tau_e)_{e \in E}$ is usually assumed to be independent and identically distributed, but there are variants of the model. The random variable $\tau_e$ is interpreted as the time or the cost needed to traverse the edge $e$.
Since each edge in first passage percolation has its own individual weight (or time), the total time of a path $\gamma$ can be written as the sum of the weights of its edges, $T(\gamma) = \sum_{e \in \gamma} \tau_e$. [3]
Given two vertices $x$ and $y$ of $V$, one then sets $T(x, y) = \inf_{\gamma} T(\gamma)$, where the infimum is over all finite paths $\gamma$ that start at $x$ and end at $y$. The function $T$ induces a random pseudo-metric on $V$.
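As a concrete illustration, $T(x, y)$ can be computed for sampled weights with Dijkstra's algorithm, since all passage times are non-negative. The sketch below is only illustrative, not part of the model: the Exponential(1) edge weights, the finite box of the square lattice, the endpoints, and the function names are arbitrary choices.

```python
import heapq
import random

def first_passage_time(n, source, target, seed=0):
    """Compute T(source, target) for first passage percolation on an n x n box
    of the square lattice with i.i.d. Exponential(1) passage times, using
    Dijkstra's algorithm on the sampled weights."""
    rng = random.Random(seed)
    weight = {}
    def tau(u, v):
        # Lazily sample a passage time for each nearest-neighbor edge,
        # keyed by the unordered pair of endpoints.
        key = (min(u, v), max(u, v))
        if key not in weight:
            weight[key] = rng.expovariate(1.0)
        return weight[key]

    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue
        x, y = u
        for v in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= v[0] < n and 0 <= v[1] < n:
                nd = d + tau(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
    return float("inf")

print(first_passage_time(50, (0, 0), (49, 49)))
```

For the sampled weights, the path attaining this minimum is a geodesic between the two chosen vertices.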
The most famous model of first passage percolation is on the lattice $\mathbb{Z}^d$. One of its best-known questions is "What does a ball of large radius look like?". This question was raised in the original paper of Hammersley and Welsh in 1965 and gave rise to the Cox-Durrett limit shape theorem in 1981. [4] Although the Cox-Durrett theorem establishes the existence of the limit shape, not many properties of this set are known. For instance, it is expected that under mild assumptions this set should be strictly convex. As of 2016, the best result is the existence of the Auffinger-Damron differentiability point in the Durrett-Liggett flat edge case. [5]
There are also some specific examples of first passage percolation that can be analyzed using Markov chains. For example, first passage percolation on the complete graph can be described using Markov chains and recursive trees, [6] and on strips of width 2 it can be described using a Markov chain and solved using a Harris chain. [7]
First passage percolation is well-known for giving rise to other tools of mathematics, including the Subadditive Ergodic Theorem, a fundamental result in ergodic theory.
Outside mathematics, the Eden growth model is used to model bacterial growth and the deposition of material. Another example is comparing a minimized cost from the Vickrey–Clarke–Groves auction (VCG auction) with a minimized path cost from first passage percolation to gauge how pessimistic the VCG auction is at its lower limit. Both problems are solved similarly, and one can find distributions to use in auction theory.
In probability theory, the central limit theorem (CLT) establishes that, in many situations, when independent random variables are summed up, their properly normalized sum tends toward a normal distribution even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions. This theorem has seen many changes during the formal development of probability theory. Previous versions of the theorem date back to 1811, but in its modern general form, this fundamental result in probability theory was precisely stated as late as 1920, thereby serving as a bridge between classical and modern probability theory.
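A small numerical illustration of the theorem (the Uniform(0, 1) summands, sample sizes, and seed are arbitrary choices): sums of many independent non-normal variables, once centered and scaled, behave approximately like a standard normal variable.

```python
import math
import random

def normalized_uniform_sum(n, rng):
    # Sum n i.i.d. Uniform(0,1) variables, then center by the mean n/2 and
    # scale by the standard deviation sqrt(n/12); by the CLT the result is
    # approximately standard normal for large n.
    s = sum(rng.random() for _ in range(n))
    return (s - n / 2) / math.sqrt(n / 12)

rng = random.Random(0)
samples = [normalized_uniform_sum(100, rng) for _ in range(10_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Fraction within one standard deviation; about 0.683 for a standard normal.
within_one_sd = sum(abs(x) <= 1 for x in samples) / len(samples)
print(mean, var, within_one_sd)
```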
In statistical physics and mathematics, percolation theory describes the behavior of a network when nodes or links are added. This is a geometric type of phase transition: at a critical fraction of added nodes or links, a network of small, disconnected clusters merges into a significantly larger, connected, so-called spanning cluster. The applications of percolation theory to materials science and in many other disciplines are discussed here and in the articles network theory and percolation.
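The transition can be seen in a toy simulation. The sketch below, with arbitrarily chosen grid size and number of trials, opens each site of a square grid independently with probability p and checks for an open cluster crossing from top to bottom; crossings become typical once p exceeds the critical fraction (about 0.593 for site percolation on the square lattice).

```python
import random
from collections import deque

def spans(n, p, seed=0):
    """Open each site of an n x n grid independently with probability p and
    report whether an open cluster connects the top row to the bottom row."""
    rng = random.Random(seed)
    open_site = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    queue = deque((0, j) for j in range(n) if open_site[0][j])
    seen = set(queue)
    while queue:
        i, j = queue.popleft()
        if i == n - 1:
            return True
        for a, b in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= a < n and 0 <= b < n and open_site[a][b] and (a, b) not in seen:
                seen.add((a, b))
                queue.append((a, b))
    return False

# Estimated crossing probability for a few densities p.
for p in (0.4, 0.55, 0.65, 0.75):
    print(p, sum(spans(60, p, seed=s) for s in range(20)) / 20)
```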
In mathematics, a random walk is a mathematical object, known as a stochastic or random process, that describes a path that consists of a succession of random steps on some mathematical space such as the integers.
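A minimal example of a random walk on the integers (the step count and seed are arbitrary choices):

```python
import random

# Simple random walk on the integers: start at 0 and repeatedly take a step
# of +1 or -1, each with probability 1/2.
rng = random.Random(0)
position = 0
path = [position]
for _ in range(10):
    position += rng.choice((-1, 1))
    path.append(position)
print(path)
```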
In graph theory, a component of an undirected graph is a connected subgraph that is not part of any larger connected subgraph. The components of any graph partition its vertices into disjoint sets, and are the induced subgraphs of those sets. A graph that is itself connected has exactly one component, consisting of the whole graph. Components are also sometimes called connected components.
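For illustration, the components of a small undirected graph can be found by breadth-first search; the example graph below is an arbitrary choice.

```python
from collections import deque

def components(vertices, edges):
    """Partition the vertex set of an undirected graph into its connected
    components using breadth-first search."""
    adjacency = {v: set() for v in vertices}
    for u, v in edges:
        adjacency[u].add(v)
        adjacency[v].add(u)
    remaining = set(vertices)
    parts = []
    while remaining:
        start = remaining.pop()
        component = {start}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for w in adjacency[u]:
                if w in remaining:
                    remaining.remove(w)
                    component.add(w)
                    queue.append(w)
        parts.append(component)
    return parts

# Two components, {1, 2, 3} and {4, 5}; their vertex sets are disjoint.
print(components([1, 2, 3, 4, 5], [(1, 2), (2, 3), (4, 5)]))
```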
In mathematics, loop-erased random walk is a model for a random simple path with important applications in combinatorics and, in physics, quantum field theory. It is intimately connected to the uniform spanning tree, a model for a random tree. See also random walk for a more general treatment of this topic.
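The loop erasure itself is a simple procedure: follow the walk and, each time it revisits a site already kept, delete the cycle just created. The sketch below applies it to a simple random walk on $\mathbb{Z}^2$ stopped on leaving a box; the box size, seed, and function names are arbitrary illustrative choices.

```python
import random

def loop_erase(path):
    """Chronologically erase loops: whenever the walk revisits a site already
    on the erased path, remove the cycle created by that revisit."""
    erased = []
    index = {}
    for v in path:
        if v in index:
            # Keep everything up to the first visit of v; drop the loop.
            for w in erased[index[v] + 1:]:
                del index[w]
            del erased[index[v] + 1:]
        else:
            index[v] = len(erased)
            erased.append(v)
    return erased

def lerw_in_box(n, seed=0):
    """Run a simple random walk on Z^2 from the origin until it leaves the box
    [-n, n]^2, then return its loop erasure (a random simple path)."""
    rng = random.Random(seed)
    x, y = 0, 0
    path = [(x, y)]
    while -n <= x <= n and -n <= y <= n:
        dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        x, y = x + dx, y + dy
        path.append((x, y))
    return loop_erase(path)

print(lerw_in_box(10))
```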
In the domain of physics and probability, a Markov random field (MRF), Markov network or undirected graphical model is a set of random variables having a Markov property described by an undirected graph. In other words, a random field is said to be a Markov random field if it satisfies Markov properties.
A factor graph is a bipartite graph representing the factorization of a function. In probability theory and its applications, factor graphs are used to represent factorization of a probability distribution function, enabling efficient computations, such as the computation of marginal distributions through the sum-product algorithm. One of the important success stories of factor graphs and the sum-product algorithm is the decoding of capacity-approaching error-correcting codes, such as LDPC and turbo codes.
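A toy example (the binary variables and the factors f and g are arbitrary choices, not from any particular application) shows how the sum-product algorithm exploits the factorization: the marginal of the middle variable of a three-variable chain is obtained from two local messages and agrees with brute-force enumeration.

```python
from itertools import product

# Toy factorization p(x1, x2, x3) proportional to f(x1, x2) * g(x2, x3) over
# binary variables: a factor graph with variable nodes x1, x2, x3 and factor
# nodes f, g.
f = {(a, b): 1.0 + a + 2 * b for a in (0, 1) for b in (0, 1)}
g = {(b, c): 1.0 + 3 * b * c for b in (0, 1) for c in (0, 1)}

# Sum-product on this chain: the message from f to x2 sums out x1, the message
# from g to x2 sums out x3; their product gives the (unnormalized) marginal.
msg_f_to_x2 = {b: sum(f[(a, b)] for a in (0, 1)) for b in (0, 1)}
msg_g_to_x2 = {b: sum(g[(b, c)] for c in (0, 1)) for b in (0, 1)}
unnormalized = {b: msg_f_to_x2[b] * msg_g_to_x2[b] for b in (0, 1)}
z = sum(unnormalized.values())
marginal_sum_product = {b: unnormalized[b] / z for b in (0, 1)}

# Brute-force check: enumerate all joint configurations.
joint = {(a, b, c): f[(a, b)] * g[(b, c)] for a, b, c in product((0, 1), repeat=3)}
zz = sum(joint.values())
marginal_brute = {b: sum(w for (a, bb, c), w in joint.items() if bb == b) / zz
                  for b in (0, 1)}

print(marginal_sum_product, marginal_brute)
```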
The contact process is a stochastic process used to model population growth on the set of sites of a graph, in which occupied sites become vacant at a constant rate, while vacant sites become occupied at a rate proportional to the number of occupied neighboring sites. Therefore, if we denote the proportionality constant by $\lambda$, each site remains occupied for a random time period which is exponentially distributed with parameter 1 and places descendants at every vacant neighboring site at the times of the events of a Poisson process with parameter $\lambda$ during this period. All processes are independent of one another and of the random periods of time sites remain occupied. The contact process can also be interpreted as a model for the spread of an infection: thinking of particles as a bacterium spreading over individuals positioned at the sites of the graph, occupied sites correspond to infected individuals, whereas vacant ones correspond to healthy individuals.
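The dynamics can be simulated directly from this description. The sketch below runs the contact process on a cycle with a Gillespie-style update; the cycle length, the rate $\lambda$, the time horizon, and the function name are arbitrary illustrative choices.

```python
import random

def contact_process(n, lam, t_max, seed=0):
    """Simulate the contact process on a cycle of n sites up to time t_max.
    Occupied sites become vacant at rate 1; a vacant site becomes occupied at
    rate lam times its number of occupied neighbors."""
    rng = random.Random(seed)
    occupied = set(range(n))          # start fully occupied
    t = 0.0
    while occupied and t < t_max:
        rates = {}
        for site in range(n):
            if site in occupied:
                rates[site] = 1.0     # becomes vacant at constant rate
            else:
                k = ((site - 1) % n in occupied) + ((site + 1) % n in occupied)
                if k:
                    rates[site] = lam * k   # becomes occupied
        total = sum(rates.values())
        t += rng.expovariate(total)   # time until the next event
        r = rng.random() * total      # pick the event proportionally to its rate
        for site, rate in rates.items():
            r -= rate
            if r <= 0:
                break
        occupied ^= {site}            # flip the chosen site's state
    return t, len(occupied)

print(contact_process(100, 2.0, 50.0))
```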
In probability theory, the Schramm–Loewner evolution with parameter κ, also known as stochastic Loewner evolution (SLEκ), is a family of random planar curves that have been proven to be the scaling limit of a variety of two-dimensional lattice models in statistical mechanics. Given a parameter κ and a domain U in the complex plane, it gives a family of random curves in U, with κ controlling how much the curve turns. There are two main variants of SLE, chordal SLE, which gives a family of random curves from two fixed boundary points, and radial SLE, which gives a family of random curves from a fixed boundary point to a fixed interior point. These curves are defined to satisfy conformal invariance and a domain Markov property.
In the mathematical field of graph theory, the Erdős–Rényi model is either of two closely related models for generating random graphs or the evolution of a random network. They are named after Hungarian mathematicians Paul Erdős and Alfréd Rényi, who first introduced one of the models in 1959, while Edgar Gilbert introduced the other model contemporaneously and independently of Erdős and Rényi. In the model of Erdős and Rényi, all graphs on a fixed vertex set with a fixed number of edges are equally likely; in the model introduced by Gilbert, also called the Erdős–Rényi–Gilbert model, each edge has a fixed probability of being present or absent, independently of the other edges. These models can be used in the probabilistic method to prove the existence of graphs satisfying various properties, or to provide a rigorous definition of what it means for a property to hold for almost all graphs.
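Sampling from the Gilbert G(n, p) variant is straightforward: each of the n(n-1)/2 possible edges is kept independently with probability p. The values of n and p below are arbitrary illustrative choices.

```python
import random
from itertools import combinations

def gnp(n, p, seed=0):
    """Sample a graph from the Gilbert G(n, p) model: each possible edge is
    present independently with probability p."""
    rng = random.Random(seed)
    return [e for e in combinations(range(n), 2) if rng.random() < p]

edges = gnp(10, 0.3)
print(len(edges), edges)
```

In the Erdős–Rényi G(n, M) variant, one would instead pick a uniformly random set of exactly M of those possible edges.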
In graph theory the conductance of a graph G=(V,E) measures how "well-knit" the graph is: it controls how fast a random walk on G converges to its stationary distribution. The conductance of a graph is often called the Cheeger constant of a graph as the analog of its counterpart in spectral geometry. Since electrical networks are intimately related to random walks with a long history in the usage of the term "conductance", this alternative name helps avoid possible confusion.
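For a small graph the conductance can be computed by brute force over all cuts, using the definition $\phi(S) = e(S, V \setminus S) / \min(\mathrm{vol}(S), \mathrm{vol}(V \setminus S))$ and minimizing over nontrivial vertex subsets S; the example graph below is an arbitrary choice.

```python
from itertools import combinations

def conductance(vertices, edges):
    """Brute-force the conductance of a small undirected graph:
    the minimum over cuts S of e(S, V\\S) / min(vol(S), vol(V\\S)),
    where vol(S) is the sum of degrees of the vertices in S."""
    degree = {v: 0 for v in vertices}
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    best = float("inf")
    for k in range(1, len(vertices)):
        for subset in combinations(vertices, k):
            s = set(subset)
            cut = sum(1 for u, v in edges if (u in s) != (v in s))
            vol_s = sum(degree[v] for v in s)
            vol_rest = sum(degree[v] for v in vertices if v not in s)
            if min(vol_s, vol_rest) > 0:
                best = min(best, cut / min(vol_s, vol_rest))
    return best

# A 4-cycle has conductance 1/2.
print(conductance([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)]))
```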
In the mathematical subject of group theory, the Grushko theorem or the Grushko–Neumann theorem is a theorem stating that the rank of a free product of two groups is equal to the sum of the ranks of the two free factors. The theorem was first obtained in a 1940 article of Grushko and then, independently, in a 1943 article of Neumann.
Harry Kesten was an American mathematician best known for his work in probability, most notably on random walks on groups and graphs, random matrices, branching processes, and percolation theory.
In mathematics, the Fortuin–Kasteleyn–Ginibre (FKG) inequality is a correlation inequality, a fundamental tool in statistical mechanics and probabilistic combinatorics, due to Cees M. Fortuin, Pieter W. Kasteleyn, and Jean Ginibre (1971). Informally, it says that in many random systems, increasing events are positively correlated, while an increasing and a decreasing event are negatively correlated. It was obtained by studying the random cluster model.
This page lists articles related to probability theory. In particular, it lists many articles corresponding to specific probability distributions. Such articles are marked here by a code of the form (X:Y), which refers to number of random variables involved and the type of the distribution. For example (2:DC) indicates a distribution with two random variables, discrete or continuous. Other codes are just abbreviations for topics. The list of codes can be found in the table of contents.
The Hammersley–Clifford theorem is a result in probability theory, mathematical statistics and statistical mechanics that gives necessary and sufficient conditions under which a strictly positive probability distribution can be represented as events generated by a Markov network. It is the fundamental theorem of random fields. It states that a probability distribution that has a strictly positive mass or density satisfies one of the Markov properties with respect to an undirected graph G if and only if it is a Gibbs random field, that is, its density can be factorized over the cliques of the graph.
In probability theory, an interacting particle system (IPS) is a stochastic process on a configuration space given by a site space, a countably infinite graph, and a local state space, a compact metric space. More precisely, IPS are continuous-time Markov jump processes describing the collective behavior of stochastically interacting components. IPS are the continuous-time analogue of stochastic cellular automata.
In mathematics, the Poisson boundary is a measure space associated to a random walk. It is an object designed to encode the asymptotic behaviour of the random walk, i.e. how trajectories diverge when the number of steps goes to infinity. Despite being called a boundary it is in general a purely measure-theoretical object and not a boundary in the topological sense. However, in the case where the random walk is on a topological space the Poisson boundary can be related to the Martin boundary which is an analytic construction yielding a genuine topological boundary. Both boundaries are related to harmonic functions on the space via generalisations of the Poisson formula.
Maximal entropy random walk (MERW) is a popular type of biased random walk on a graph, in which transition probabilities are chosen according to the principle of maximum entropy, which says that the probability distribution which best represents the current state of knowledge is the one with the largest entropy. While a standard random walk chooses, for every vertex, a uniform probability distribution among its outgoing edges, locally maximizing the entropy rate, MERW maximizes it globally by assuming a uniform probability distribution among all paths in a given graph.
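Concretely, the MERW transition probabilities are determined by the leading eigenvector $\psi$ and eigenvalue $\lambda$ of the adjacency matrix $A$, via $P_{ij} = (A_{ij}/\lambda)(\psi_j/\psi_i)$. The sketch below computes them by power iteration; the path graph used as an example, the iteration count, and the function name are arbitrary illustrative choices.

```python
def merw_transition_matrix(adj, iterations=1000):
    """MERW transition probabilities P[i][j] = A[i][j] * psi[j] / (lam * psi[i]),
    where psi is the leading (Perron-Frobenius) eigenvector of the adjacency
    matrix A and lam its eigenvalue. Power iteration is run on A + I so that it
    converges even for bipartite graphs; the shift is removed at the end."""
    n = len(adj)
    psi = [1.0] * n
    shifted = 1.0
    for _ in range(iterations):
        nxt = [psi[i] + sum(adj[i][j] * psi[j] for j in range(n)) for i in range(n)]
        shifted = max(nxt)
        psi = [x / shifted for x in nxt]
    lam = shifted - 1.0
    return [[adj[i][j] * psi[j] / (lam * psi[i]) for j in range(n)] for i in range(n)]

# Path graph on 4 vertices: unlike the ordinary random walk, MERW is biased
# toward the middle vertices, reflecting the uniform measure on paths.
adj = [[0, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 0]]
for row in merw_transition_matrix(adj):
    print([round(p, 3) for p in row])
```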