The contact process is a stochastic process used to model population growth on the set of sites of a graph in which occupied sites become vacant at a constant rate, while vacant sites become occupied at a rate proportional to the number of occupied neighboring sites. Therefore, if we denote the proportionality constant by $\lambda$, each site remains occupied for a random time period which is exponentially distributed with parameter 1 and places descendants at every vacant neighboring site at the times of events of a Poisson process with parameter $\lambda$ during this period. All processes are independent of one another and of the random periods of time for which sites remain occupied. The contact process can also be interpreted as a model for the spread of an infection, by thinking of particles as a bacterium spreading over individuals positioned at the sites of the graph; occupied sites correspond to infected individuals, whereas vacant sites correspond to healthy ones.
The main quantity of interest is the number of particles in the process at time $t$, say $N_t$, in the first interpretation, which corresponds to the number of infected individuals in the second one. Therefore, the process survives whenever the number of particles is positive for all times, which corresponds to the case that there are always infected individuals in the second one. For any infinite graph there exists a positive and finite critical value $\lambda_c$ so that if $\lambda > \lambda_c$ then survival of the process starting from a finite number of particles occurs with positive probability, while if $\lambda < \lambda_c$ their extinction is almost certain. Note that, by reductio ad absurdum and the infinite monkey theorem, survival of the process is equivalent to $N_t \to \infty$ as $t \to \infty$, whereas extinction is equivalent to $N_t \to 0$ as $t \to \infty$, and therefore it is natural to ask about the rate at which $N_t \to \infty$ when the process survives.
If the state of the process at time $t$ is $\eta_t$, then a site $x$ in $S$ is occupied, say by a particle, if $\eta_t(x) = 1$ and vacant if $\eta_t(x) = 0$. The contact process is a continuous-time Markov process with state space $\{0,1\}^S$, where $S$ is a finite or countable graph, usually $\mathbb{Z}^d$, and a special case of an interacting particle system. More specifically, the dynamics of the basic contact process is defined by the following transition rates: at site $x$,

$$1 \to 0 \quad \text{at rate } 1, \qquad 0 \to 1 \quad \text{at rate } \lambda \sum_{y \,:\, y \sim x} \eta_t(y),$$
where the sum is over all the neighbors $y$ of $x$ in $S$. This means that each site waits an exponential time with the corresponding rate, and then flips (so 0 becomes 1 and vice versa).
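As an illustration of these rates, the following is a minimal simulation sketch, not a reference implementation: a Gillespie-style event simulation of the basic contact process on a ring of $n$ sites, with the ring standing in for $\mathbb{Z}$ to avoid boundary effects (all function and parameter names are illustrative).

```python
import numpy as np

def simulate_contact_process(n=200, lam=2.0, t_max=50.0, seed=0):
    """Gillespie simulation of the basic contact process on a ring of n sites.

    Each occupied site becomes vacant at rate 1; each vacant site becomes
    occupied at rate lam times its number of occupied neighbours.
    Starts from a single occupied site and returns (times, particle counts).
    """
    rng = np.random.default_rng(seed)
    eta = np.zeros(n, dtype=int)
    eta[n // 2] = 1                                  # one initial particle
    t, times, counts = 0.0, [0.0], [1]
    while t < t_max and eta.any():
        occ_nbrs = np.roll(eta, 1) + np.roll(eta, -1)    # occupied neighbours
        rates = np.where(eta == 1, 1.0, lam * occ_nbrs)  # flip rate per site
        total = rates.sum()
        t += rng.exponential(1.0 / total)                # time to next event
        site = rng.choice(n, p=rates / total)            # which site flips
        eta[site] ^= 1
        times.append(t)
        counts.append(eta.sum())
    return np.array(times), np.array(counts)

times, counts = simulate_contact_process()
print("final number of particles:", counts[-1])
```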
The contact process is a stochastic process that is closely connected to percolation theory. Ted Harris (1974) noted that the contact process on $\mathbb{Z}^d$, when infections and recoveries can occur only at discrete times, corresponds to one-step-at-a-time bond percolation on the graph obtained by orienting each edge of $\mathbb{Z}^{d+1}$ in the direction of increasing coordinate-value.
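The correspondence can be illustrated in the simplest case $d = 1$. The sketch below (parameters illustrative) grows the cluster of open oriented paths from the origin in bond percolation on $\mathbb{Z}^2$, with every edge oriented in the direction of increasing coordinate-value; the sites reached on the $k$-th anti-diagonal play the role of the infected sites at discrete time $k$.

```python
import numpy as np

def oriented_percolation(p=0.7, levels=100, seed=0):
    """Oriented bond percolation on Z^2, edges pointing in the direction of
    increasing coordinate value, each edge open independently with prob. p.
    Returns, level by level (anti-diagonals x+y=k), the sites reachable from
    the origin by open oriented paths.
    """
    rng = np.random.default_rng(seed)
    reached = {(0, 0)}
    history = [reached]
    for _ in range(levels):
        nxt = set()
        for (x, y) in reached:
            if rng.random() < p:          # edge (x,y) -> (x+1,y) open
                nxt.add((x + 1, y))
            if rng.random() < p:          # edge (x,y) -> (x,y+1) open
                nxt.add((x, y + 1))
        if not nxt:
            break                          # the oriented cluster died out
        reached = nxt
        history.append(reached)
    return history

history = oriented_percolation()
print("cluster survived for", len(history) - 1, "levels;",
      "width of last level:", len(history[-1]))
```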
A law of large numbers for the number of particles in the process on the integers informally means that for all large times $t$, $N_t$ is approximately equal to $ct$ for some positive constant $c$. Harris (1974) proved that, if the process survives, then the growth of $N_t$ is at most and at least linear in time. A weak law of large numbers (convergence of $N_t/t$ in probability) was shown by Durrett (1980). A few years later, Durrett and Griffeath (1983) improved this to a strong law of large numbers, giving almost sure convergence of $N_t/t$.
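A quick empirical check, reusing the simulate_contact_process sketch above (so this snippet assumes that earlier block has been run): on surviving runs, $N_t/t$ should stabilize near a constant $c$, provided the ring is large enough that the occupied region has not wrapped around by time $t$.

```python
# Reusing simulate_contact_process from the sketch above.
# On surviving runs, N_t / t should stabilise near a constant c.
for seed in range(3):
    times, counts = simulate_contact_process(n=500, lam=2.0, t_max=100.0, seed=seed)
    if counts[-1] > 0:                    # crude check for survival
        print(f"seed {seed}: N_t/t at t={times[-1]:.1f} is {counts[-1]/times[-1]:.2f}")
    else:
        print(f"seed {seed}: process died out")
```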
Contact processes on all integer lattices die out almost surely at the critical value. [1]
Durrett conjectured in survey papers and lecture notes during the 1980s and early 1990s a central limit theorem for the Harris contact process, viz. that, if the process survives, then for all large times $t$, $N_t$ equals $ct$ and the error equals $\sigma\sqrt{t}$ multiplied by a (random) error distributed according to a standard Gaussian distribution. [2] [3] [4]
Durrett's conjecture turned out to be correct for a different value of $\sigma$, as proved in 2018. [5]
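The conjectured Gaussian fluctuations can be probed numerically, again reusing the simulate_contact_process sketch above (an assumption of this illustration): on surviving runs, the standardized particle counts at a fixed large time should have skewness and excess kurtosis near zero.

```python
import numpy as np

# Reusing simulate_contact_process from the sketch above: on surviving runs,
# the fluctuations of N_t around c*t should look approximately Gaussian.
t_star, samples = 60.0, []
for seed in range(200):
    times, counts = simulate_contact_process(n=400, lam=2.0, t_max=t_star, seed=seed)
    if counts[-1] > 0:                    # keep surviving runs only
        samples.append(counts[-1])
samples = np.array(samples)
z = (samples - samples.mean()) / samples.std()
print("skewness:", np.mean(z**3), " excess kurtosis:", np.mean(z**4) - 3)
```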
In probability theory and statistics, the exponential distribution or negative exponential distribution is the probability distribution of the distance between events in a Poisson point process, i.e., a process in which events occur continuously and independently at a constant average rate; the distance parameter could be any meaningful mono-dimensional measure of the process, such as time between production errors, or length along a roll of fabric in the weaving manufacturing process. It is a particular case of the gamma distribution. It is the continuous analogue of the geometric distribution, and it has the key property of being memoryless. In addition to being used for the analysis of Poisson point processes it is found in various other contexts.
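A short empirical check of the memoryless property, $P(X > s + t \mid X > s) = P(X > t)$; the sample size and parameter values below are arbitrary.

```python
import numpy as np

# Empirical check of memorylessness: P(X > s + t | X > s) = P(X > t).
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)
s, t = 1.5, 0.7
lhs = np.mean(x[x > s] > s + t)          # P(X > s+t | X > s)
rhs = np.mean(x > t)                     # P(X > t)
print(f"conditional: {lhs:.4f}  unconditional: {rhs:.4f}")
```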
In physics, engineering and mathematics, the Fourier transform (FT) is an integral transform that takes a function as input and outputs another function that describes the extent to which various frequencies are present in the original function. The output of the transform is a complex-valued function of frequency. The term Fourier transform refers to both this complex-valued function and the mathematical operation. When a distinction needs to be made the Fourier transform is sometimes called the frequency domain representation of the original function. The Fourier transform is analogous to decomposing the sound of a musical chord into the intensities of its constituent pitches.
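As a small numerical illustration of this decomposition (the signal and frequencies are chosen arbitrarily), the discrete Fourier transform of a two-tone signal recovers the two pitches that are present.

```python
import numpy as np

# A signal containing 5 Hz and 12 Hz components; the discrete Fourier
# transform (here via the FFT) recovers the frequencies that are present.
fs, duration = 100.0, 4.0                      # sample rate (Hz), seconds
t = np.arange(0, duration, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]       # two strongest frequencies
print("dominant frequencies (Hz):", sorted(peaks))
```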
In probability theory and statistics, the Weibull distribution is a continuous probability distribution. It models a broad range of random variables, largely in the nature of a time to failure or time between events. Examples are maximum one-day rainfalls and the time a user spends on a web page.
In mathematics, real trees are a class of metric spaces generalising simplicial trees. They arise naturally in many mathematical contexts, in particular geometric group theory and probability theory. They are also the simplest examples of Gromov hyperbolic spaces.
In probability theory, a compound Poisson distribution is the probability distribution of the sum of a number of independent identically-distributed random variables, where the number of terms to be added is itself a Poisson-distributed variable. The result can be either a continuous or a discrete distribution.
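A direct simulation sketch (parameter values arbitrary): draw a Poisson number of terms, sum that many i.i.d. variables, and compare the empirical mean with $E[S] = \lambda\, E[Y]$, which follows from Wald's identity.

```python
import numpy as np

# Compound Poisson sample: S = Y_1 + ... + Y_N with N ~ Poisson(lam) and
# Y_i i.i.d. (here exponential with mean mu).  E[S] = lam * mu.
rng = np.random.default_rng(0)
lam, mu, trials = 3.0, 2.0, 100_000
n_terms = rng.poisson(lam, size=trials)                  # number of summands
totals = np.array([rng.exponential(mu, size=n).sum() for n in n_terms])
print("empirical mean:", totals.mean(), " theoretical mean:", lam * mu)
```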
In system analysis, among other fields of study, a linear time-invariant (LTI) system is a system that produces an output signal from any input signal subject to the constraints of linearity and time-invariance; these terms are briefly defined below. These properties apply (exactly or approximately) to many important physical systems, in which case the response y(t) of the system to an arbitrary input x(t) can be found directly using convolution: y(t) = (x ∗ h)(t) where h(t) is called the system's impulse response and ∗ represents convolution (not to be confused with multiplication). What's more, there are systematic methods for solving any such system (determining h(t)), whereas systems not meeting both properties are generally more difficult (or impossible) to solve analytically. A good example of an LTI system is any electrical circuit consisting of resistors, capacitors, inductors and linear amplifiers.
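A minimal discretized example of the convolution formula, assuming the impulse response $h(t) = (1/RC)\,e^{-t/RC}$ of an RC low-pass filter; the time step and component values are arbitrary.

```python
import numpy as np

# Discretised RC low-pass filter: impulse response h(t) = (1/RC) e^{-t/RC}.
# For any input x, the output is the convolution y = x * h.
dt, rc = 0.01, 0.1
t = np.arange(0, 1, dt)
h = (1 / rc) * np.exp(-t / rc) * dt            # impulse response (discretised)
x = (t > 0.2).astype(float)                    # unit step applied at t = 0.2
y = np.convolve(x, h)[: len(t)]                # system response via convolution
print("step response at end (should approach 1):", y[-1].round(3))
```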
In statistics and probability theory, a point process or point field is a collection of mathematical points randomly located on a mathematical space such as the real line or Euclidean space. Point processes can be used for spatial data analysis, which is of interest in such diverse disciplines as forestry, plant ecology, epidemiology, geography, seismology, materials science, astronomy, telecommunications, computational neuroscience, economics and others.
In queueing theory, a discipline within the mathematical theory of probability, a Jackson network is a class of queueing network where the equilibrium distribution is particularly simple to compute, as the network has a product-form solution. It was the first significant development in the theory of networks of queues, and generalising and applying the ideas of the theorem to search for similar product-form solutions in other networks has been the subject of much research, including ideas used in the development of the Internet. The networks were first identified by James R. Jackson, and his paper was re-printed in the journal Management Science's 'Ten Most Influential Titles of Management Science's First Fifty Years.'
In probability theory, the inverse Gaussian distribution is a two-parameter family of continuous probability distributions with support on (0,∞).
In statistics, the generalized Pareto distribution (GPD) is a family of continuous probability distributions. It is often used to model the tails of another distribution. It is specified by three parameters: location $\mu$, scale $\sigma$, and shape $\xi$. Sometimes it is specified by only scale and shape, and sometimes only by its shape parameter. Some references give the shape parameter as $\kappa = -\xi$.
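As a sketch of these parameters in practice, SciPy's genpareto distribution uses this (shape, location, scale) parametrization; the sample size and parameter values below are arbitrary.

```python
import numpy as np
from scipy.stats import genpareto

# SciPy parametrises the GPD by shape c (xi), loc (mu) and scale (sigma).
rng = np.random.default_rng(0)
xi, mu, sigma = 0.3, 0.0, 1.0
sample = genpareto.rvs(xi, loc=mu, scale=sigma, size=50_000, random_state=rng)
xi_hat, mu_hat, sigma_hat = genpareto.fit(sample, floc=0.0)  # fix location at 0
print(f"fitted shape {xi_hat:.3f}, scale {sigma_hat:.3f}")
```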
In the mathematical field of graph theory, the Erdős–Rényi model refers to one of two closely related models for generating random graphs or the evolution of a random network. These models are named after Hungarian mathematicians Paul Erdős and Alfréd Rényi, who introduced one of the models in 1959. Edgar Gilbert introduced the other model contemporaneously with and independently of Erdős and Rényi. In the model of Erdős and Rényi, all graphs on a fixed vertex set with a fixed number of edges are equally likely. In the model introduced by Gilbert, also called the Erdős–Rényi–Gilbert model, each edge has a fixed probability of being present or absent, independently of the other edges. These models can be used in the probabilistic method to prove the existence of graphs satisfying various properties, or to provide a rigorous definition of what it means for a property to hold for almost all graphs.
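For instance, with the NetworkX library one can draw a graph from the $G(n, p)$ model (the Gilbert variant) and inspect its properties empirically; the values of $n$ and $p$ below are chosen arbitrarily.

```python
import networkx as nx

# G(n, p): each of the n*(n-1)/2 possible edges is present independently
# with probability p (the Gilbert variant of the Erdos-Renyi model).
n, p = 1000, 0.01
G = nx.gnp_random_graph(n, p, seed=42)
print("edges:", G.number_of_edges(),
      "(expected about", int(p * n * (n - 1) / 2), ")")
print("connected:", nx.is_connected(G))
```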
A superprocess, in mathematical probability theory, is a stochastic process that is usually constructed as a special limit of near-critical branching diffusions.
In actuarial science and applied probability, ruin theory uses mathematical models to describe an insurer's vulnerability to insolvency/ruin. In such models key quantities of interest are the probability of ruin, distribution of surplus immediately prior to ruin and deficit at time of ruin.
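A minimal Monte Carlo sketch of the ruin probability in the classical Cramér–Lundberg model (all parameter values are illustrative): the surplus is the initial capital plus premium income minus compound-Poisson claims, and ruin can only occur at claim instants.

```python
import numpy as np

# Monte Carlo sketch of ruin in the classical Cramer-Lundberg model:
# surplus = initial capital + premiums at rate c - compound-Poisson claims.
rng = np.random.default_rng(0)
u, c, lam, mean_claim = 10.0, 1.2, 1.0, 1.0    # loading: c > lam * mean_claim
horizon, trials, ruins = 200.0, 2000, 0
for _ in range(trials):
    t, surplus = 0.0, u
    while t < horizon:
        wait = rng.exponential(1 / lam)        # time until the next claim
        t += wait
        surplus += c * wait                    # premiums collected meanwhile
        surplus -= rng.exponential(mean_claim) # pay the claim
        if surplus < 0:                        # ruin can only happen here
            ruins += 1
            break
print("estimated ruin probability within the horizon:", ruins / trials)
```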
In mathematics, the spectral theory of ordinary differential equations is the part of spectral theory concerned with the determination of the spectrum and eigenfunction expansion associated with a linear ordinary differential equation. In his dissertation, Hermann Weyl generalized the classical Sturm–Liouville theory on a finite closed interval to second order differential operators with singularities at the endpoints of the interval, possibly semi-infinite or infinite. Unlike the classical case, the spectrum may no longer consist of just a countable set of eigenvalues, but may also contain a continuous part. In this case the eigenfunction expansion involves an integral over the continuous part with respect to a spectral measure, given by the Titchmarsh–Kodaira formula. The theory was put in its final simplified form for singular differential equations of even degree by Kodaira and others, using von Neumann's spectral theorem. It has had important applications in quantum mechanics, operator theory and harmonic analysis on semisimple Lie groups.
Harry Kesten was a Jewish American mathematician best known for his work in probability, most notably on random walks on groups and graphs, random matrices, branching processes, and percolation theory.
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event. It can also be used for the number of events in other types of intervals than time, and in dimension greater than 1.
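A small consistency check of this description (parameter values arbitrary): counting, per trial, how many simulated exponential inter-event gaps fit into a unit interval reproduces the Poisson mean and variance.

```python
import numpy as np

# Poisson(lam) counts events in a unit interval at constant mean rate lam;
# here checked against event counts built from exponential inter-event gaps.
rng = np.random.default_rng(0)
lam, trials = 4.0, 100_000
gaps = rng.exponential(1 / lam, size=(trials, 30))       # inter-event times
counts = (np.cumsum(gaps, axis=1) < 1.0).sum(axis=1)     # events before t=1
print("empirical mean/var:", counts.mean(), counts.var())
print("Poisson mean/var:  ", lam, lam)
```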
In the mathematical theory of random matrices, the Marchenko–Pastur distribution, or Marchenko–Pastur law, describes the asymptotic behavior of singular values of large rectangular random matrices. The theorem is named after Soviet mathematicians Volodymyr Marchenko and Leonid Pastur, who proved this result in 1967.
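An empirical sketch of the law (matrix dimensions arbitrary): the scaled squared singular values of a large Gaussian matrix should fill the Marchenko–Pastur support $[(1-\sqrt{r})^2, (1+\sqrt{r})^2]$, where $r$ is the ratio of the matrix dimensions.

```python
import numpy as np

# Squared singular values of an n x m matrix with i.i.d. N(0,1) entries,
# scaled by 1/m, should fill the Marchenko-Pastur support
# [(1 - sqrt(r))^2, (1 + sqrt(r))^2] with r = n/m, for large dimensions.
rng = np.random.default_rng(0)
n, m = 500, 2000
X = rng.standard_normal((n, m))
svals = np.linalg.svd(X, compute_uv=False)
eigs = svals**2 / m
r = n / m
print("empirical range: [%.3f, %.3f]" % (eigs.min(), eigs.max()))
print("MP support:      [%.3f, %.3f]" % ((1 - r**0.5) ** 2, (1 + r**0.5) ** 2))
```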
In the mathematical theory of probability, the voter model is an interacting particle system introduced by Richard A. Holley and Thomas M. Liggett in 1975.
In probability theory, an interacting particle system (IPS) is a stochastic process on some configuration space given by a site space, a countably infinite graph, and a local state space, a compact metric space. More precisely, IPS are continuous-time Markov jump processes describing the collective behavior of stochastically interacting components. IPS are the continuous-time analogue of stochastic cellular automata.
Stochastic chains with memory of variable length are a family of stochastic chains of finite order in a finite alphabet such that, at each instant of time, only a finite suffix of the past, called the context, is necessary to predict the next symbol. These models were introduced in the information theory literature by Jorma Rissanen in 1983 as a universal tool for data compression, but they have recently been used to model data in different areas such as biology, linguistics and music.
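A minimal sketch of such a chain over the alphabet {0, 1} (the context tree and its probabilities below are invented for illustration): every past matches exactly one context, namely its shortest suffix present in the tree, and that context alone determines the law of the next symbol.

```python
import random

# An illustrative context tree over {0, 1}.  Every possible past matches
# exactly one context: pasts ending in 0 use context "0"; pasts ending in 1
# are split by the preceding symbol into contexts "01" and "11".
contexts = {"0": 0.9, "01": 0.6, "11": 0.1}    # P(next symbol = 1 | context)

def next_symbol(past, rng):
    for k in (1, 2):                            # try suffixes, shortest first
        suffix = past[-k:]
        if suffix in contexts:
            return 1 if rng.random() < contexts[suffix] else 0
    raise ValueError("past matches no context")

rng = random.Random(0)
seq = "01"                                      # arbitrary initial past
for _ in range(20):
    seq += str(next_symbol(seq, rng))
print(seq)
```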