In probability and statistics, a point process operation or point process transformation is a type of mathematical operation performed on a random object known as a point process, which is often used as a mathematical model of phenomena that can be represented as points randomly located in space. These operations can be purely random, deterministic or both, and are used to construct new point processes, which can then also be used as mathematical models. The operations may include removing or thinning points from a point process, combining or superimposing multiple point processes into one point process, or transforming the underlying space of the point process into another space. Point process operations and the resulting point processes are used in the theory of point processes and related fields such as stochastic geometry and spatial statistics. [1]
One point process that gives particularly convenient results under random point process operations is the Poisson point process. [2] The Poisson point process often exhibits a type of mathematical closure: when a point process operation is applied to a Poisson point process then, provided the operation satisfies certain conditions, the resulting process will often be another Poisson point process, hence the Poisson point process is frequently used as a mathematical model. [2] [1]
Point process operations have been studied in the mathematical limit as the number of applied operations approaches infinity. This has led to convergence theorems for point process operations, which have their origins in the pioneering work of Conny Palm in the 1940s and later Aleksandr Khinchin in the 1950s and 1960s, who both studied point processes on the real line in the context of the arrival of phone calls and queueing theory in general. [3] Provided that the original point process and the point process operation meet certain mathematical conditions, then as the operations are repeatedly applied, the resulting point process will often behave stochastically more like a Poisson point process, provided it has a non-random mean measure, which gives the average number of points of the point process located in some region. In other words, in the limit as the number of applied operations approaches infinity, the point process will converge in distribution (or weakly) to a Poisson point process or, if its mean measure is a random measure, to a Cox point process. [4] Convergence results, such as the Palm–Khinchin theorem for renewal processes, are then also used to justify the use of the Poisson point process as a mathematical model of various phenomena.
Point processes are mathematical objects that can be used to represent collections of points randomly scattered on some underlying mathematical space. They have a number of interpretations, which is reflected by the various types of point process notation. [1] [5] For example, if a point $x$ belongs to or is a member of a point process, denoted by $N$, then this can be written as: [1]

$$x \in N,$$

and represents the point process as a random set. Alternatively, the number of points of $N$ located in some Borel set $B$ is often written as: [1] [6] [7]

$$N(B),$$

which reflects a random measure interpretation for point processes.
A point process needs to be defined on an underlying mathematical space. Often this space is $d$-dimensional Euclidean space denoted here by $\textbf{R}^d$, although point processes can be defined on more abstract mathematical spaces. [4]
To develop suitable models with point processes in stochastic geometry, spatial statistics and related fields, there are a number of useful transformations that can be performed on point processes, including: thinning, superposition, mapping (or transformation of space), clustering, and random displacement. [2] [1] [7] [8]
The thinning operation entails using some predefined rule to remove points from a point process $N$ to form a new point process $N_p$. These thinning rules may be deterministic, that is, not random, which is the case for one of the simplest rules known as $p$-thinning: [1] each point of $N$ is independently removed (or kept) with some probability $p$ (or $1-p$). This rule may be generalized by introducing a non-negative function $p(x)\leq 1$ in order to define the location-dependent $p(x)$-thinning, where now the probability of a point being removed is $p(x)$ and depends on where the point of $N$ is located on the underlying space. A further generalization is to have the thinning probability itself be random.
These three operations are all types of independent thinning, which means the interaction between points has no effect on whether a point is removed (or kept). Another generalization involves dependent thinning, where points of the point process are removed (or kept) depending on their location in relation to other points of the point process. Thinning can be used to create new point processes such as hard-core processes, in which points do not exist (due to thinning) within a certain radius of each point in the thinned point process. [1]
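As a rough illustration of independent thinning, the following sketch (assuming Python with NumPy; the intensity and retention rule are illustrative choices, not taken from the cited sources) simulates a homogeneous Poisson point process on the unit square and applies a location-dependent thinning in which a point at $(x, y)$ is kept with probability $x$.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Simulate a homogeneous Poisson point process on the unit square [0,1]^2
# with intensity lambda_0: the number of points is Poisson distributed and
# the points are placed uniformly at random.
lambda_0 = 200
num_points = rng.poisson(lambda_0)
points = rng.uniform(0, 1, size=(num_points, 2))

# Location-dependent thinning: a point at (x, y) is kept independently with
# retention probability x, so points towards the right edge survive more often.
def retention_probability(xy):
    return xy[:, 0]

keep = rng.uniform(0, 1, size=num_points) < retention_probability(points)
thinned_points = points[keep]

# For a Poisson point process the retained points again form a Poisson point
# process whose intensity is lambda_0 times the retention probability; here
# the expected number of retained points is lambda_0 / 2.
print("original points:", num_points, "retained points:", len(thinned_points))
```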
The superposition operation is used to combine two or more point processes together onto one underlying mathematical space or state space. If there is a countable set or collection of point processes $N_1, N_2, \dots$ with mean measures $\Lambda_1, \Lambda_2, \dots$, then their superposition

$$N = \bigcup_{i=1}^{\infty} N_i$$

also forms a point process. In this expression the superposition operation is denoted by a set union, which implies the random set interpretation of point processes; see Point process notation for more information.
In the case where each $N_i$ is a Poisson point process, the resulting process $N$ is also a Poisson point process with mean intensity

$$\Lambda = \sum_{i=1}^{\infty} \Lambda_i.$$
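A minimal sketch of superposition (again assuming NumPy; the intensities 50 and 150 are arbitrary illustrative values) combines two independent homogeneous Poisson point processes on the unit square by taking the union of their points, giving a process whose intensity is the sum of the two.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

def simulate_poisson_process(intensity, rng):
    """Homogeneous Poisson point process on the unit square [0,1]^2."""
    n = rng.poisson(intensity)
    return rng.uniform(0, 1, size=(n, 2))

# Two independent Poisson point processes with different intensities.
points_a = simulate_poisson_process(50, rng)
points_b = simulate_poisson_process(150, rng)

# Superposition: take the union of the two point patterns. For independent
# Poisson processes the result is Poisson with intensity 50 + 150 = 200.
superposed = np.vstack([points_a, points_b])
print("points in superposition:", len(superposed))
```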
The point operation known as clustering entails replacing every point $x$ in a given point process $N$ with a cluster of points $N^x$. Each cluster is also a point process, but with a finite number of points. The union of all the clusters forms a cluster point process

$$N_c = \bigcup_{x \in N} N^x.$$

Often it is assumed that the clusters are all sets of finite points, with each cluster being independent and identically distributed. Furthermore, if the original point process $N$ has a constant intensity $\lambda$, then the intensity of the cluster point process $N_c$ will be

$$\lambda_c = c\lambda,$$

where the constant $c$ is the mean number of points in each cluster $N^x$.
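The sketch below (NumPy; the parent intensity, mean cluster size and cluster radius are hypothetical choices, in the spirit of a Matérn-type cluster process) replaces each point of a Poisson parent process with an independently generated finite cluster, so the resulting intensity is approximately the parent intensity multiplied by the mean cluster size.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Parent process: homogeneous Poisson point process with intensity lambda_p.
lambda_p = 30
parents = rng.uniform(0, 1, size=(rng.poisson(lambda_p), 2))

# Each parent x is replaced by a cluster N^x: a Poisson number of daughter
# points (mean c) scattered uniformly in a small disc around x.
mean_cluster_size = 5      # c
cluster_radius = 0.03

clusters = []
for parent in parents:
    n_daughters = rng.poisson(mean_cluster_size)
    radii = cluster_radius * np.sqrt(rng.uniform(0, 1, n_daughters))
    angles = rng.uniform(0, 2 * np.pi, n_daughters)
    offsets = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
    clusters.append(parent + offsets)

# The cluster point process is the union of all clusters; its intensity is
# approximately lambda_p * c (ignoring boundary effects).
cluster_process = np.vstack(clusters) if clusters else np.empty((0, 2))
print("expected points:", lambda_p * mean_cluster_size,
      "simulated points:", len(cluster_process))
```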
A mathematical model may require randomly moving points of a point process from some locations to other locations on the underlying mathematical space. [2] This point process operation is referred to as random displacement [2] or translation. [4] If each point in the process is displaced or translated independently of all other points in the process, then the operation forms an independent displacement or translation. [4] It is usually assumed that all the random translations have a common probability distribution; hence the displacements form a set of independent and identically distributed random vectors in the underlying mathematical space.
Random displacements or translations applied to point processes may be used to model the mobility of objects in, for example, ecology [2] or wireless networks. [5]
The result known as the Displacement theorem [2] effectively says that the random independent displacement of points of a Poisson point process (on the same underlying space) forms another Poisson point process.
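As an illustration of independent displacement (NumPy; the Gaussian displacement distribution is an arbitrary choice of common distribution), each point of a simulated Poisson point process is translated by its own independent random vector, and by the displacement theorem the displaced pattern is again a Poisson point process.

```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Homogeneous Poisson point process on the unit square.
points = rng.uniform(0, 1, size=(rng.poisson(100), 2))

# Independent displacement: each point receives its own i.i.d. random
# translation vector, here drawn from a common Gaussian distribution.
displacements = rng.normal(loc=0.0, scale=0.05, size=points.shape)
displaced_points = points + displacements

# By the displacement theorem, the displaced pattern is again a Poisson
# point process (on the plane, with a smoothed intensity measure).
print("number of points is unchanged:", len(displaced_points) == len(points))
```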
Another property that is considered useful is the ability to map a point process from one underlying space to another space. For example, a point process defined on the plane $\textbf{R}^2$ can be transformed from Cartesian coordinates to polar coordinates. [2]
Provided that the mapping (or transformation) adheres to some conditions, then a result sometimes known as the Mapping theorem [2] says that if the original process is a Poisson point process with some intensity measure, then the resulting mapped (or transformed) collection of points also forms a Poisson point process with another intensity measure.
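A short sketch of the mapping operation (NumPy; the disc domain and intensity are illustrative) transforms a simulated Poisson point process on the unit disc from Cartesian to polar coordinates; under the mapping theorem the transformed points form a Poisson point process with the image intensity measure.

```python
import numpy as np

rng = np.random.default_rng(seed=5)

# Homogeneous Poisson point process on the disc of radius 1, obtained by
# restricting a Poisson process on the bounding square to the disc.
points = rng.uniform(-1, 1, size=(rng.poisson(400), 2))
points = points[points[:, 0] ** 2 + points[:, 1] ** 2 <= 1]

# Mapping (transformation of the underlying space): Cartesian -> polar.
radii = np.hypot(points[:, 0], points[:, 1])
angles = np.arctan2(points[:, 1], points[:, 0])
polar_points = np.column_stack([radii, angles])

# The mapped points form a Poisson point process on [0,1] x (-pi, pi] whose
# intensity measure is the image of the original intensity measure.
print("points on the disc:", len(polar_points))
```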
A point process operation performed once on some point process can, in general, be performed again and again. In the theory of point processes, results have been derived to study the behaviour of the resulting point process, via convergence results, in the limit as the number of performed operations approaches infinity. [4] For example, if each point in a general point process is repeatedly displaced in a certain random and independent manner, then the new point process, informally speaking, will more and more resemble a Poisson point process. Similar convergence results have been developed for the operations of thinning and superposition (with suitable rescaling of the underlying space). [4]
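As a rough numerical illustration of this convergence (a hypothetical experiment using NumPy, with all parameters chosen for demonstration), the sketch below starts from a deliberately non-Poisson pattern, a regular grid on the unit torus, and repeatedly applies independent random displacements; the variance-to-mean ratio of the counts in a fixed test region, which equals 1 for a Poisson point process, moves towards 1 as more displacement operations are applied.

```python
import numpy as np

rng = np.random.default_rng(seed=6)

def repeatedly_displace(points, n_steps, scale, rng):
    """Apply n_steps rounds of i.i.d. Gaussian displacement on the unit torus."""
    for _ in range(n_steps):
        points = (points + rng.normal(0, scale, size=points.shape)) % 1.0
    return points

# A deliberately non-Poisson starting pattern: a 20 x 20 regular grid.
grid = np.stack(np.meshgrid(np.linspace(0, 0.95, 20),
                            np.linspace(0, 0.95, 20)), axis=-1).reshape(-1, 2)

def counts_in_test_region(n_steps, n_reps=2000):
    """Count points falling in the square [0.4, 0.6)^2 over many replicates."""
    counts = []
    for _ in range(n_reps):
        pts = repeatedly_displace(grid.copy(), n_steps, scale=0.02, rng=rng)
        in_region = np.all((pts >= 0.4) & (pts < 0.6), axis=1)
        counts.append(in_region.sum())
    return np.array(counts)

# For a Poisson point process the variance of the count equals its mean; the
# ratio increases towards 1 as more displacements are applied (on this finite
# torus the limit sits slightly below 1 because the total point count is fixed).
for steps in (1, 5, 50):
    c = counts_in_test_region(steps)
    print(f"{steps:3d} displacements: mean={c.mean():.1f}, var/mean={c.var() / c.mean():.2f}")
```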
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events.
In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a family of random variables in a probability space, where the index of the family often has the interpretation of time. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. Examples include the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule. Stochastic processes have applications in many disciplines such as biology, chemistry, ecology, neuroscience, physics, image processing, signal processing, control theory, information theory, computer science, and telecommunications. Furthermore, seemingly random changes in financial markets have motivated the extensive use of stochastic processes in finance.
In probability theory, a compound Poisson distribution is the probability distribution of the sum of a number of independent identically-distributed random variables, where the number of terms to be added is itself a Poisson-distributed variable. The result can be either a continuous or a discrete distribution.
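A brief sketch of this construction (NumPy; the Poisson rate and the exponential term distribution are illustrative) draws compound Poisson samples by summing a Poisson number of independent and identically distributed terms.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def compound_poisson_sample(rate, draw_term, size, rng):
    """Sample sums of a Poisson(rate) number of i.i.d. terms."""
    counts = rng.poisson(rate, size=size)
    return np.array([draw_term(rng, n).sum() for n in counts])

# Example: a Poisson(3) number of exponential terms with mean 2 per sample.
samples = compound_poisson_sample(
    rate=3.0,
    draw_term=lambda rng, n: rng.exponential(scale=2.0, size=n),
    size=100_000,
    rng=rng,
)

# The mean of the compound Poisson sum is rate * E[term] = 3 * 2 = 6.
print("sample mean:", samples.mean())
```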
In statistics and probability theory, a point process or point field is a set of a random number of mathematical points randomly located on a mathematical space such as the real line or Euclidean space.
Also known as the (Moran-)Gamma Process, the gamma process is a random process studied in mathematics, statistics, probability theory, and stochastics. The gamma process is a stochastic or random process consisting of independently distributed gamma distributions where $N(t)$ represents the number of event occurrences from time 0 to time $t$. The gamma distribution has shape parameter $\alpha$ and rate parameter $\lambda$, often written as $\Gamma(\alpha, \lambda)$. Both $\alpha$ and $\lambda$ must be greater than 0. The gamma process is often written as $\Gamma(t; \gamma, \lambda)$ where $t$ represents the time from 0. The process is a pure-jump increasing Lévy process with intensity measure $\nu(x) = \gamma x^{-1} e^{-\lambda x}$ for all positive $x$. Thus jumps whose size lies in the interval $[x, x + dx)$ occur as a Poisson process with intensity $\nu(x)\,dx$. The parameter $\gamma$ controls the rate of jump arrivals and the scaling parameter $\lambda$ inversely controls the jump size. It is assumed that the process starts from a value 0 at $t = 0$, meaning $\Gamma(0) = 0$.
In actuarial science and applied probability, ruin theory uses mathematical models to describe an insurer's vulnerability to insolvency/ruin. In such models key quantities of interest are the probability of ruin, distribution of surplus immediately prior to ruin and deficit at time of ruin.
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event. It can also be used for the number of events in other types of intervals than time, and in dimension greater than 1.
In probability theory, the Palm–Khintchine theorem, the work of Conny Palm and Aleksandr Khinchin, expresses that a large number of renewal processes, not necessarily Poissonian, when combined ("superimposed") will have Poissonian properties.
In probability theory, a Laplace functional refers to one of two possible mathematical functions of functions or, more precisely, functionals that serve as mathematical tools for studying either point processes or concentration of measure properties of metric spaces. One type of Laplace functional, also known as a characteristic functional is defined in relation to a point process, which can be interpreted as random counting measures, and has applications in characterizing and deriving results on point processes. Its definition is analogous to a characteristic function for a random variable.
In queueing theory, a discipline within the mathematical theory of probability, an M/G/1 queue is a queue model where arrivals are Markovian, service times have a General distribution and there is a single server. The model name is written in Kendall's notation, and is an extension of the M/M/1 queue, where service times must be exponentially distributed. The classic application of the M/G/1 queue is to model performance of a fixed head hard disk.
In probability theory and statistics, Campbell's theorem or the Campbell–Hardy theorem is either a particular equation or a set of results relating the expectation of a function summed over a point process to an integral involving the mean measure of the point process, which allows for the calculation of the expected value and variance of the random sum. One version of the theorem, also known as Campbell's formula, entails an integral equation for the aforementioned sum over a general point process, and not necessarily a Poisson point process. There also exist equations involving moment measures and factorial moment measures that are considered versions of Campbell's formula. All these results are employed in probability and statistics with a particular importance in the theory of point processes and queueing theory as well as the related fields of stochastic geometry, continuum percolation theory, and spatial statistics.
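As a hedged numerical check of the first-moment version of Campbell's formula, $E[\sum_{x \in N} f(x)] = \int f(x)\,\Lambda(dx)$, the sketch below (NumPy; the homogeneous Poisson process and the test function are illustrative choices) compares a Monte Carlo estimate of the random sum's expectation with the corresponding integral against the mean measure.

```python
import numpy as np

rng = np.random.default_rng(seed=8)

# Homogeneous Poisson point process on [0,1]^2 with intensity lambda_0,
# so the mean measure is Lambda(dx) = lambda_0 dx.
lambda_0 = 100.0
f = lambda pts: pts[:, 0] ** 2 + pts[:, 1]   # illustrative test function

# Monte Carlo estimate of E[ sum_{x in N} f(x) ].
totals = []
for _ in range(5000):
    pts = rng.uniform(0, 1, size=(rng.poisson(lambda_0), 2))
    totals.append(f(pts).sum() if len(pts) else 0.0)

# Campbell's formula: the expectation equals
# lambda_0 * integral of (x^2 + y) over the unit square = lambda_0 * (1/3 + 1/2).
print("simulated:", np.mean(totals), "formula:", lambda_0 * (1 / 3 + 1 / 2))
```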
In probability theory, an interacting particle system (IPS) is a stochastic process on some configuration space given by a site space, a countably-infinite-order graph $G$, and a local state space, a compact metric space $S$. More precisely, IPS are continuous-time Markov jump processes describing the collective behavior of stochastically interacting components. IPS are the continuous-time analogue of stochastic cellular automata.
In mathematics and probability theory, continuum percolation theory is a branch of mathematics that extends discrete percolation theory to continuous space. More specifically, the underlying points of discrete percolation form types of lattices, whereas the underlying points of continuum percolation are often randomly positioned in some continuous space and form a type of point process. For each point, a random shape is frequently placed on it and the shapes overlap with each other to form clumps or components. As in discrete percolation, a common research focus of continuum percolation is studying the conditions of occurrence for infinite or giant components. Other shared concepts and analysis techniques exist in these two types of percolation theory as well as the study of random graphs and random geometric graphs.
In mathematics and telecommunications, stochastic geometry models of wireless networks refer to mathematical models based on stochastic geometry that are designed to represent aspects of wireless networks. The related research consists of analyzing these models with the aim of better understanding wireless communication networks in order to predict and control various network performance metrics. The models require using techniques from stochastic geometry and related fields including point processes, spatial statistics, geometric probability, percolation theory, as well as methods from more general mathematical disciplines such as geometry, probability theory, stochastic processes, queueing theory, information theory, and Fourier analysis.
In probability and statistics, point process notation comprises the range of mathematical notation used to symbolically represent random objects known as point processes, which are used in related fields such as stochastic geometry, spatial statistics and continuum percolation theory and frequently serve as mathematical models of random phenomena, representable as points, in time, space or both.
In probability and statistics, a moment measure is a mathematical quantity, function or, more precisely, measure that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. Moment measures generalize the idea of (raw) moments of random variables, hence arise often in the study of point processes and related fields.
In probability and statistics, a factorial moment measure is a mathematical quantity, function or, more precisely, measure that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. Factorial moment measures generalize the idea of factorial moments, which are useful for studying non-negative integer-valued random variables.
In probability and statistics, a spherical contact distribution function, first contact distribution function, or empty space function is a mathematical function that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. More specifically, a spherical contact distribution function is defined as the probability distribution of the radius of a sphere when it first encounters or makes contact with a point in a point process. This function can be contrasted with the nearest neighbour function, which is defined in relation to some point in the point process as being the probability distribution of the distance from that point to its nearest neighbouring point in the same point process.
In probability theory, statistics and related fields, a Poisson point process is a type of mathematical object that consists of points randomly located on a mathematical space with the essential feature that the points occur independently of one another. The process's name derives from the fact that the number of points in any given finite region follows a Poisson distribution. The process and the distribution are named after French mathematician Siméon Denis Poisson. The process itself was discovered independently and repeatedly in several settings, including experiments on radioactive decay, telephone call arrivals and actuarial science.
In probability and statistics, a nearest neighbor function, nearest neighbor distance distribution, nearest-neighbor distribution function or nearest neighbor distribution is a mathematical function that is defined in relation to mathematical objects known as point processes, which are often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. More specifically, nearest neighbor functions are defined with respect to some point in the point process as being the probability distribution of the distance from this point to its nearest neighboring point in the same point process, hence they are used to describe the probability of another point existing within some distance of a point. A nearest neighbor function can be contrasted with a spherical contact distribution function, which is not defined in reference to some initial point but rather as the probability distribution of the radius of a sphere when it first encounters or makes contact with a point of a point process.