In probability and statistics, point process notation comprises the range of mathematical notation used to symbolically represent random objects known as point processes, which are used in related fields such as stochastic geometry, spatial statistics and continuum percolation theory and frequently serve as mathematical models of random phenomena, representable as points, in time, space or both.
The notation varies due to the histories of certain mathematical fields and the different interpretations of point processes, [1] [2] [3] and borrows notation from mathematical areas of study such as measure theory and set theory. [1]
The notation, as well as the terminology, of point processes depends on their setting and interpretation as mathematical objects which under certain assumptions can be interpreted as random sequences of points, random sets of points or random counting measures. [1]
In some mathematical frameworks, a given point process may be considered as a sequence of points with each point randomly positioned in d-dimensional Euclidean space $\mathbf{R}^d$, [1] as well as some other more abstract mathematical spaces. In general, whether or not a random sequence is equivalent to the other interpretations of a point process depends on the underlying mathematical space, but this holds true for the setting of finite-dimensional Euclidean space $\mathbf{R}^d$. [4]
A point process is called simple if no two (or more) points coincide in location with probability one. Given that point processes are often simple and the order of the points does not matter, a collection of random points can be considered as a random set of points. [1] [5] The theory of random sets was independently developed by David Kendall and Georges Matheron. When considered as a random set, a sequence of random points is a random closed set if the sequence has no accumulation points with probability one. [6]
A point process is often denoted by a single letter, [1] [7] [8] for example $N$, and if the point process is considered as a random set, then the corresponding notation: [1]

$$x \in N$$

is used to denote that a random point $x$ is an element of (or belongs to) the point process $N$. The theory of random sets can be applied to point processes owing to this interpretation, which alongside the random sequence interpretation has resulted in a point process being written as:

$$N = \{x_1, x_2, \dots\}$$

which highlights its interpretation as either a random sequence or random closed set of points. [1] Furthermore, sometimes an uppercase letter denotes the point process, while a lowercase denotes a point from the process, so, for example, the point $x$ (or $x_i$) belongs to or is a point of the point process $N$, or with set notation, $x \in N$. [8]
To denote the number of points of $N$ located in some Borel set $B$, it is sometimes written [7]

$$\Phi(B) = \#(B \cap N)$$

where $\Phi(B)$ is a random variable and $\#$ is a counting measure, which gives the number of points in some set. In this mathematical expression the point process is denoted by:

$$N.$$

On the other hand, the symbol:

$$\Phi$$

represents the number of points of $N$ in $B$. In the context of random measures, one can write:

$$\Phi(B) = n$$

to denote that there is the set $B$ that contains $n$ points of $N$. In other words, a point process can be considered as a random measure that assigns some non-negative integer-valued measure to sets. [1] This interpretation has motivated a point process being considered just another name for a random counting measure, [9]: 106 and the techniques of random measure theory offering another way to study point processes, [1] [10] which also induces the use of the various notations used in integration and measure theory.
The different interpretations of point processes as random sets and counting measures are captured with the often used notation, [1] [3] [8] [11] in which $N$ denotes a set of random points and $N(B)$ denotes a random variable giving the number of points of $N$ in $B$ (hence it is a random counting measure). Denoting the counting measure again with $\#$, this dual notation implies:

$$N(B) = \#(B \cap N)$$
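Interpreted computationally, this dual notation is easy to mirror in a short simulation. The following is a minimal sketch, not taken from the source: the unit square, the intensity 100 and the helper name N_of are assumptions made for illustration. A realization of the process is stored as an array of points, and the counting measure $N(B) = \#(B \cap N)$ is evaluated on a rectangular Borel set $B$.

```python
# Hedged sketch: N as a set of random points, N(B) as the counting measure #(B ∩ N).
import numpy as np

rng = np.random.default_rng(0)

# One realization of a homogeneous Poisson point process on [0, 1]^2 (assumed example).
num_points = rng.poisson(lam=100)
N = rng.random((num_points, 2))          # N as an array of random points

def N_of(B, points):
    """Counting measure: number of points falling in the box B = (xmin, xmax, ymin, ymax)."""
    xmin, xmax, ymin, ymax = B
    inside = (points[:, 0] >= xmin) & (points[:, 0] < xmax) \
           & (points[:, 1] >= ymin) & (points[:, 1] < ymax)
    return int(inside.sum())             # #(B ∩ N)

B = (0.0, 0.5, 0.0, 0.5)                 # a Borel set, here the lower-left sub-square
print(N_of(B, N))                        # the random variable N(B)
```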
If $f$ is some measurable function on $\mathbf{R}^d$, then the sum of $f(x)$ over all the points $x$ in $N$ can be written in a number of ways [1] [3] such as:

$$f(x_1) + f(x_2) + \cdots$$

which has the random sequence appearance, or with set notation as:

$$\sum_{x \in N} f(x)$$

or, equivalently, with integration notation as:

$$\int_{\mathbf{R}^d} f(x) \, N(dx)$$
which puts an emphasis on the interpretation of $N$ as a random counting measure. An alternative integration notation may be used to write this integral as:

$$\int_{\mathbf{R}^d} f \, dN$$
The dual interpretation of point processes is illustrated when writing the number of points of $N$ in a set $B$ as:

$$N(B) = \sum_{x \in N} 1_B(x)$$

where the indicator function $1_B(x) = 1$ if the point $x$ is in $B$ and zero otherwise, which in this setting is also known as a Dirac measure. [11] In this expression the random measure interpretation is on the left-hand side while the random set notation is on the right-hand side.
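As an illustration of the sum and indicator notation above, here is a small hedged sketch; the particular function $f$ and the set $B$ are arbitrary choices, not notation from the source. It evaluates $\sum_{x \in N} f(x)$ and $N(B) = \sum_{x \in N} 1_B(x)$ on one simulated realization.

```python
# Minimal sketch (assumed example) of sum_{x in N} f(x) and N(B) = sum_{x in N} 1_B(x).
import numpy as np

rng = np.random.default_rng(1)
N = rng.random((rng.poisson(50), 2))      # one realization on the unit square

f = lambda x: np.exp(-np.linalg.norm(x))  # some measurable function f on R^2

# Sum of f over all points of N (the random sequence / set form of the sum).
sum_f = sum(f(x) for x in N)

# N(B) written as a sum of indicator functions 1_B(x) over the points of N.
in_B = lambda x: 1 if (x[0] < 0.5 and x[1] < 0.5) else 0   # indicator of B = [0, 0.5)^2
N_B = sum(in_B(x) for x in N)

print(sum_f, N_B)
```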
The average or expected value of a sum of functions over a point process is written as: [1] [3]

$$E\left[\sum_{x \in N} f(x)\right] = \int_{\mathcal{N}} \sum_{x \in N} f(x) \, P(dN)$$
where (in the random measure sense) $P$ is an appropriate probability measure defined on the space of counting measures $\mathcal{N}$. The expected value of $N(B)$ can be written as: [1]

$$E[N(B)] = E\left[\sum_{x \in N} 1_B(x)\right] = \int_{\mathcal{N}} \sum_{x \in N} 1_B(x) \, P(dN)$$

which is also known as the first moment measure of $N$. The expectation of such a random sum, known as a shot noise process in the theory of point processes, can be calculated with Campbell's theorem. [2]
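Since Campbell's theorem is invoked here, the following Monte Carlo sketch checks the identity numerically under stated assumptions: a homogeneous Poisson point process on the unit square with intensity 40 and the illustrative test function $f(x, y) = xy$, whose integral over the square is 1/4. Under Campbell's theorem the expected random sum should be close to $\lambda \int f(x)\,dx = 10$.

```python
# Hedged Monte Carlo illustration of Campbell's theorem for a homogeneous Poisson
# point process on [0,1]^2: E[sum_{x in N} f(x)] ≈ lam * integral of f over the square.
import numpy as np

rng = np.random.default_rng(2)
lam = 40.0
f = lambda p: p[:, 0] * p[:, 1]          # f(x, y) = x * y, applied to an array of points

totals = []
for _ in range(5000):                    # average the random sum over many realizations
    n = rng.poisson(lam)
    pts = rng.random((n, 2))
    totals.append(f(pts).sum() if n > 0 else 0.0)

print(np.mean(totals))                   # empirical E[sum f(x)]
print(lam * 0.25)                        # Campbell's theorem: lam * ∫ f = 10.0
```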
Point processes are employed in other mathematical and statistical disciplines, hence the notation may be used in fields such as stochastic geometry, spatial statistics or continuum percolation theory, and in areas which use the methods and theory from these fields.
Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event.
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events.
A random variable is a mathematical formalization of a quantity or object which depends on random events. The term 'random variable' in its mathematical definition refers to neither randomness nor variability but instead is a mathematical function in which the domain is the set of possible outcomes in a sample space and the range is a measurable space, often the real numbers.
In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a family of random variables in a probability space, where the index of the family often has the interpretation of time. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. Examples include the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule. Stochastic processes have applications in many disciplines such as biology, chemistry, ecology, neuroscience, physics, image processing, signal processing, control theory, information theory, computer science, and telecommunications. Furthermore, seemingly random changes in financial markets have motivated the extensive use of stochastic processes in finance.
In probability theory, there exist several different notions of convergence of sequences of random variables, including convergence in probability, convergence in distribution, and almost sure convergence. The different notions of convergence capture different properties about the sequence, with some notions of convergence being stronger than others. For example, convergence in distribution tells us about the limit distribution of a sequence of random variables. This is a weaker notion than convergence in probability, which tells us about the value a random variable will take, rather than just the distribution.
A mathematical symbol is a figure or a combination of figures that is used to represent a mathematical object, an action on mathematical objects, a relation between mathematical objects, or for structuring the other symbols that occur in a formula. As formulas are entirely composed of symbols of various types, many symbols are needed for expressing all mathematics.
In statistics and probability theory, a point process or point field is a collection of mathematical points randomly located on a mathematical space such as the real line or Euclidean space. Point processes can be used for spatial data analysis, which is of interest in such diverse disciplines as forestry, plant ecology, epidemiology, geography, seismology, materials science, astronomy, telecommunications, computational neuroscience, economics and others.
In probability theory, a random element is a generalization of the concept of random variable to more complicated spaces than the simple real line. The concept was introduced by Maurice Fréchet who commented that the “development of probability theory and expansion of area of its applications have led to necessity to pass from schemes where (random) outcomes of experiments can be described by number or a finite set of numbers, to schemes where outcomes of experiments represent, for example, vectors, functions, processes, fields, series, transformations, and also sets or collections of sets.”
In mathematics, ergodicity expresses the idea that a point of a moving system, either a dynamical system or a stochastic process, will eventually visit all parts of the space that the system moves in, in a uniform and random sense. This implies that the average behavior of the system can be deduced from the trajectory of a "typical" point. Equivalently, a sufficiently large collection of random samples from a process can represent the average statistical properties of the entire process. Ergodicity is a property of the system; it is a statement that the system cannot be reduced or factored into smaller components. Ergodic theory is the study of systems possessing ergodicity.
In the theory of stochastic processes in discrete time, a part of the mathematical theory of probability, the Doob decomposition theorem gives a unique decomposition of every adapted and integrable stochastic process as the sum of a martingale and a predictable process starting at zero. The theorem was proved by and is named for Joseph L. Doob.
In probability theory, a Laplace functional refers to one of two possible mathematical functions of functions or, more precisely, functionals that serve as mathematical tools for studying either point processes or concentration of measure properties of metric spaces. One type of Laplace functional, also known as a characteristic functional is defined in relation to a point process, which can be interpreted as random counting measures, and has applications in characterizing and deriving results on point processes. Its definition is analogous to a characteristic function for a random variable.
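For concreteness, a standard form of the point-process definition (not written out in the summary above) is, for a point process $N$ on $\mathbf{R}^d$ and a non-negative measurable function $f$:

$$L_N(f) = E\left[\exp\left(-\sum_{x \in N} f(x)\right)\right],$$

and for a Poisson point process with intensity measure $\Lambda$ this takes the closed form $L_N(f) = \exp\left(-\int_{\mathbf{R}^d} \left(1 - e^{-f(x)}\right) \Lambda(dx)\right)$.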
In probability theory and statistics, Campbell's theorem or the Campbell–Hardy theorem is either a particular equation or set of results relating to the expectation of a function summed over a point process to an integral involving the mean measure of the point process, which allows for the calculation of expected value and variance of the random sum. One version of the theorem, also known as Campbell's formula, entails an integral equation for the aforementioned sum over a general point process, and not necessarily a Poisson point process. There also exist equations involving moment measures and factorial moment measures that are considered versions of Campbell's formula. All these results are employed in probability and statistics with a particular importance in the theory of point processes and queueing theory as well as the related fields stochastic geometry, continuum percolation theory, and spatial statistics.
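In the notation used earlier, one common statement of Campbell's formula (given here as standard background, not quoted from the source) is:

$$E\left[\sum_{x \in N} f(x)\right] = \int_{\mathbf{R}^d} f(x) \, \Lambda(dx),$$

where $\Lambda$ is the intensity or mean measure of $N$; for a homogeneous Poisson point process with constant intensity $\lambda$ this reduces to $\lambda \int_{\mathbf{R}^d} f(x) \, dx$.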
In information theory and telecommunication engineering, the signal-to-interference-plus-noise ratio (SINR) is a quantity used to give theoretical upper bounds on channel capacity in wireless communication systems such as cellular networks. Analogous to the signal-to-noise ratio (SNR) used often in wired communications systems, the SINR is defined as the power of a certain signal of interest divided by the sum of the interference power and the power of some background noise. If the power of the noise term is zero, then the SINR reduces to the signal-to-interference ratio (SIR). Conversely, zero interference reduces the SINR to the SNR, which is used less often when developing mathematical models of wireless networks such as cellular networks.
In probability and statistics, a point process operation or point process transformation is a type of mathematical operation performed on random objects known as point processes, which are often used as mathematical models of phenomena that can be represented as points randomly located in space. These operations can be purely random, deterministic or both, and are used to construct new point processes, which can then also be used as mathematical models. The operations may include removing or thinning points from a point process, combining or superimposing multiple point processes into one point process or transforming the underlying space of the point process into another space. Point process operations and the resulting point processes are used in the theory of point processes and related fields such as stochastic geometry and spatial statistics.
In probability and statistics, a moment measure is a mathematical quantity, function or, more precisely, measure that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. Moment measures generalize the idea of (raw) moments of random variables, hence arise often in the study of point processes and related fields.
In probability and statistics, a factorial moment measure is a mathematical quantity, function or, more precisely, measure that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. Factorial moment measures generalize the idea of factorial moments, which are useful for studying non-negative integer-valued random variables.
In probability and statistics, a spherical contact distribution function, first contact distribution function, or empty space function is a mathematical function that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. More specifically, a spherical contact distribution function is defined as the probability distribution of the radius of a sphere when it first encounters or makes contact with a point in a point process. This function can be contrasted with the nearest neighbour function, which is defined in relation to some point in the point process as being the probability distribution of the distance from that point to its nearest neighbouring point in the same point process.
In probability theory, statistics and related fields, a Poisson point process is a type of random mathematical object that consists of points randomly located on a mathematical space with the essential feature that the points occur independently of one another. The Poisson point process is also called a Poisson random measure, Poisson random point field and Poisson point field. When the process is defined on the real number line, it is often called simply the Poisson process.
In probability and statistics, a nearest neighbor function, nearest neighbor distance distribution, nearest-neighbor distribution function or nearest neighbor distribution is a mathematical function that is defined in relation to mathematical objects known as point processes, which are often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. More specifically, nearest neighbor functions are defined with respect to some point in the point process as being the probability distribution of the distance from this point to its nearest neighboring point in the same point process, hence they are used to describe the probability of another point existing within some distance of a point. A nearest neighbor function can be contrasted with a spherical contact distribution function, which is not defined in reference to some initial point but rather as the probability distribution of the radius of a sphere when it first encounters or makes contact with a point of a point process.
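To make the contrast between the two functions concrete, here is a rough empirical sketch under stated assumptions (a homogeneous Poisson process on the unit square, edge effects ignored, and an arbitrary radius and reference grid): it estimates the nearest neighbour function $G(r)$ from the points themselves and the spherical contact (empty space) function $F(r)$ from reference grid locations.

```python
# Rough sketch contrasting the nearest neighbour distance distribution G(r) and the
# spherical contact / empty space function F(r) for one simulated point pattern.
import numpy as np

rng = np.random.default_rng(3)
N = rng.random((rng.poisson(200), 2))                   # one realization of the process
r = 0.05                                                # an example radius

# G(r): fraction of points whose nearest other point lies within distance r.
d = np.linalg.norm(N[:, None, :] - N[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
G_r = np.mean(d.min(axis=1) <= r)

# F(r): fraction of reference grid locations within distance r of some point of N.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50)), axis=-1).reshape(-1, 2)
dg = np.linalg.norm(grid[:, None, :] - N[None, :, :], axis=-1)
F_r = np.mean(dg.min(axis=1) <= r)

print(G_r, F_r)   # for a Poisson process both should be close to 1 - exp(-200 * pi * r**2)
```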
Mean-field particle methods are a broad class of interacting-type Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states. A natural way to simulate these sophisticated nonlinear Markov processes is to sample a large number of copies of the process, replacing in the evolution equation the unknown distributions of the random states by the sampled empirical measures. In contrast with traditional Monte Carlo and Markov chain Monte Carlo methods, these mean-field particle techniques rely on sequential interacting samples. The terminology mean-field reflects the fact that each of the samples interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes. In other words, starting with a chaotic configuration based on independent copies of the initial state of the nonlinear Markov chain model, the chaos propagates at any time horizon as the size of the system tends to infinity; that is, finite blocks of particles reduce to independent copies of the nonlinear Markov process. This result is called the propagation of chaos property. The terminology "propagation of chaos" originated with the work of Mark Kac in 1976 on a colliding mean-field kinetic gas model.