In probability and statistics, a factorial moment measure is a mathematical quantity, function or, more precisely, measure that is defined in relation to mathematical objects known as point processes, which are types of stochastic processes often used as mathematical models of physical phenomena representable as randomly positioned points in time, space or both. Moment measures generalize the idea of factorial moments, which are useful for studying non-negative integer-valued random variables. [1]
The first factorial moment measure of a point process coincides with its first moment measure or intensity measure, [2] which gives the expected or average number of points of the point process located in some region of space. In general, if the number of points in some region is considered as a random variable, then the factorial moment measure of this region is the factorial moment of this random variable. [3] Factorial moment measures completely characterize a wide class of point processes, which means they can be used to uniquely identify a point process.
If a factorial moment measure is absolutely continuous with respect to the Lebesgue measure, then it is said to have a density (which is a generalized form of a derivative), and this density is known by a number of names such as factorial moment density and product density, as well as coincidence density, [1] joint intensity, [4] correlation function, or multivariate frequency spectrum. [5] The first and second factorial moment densities of a point process are used in the definition of the pair correlation function, which gives a way to statistically quantify the strength of interaction or correlation between points of a point process. [6]
Factorial moment measures serve as useful tools in the study of point processes [1] [6] [7] as well as the related fields of stochastic geometry [3] and spatial statistics, [6] [8] which are applied in various scientific and engineering disciplines such as biology, geology, physics, and telecommunications. [1] [3] [9]
Point processes are mathematical objects that are defined on some underlying mathematical space. Since these processes are often used to represent collections of points randomly scattered in space, time or both, the underlying space is usually d-dimensional Euclidean space denoted here by $\mathbf{R}^d$, but they can be defined on more abstract mathematical spaces. [7]
Point processes have a number of interpretations, which is reflected by the various types of point process notation. [3] [9] For example, if a point $x$ belongs to or is a member of a point process, denoted by $N$, then this can be written as: [3]

$$x \in N,$$

and represents the point process being interpreted as a random set. Alternatively, the number of points of $N$ located in some Borel set $B$ is often written as: [2] [3] [8]

$$N(B),$$

which reflects a random measure interpretation for point processes. These two notations are often used in parallel or interchangeably. [2] [3] [8]
For some positive integer $n$, the $n$-th factorial power of a point process $N$ on $\mathbf{R}^d$ is defined as: [2]

$$N^{(n)}(B_1 \times \cdots \times B_n) = \sum_{x_1, \dots, x_n \in N}^{\neq} \prod_{i=1}^{n} \mathbf{1}_{B_i}(x_i),$$

where $B_1, \dots, B_n$ is a collection of not necessarily disjoint Borel sets in $\mathbf{R}^d$, which form an $n$-fold Cartesian product of sets denoted by:

$$B = B_1 \times \cdots \times B_n.$$

The symbol $\mathbf{1}$ denotes an indicator function such that $\mathbf{1}_{B_i}$ is a Dirac measure for the set $B_i$. The summation in the above expression is performed over all $n$-tuples of distinct points, including permutations, which can be contrasted with the definition of the $n$-th power of a point process. The symbol $\prod$ denotes multiplication, and the existence of various point process notation means that the $n$-th factorial power of a point process is sometimes defined using other notation. [2]
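For a finite point pattern in one dimension, the definition above can be sketched directly in code. The helper below is illustrative (the function name and parameters are not from the cited sources): it sums the product of indicators over all ordered tuples of distinct points, matching the restricted summation in the definition.

```python
import itertools

def factorial_power(points, intervals):
    """N^(n)(B1 x ... x Bn) for a finite 1-D point pattern.

    points    : list of real numbers (a realization of the point process)
    intervals : list of n (lo, hi) pairs, the Borel sets B1, ..., Bn
    """
    n = len(intervals)
    total = 0
    # Ordered n-tuples of *distinct* points (permutations included),
    # matching the "not equal" restriction in the definition.
    for tup in itertools.permutations(points, n):
        total += all(lo <= x <= hi for x, (lo, hi) in zip(tup, intervals))
    return total

# With points {0.2, 0.5, 0.8} and B1 = B2 = [0, 1], every ordered pair of
# distinct points qualifies: 3 * 2 = 6 = N(B)(N(B) - 1).
print(factorial_power([0.2, 0.5, 0.8], [(0, 1), (0, 1)]))  # 6
```

Note the brute-force enumeration is exponential in $n$ and only intended to make the definition concrete.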
The $n$-th factorial moment measure or $n$-th order factorial moment measure is defined as:

$$M^{(n)}(B_1 \times \cdots \times B_n) = \operatorname{E}\left[N^{(n)}(B_1 \times \cdots \times B_n)\right],$$

where $\operatorname{E}$ denotes the expectation (operator) of the point process $N$. In other words, the $n$-th factorial moment measure is the expectation of the $n$-th factorial power of some point process.
The $n$-th factorial moment measure of a point process $N$ is equivalently defined [3] by:

$$\int_{(\mathbf{R}^d)^n} f(x_1, \dots, x_n) \, M^{(n)}(dx_1, \dots, dx_n) = \operatorname{E}\left[\sum_{x_1, \dots, x_n \in N}^{\neq} f(x_1, \dots, x_n)\right],$$

where $f$ is any non-negative measurable function on $(\mathbf{R}^d)^n$, and the above summation is performed over all $n$-tuples of distinct points, including permutations. Consequently, the factorial moment measure is defined such that no point is repeated in the product set, as opposed to the moment measure. [7]
The first factorial moment measure coincides with the first moment measure: [2]

$$M^{(1)}(B) = M^{1}(B) = \Lambda(B),$$

where $\Lambda(B)$ is known, among other terms, as the intensity measure [3] or mean measure, [10] and is interpreted as the expected number of points of $N$ found or located in the set $B$.
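The interpretation of the intensity measure as an expected count can be checked empirically. The sketch below (all parameters are illustrative choices) estimates $\Lambda(B) = \operatorname{E}[N(B)]$ for a binomial point process of $n$ independent uniform points on $[0, 1]$, for which $\Lambda(B) = n|B|$ exactly.

```python
import random

random.seed(1)
n, trials = 20, 100_000
B = (0.25, 0.75)        # |B| = 0.5, so Lambda(B) = n * |B| = 10

# Average the number of points landing in B over many realizations.
mean_count = sum(
    sum(B[0] <= random.random() <= B[1] for _ in range(n))
    for _ in range(trials)
) / trials

print(round(mean_count, 2))   # close to 10
```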
The second factorial moment measure for two Borel sets $A$ and $B$ is:

$$M^{(2)}(A \times B) = M^{2}(A \times B) - \Lambda(A \cap B).$$
For some Borel set $B$, the namesake of this measure is revealed when $B_1 = \cdots = B_n = B$, in which case the $n$-th factorial moment measure reduces to:

$$M^{(n)}(B \times \cdots \times B) = \operatorname{E}\left[N(B)\,(N(B) - 1) \cdots (N(B) - n + 1)\right],$$

which is the $n$-th factorial moment of the random variable $N(B)$. [3]
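This reduction is a deterministic combinatorial identity for each realization: the number of ordered $n$-tuples of distinct points lying in $B$ is the falling factorial $k(k-1)\cdots(k-n+1)$ with $k = N(B)$. A small self-contained check (the pattern and set are hypothetical examples):

```python
import itertools

points = [0.1, 0.3, 0.4, 0.7, 0.9]   # a fixed finite point pattern
B = (0.0, 0.8)                        # Borel set, here an interval
n = 3

k = sum(B[0] <= x <= B[1] for x in points)   # N(B) = 4 here

# Count ordered n-tuples of distinct points lying in B x B x B.
count = sum(
    all(B[0] <= x <= B[1] for x in tup)
    for tup in itertools.permutations(points, n)
)

falling = k * (k - 1) * (k - 2)   # k(k-1)(k-2) for n = 3
print(count, falling)             # both equal 24 since k = 4
```

Taking expectations of both counts over the randomness of the process recovers the identity in the displayed equation.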
If a factorial moment measure is absolutely continuous with respect to the Lebesgue measure, then it has a density (more precisely, a Radon–Nikodym derivative), and this density is known as the factorial moment density or product density, joint intensity, correlation function, or multivariate frequency spectrum. Denoting the $n$-th factorial moment density by $\mu^{(n)}(x_1, \dots, x_n)$, it is defined with respect to the equation: [3]

$$M^{(n)}(B_1 \times \cdots \times B_n) = \int_{B_1} \cdots \int_{B_n} \mu^{(n)}(x_1, \dots, x_n) \, dx_1 \cdots dx_n.$$

Furthermore, this means that the following expression holds:

$$\operatorname{E}\left[\sum_{x_1, \dots, x_n \in N}^{\neq} f(x_1, \dots, x_n)\right] = \int_{(\mathbf{R}^d)^n} f(x_1, \dots, x_n) \, \mu^{(n)}(x_1, \dots, x_n) \, dx_1 \cdots dx_n,$$

where $f$ is any non-negative bounded measurable function defined on $(\mathbf{R}^d)^n$.
In spatial statistics and stochastic geometry, to measure the statistical correlation between points of a point process, the pair correlation function of a point process $N$ is defined as: [3] [6]

$$g(x_1, x_2) = \frac{\mu^{(2)}(x_1, x_2)}{\mu^{(1)}(x_1)\,\mu^{(1)}(x_2)},$$

where the points $x_1, x_2 \in \mathbf{R}^d$. In general, $g(x_1, x_2) \geq 0$, whereas $g(x_1, x_2) = 1$ corresponds to no correlation (between points) in the typical statistical sense. [6]
For a general Poisson point process with intensity measure $\Lambda$, the $n$-th factorial moment measure is given by the expression: [3]

$$M^{(n)}(B_1 \times \cdots \times B_n) = \prod_{i=1}^{n} \Lambda(B_i),$$

where $\Lambda$ is the intensity measure or first moment measure of $N$, which for some Borel set $B$ is given by:

$$\Lambda(B) = M^{(1)}(B) = \operatorname{E}[N(B)].$$
For a homogeneous Poisson point process with intensity $\lambda > 0$, the $n$-th factorial moment measure is simply: [2]

$$M^{(n)}(B_1 \times \cdots \times B_n) = \lambda^n \prod_{i=1}^{n} |B_i|,$$

where $|B_i|$ is the length, area, or volume (or more generally, the Lebesgue measure) of $B_i$. Furthermore, the $n$-th factorial moment density is: [3]

$$\mu^{(n)}(x_1, \dots, x_n) = \lambda^n.$$

The pair correlation function of the homogeneous Poisson point process is simply

$$g(x_1, x_2) = 1,$$

which reflects the lack of interaction between points of this point process.
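The closed form for the homogeneous case lends itself to a Monte Carlo check. The sketch below (intensity, interval, and trial count are illustrative choices) estimates the second factorial moment $\operatorname{E}[N(B)(N(B)-1)]$ of a homogeneous Poisson process on $[0, 1]$ and compares it with $\lambda^2 |B|^2$.

```python
import random

random.seed(0)
lam = 5.0
B = (0.2, 0.9)            # Borel set, here an interval with |B| = 0.7
trials = 200_000

def points_on_unit_interval(rate):
    """Homogeneous Poisson process on [0, 1] via exponential gaps."""
    pts, t = [], 0.0
    while True:
        t += random.expovariate(rate)
        if t > 1.0:
            return pts
        pts.append(t)

acc = 0
for _ in range(trials):
    n_b = sum(B[0] <= x <= B[1] for x in points_on_unit_interval(lam))
    acc += n_b * (n_b - 1)

estimate = acc / trials
exact = (lam * (B[1] - B[0])) ** 2     # lambda^2 |B|^2 = 12.25
print(round(estimate, 2), exact)
```

Simulating via exponential inter-arrival gaps avoids needing a Poisson sampler, which the standard library's `random` module does not provide.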
The expectations of general functionals of simple point processes, provided certain mathematical conditions hold, have (possibly infinite) expansions or series consisting of the corresponding factorial moment measures. [11] [12] In comparison to the Taylor series, which consists of a series of derivatives of some function, the $n$-th factorial moment measure plays a role analogous to that of the $n$-th derivative in the Taylor series. In other words, given a general functional $f$ of some simple point process, this Taylor-like theorem for non-Poisson point processes means that an expansion exists for the expectation $\operatorname{E}[f(N)]$, provided some mathematical condition is satisfied which ensures convergence of the expansion.