Poisson-type random measure

Poisson-type random measures are a family of three random counting measures which are closed under restriction to a subspace, i.e. closed under thinning. They are the only distributions in the canonical non-negative power series family of distributions to possess this property and include the Poisson distribution, negative binomial distribution, and binomial distribution.[1] The PT family of distributions is also known as the Katz family of distributions,[2] the Panjer or (a,b,0) class of distributions,[3] and may be retrieved through the Conway–Maxwell–Poisson distribution.[4]

Throwing stones

Let $K$ be a non-negative integer-valued random variable with law $\kappa$, mean $c = \mathbb{E}(K)$ and, when it exists, variance $\delta^2 = \mathbb{V}\mathrm{ar}(K)$. Let $\nu$ be a probability measure on the measurable space $(E, \mathcal{E})$. Let $\mathbf{X} = (X_1, X_2, \dots)$ be a collection of iid random variables (stones) taking values in $(E, \mathcal{E})$ with law $\nu$.

The random counting measure $N$ on $(E, \mathcal{E})$ depends on the pair of deterministic probability measures $(\kappa, \nu)$ through the stone throwing construction (STC)[5]

$$N(A) = \sum_{i=1}^{K} \mathbb{I}_A(X_i), \qquad A \in \mathcal{E},$$

where $K$ has law $\kappa$ and the iid stones $X_1, X_2, \dots$ have law $\nu$. $N$ is a mixed binomial process.[6]
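
The construction is straightforward to simulate. The following is a minimal sketch in Python (NumPy); the helper name `sample_stc` and the concrete choices $E = [0, 1]$, $\nu$ uniform, and $K \sim \mathrm{Poisson}(5)$ are illustrative assumptions, not part of the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_stc(sample_K, sample_stones, size=1):
    """Stone throwing construction (sketch): draw K ~ kappa, then throw
    K iid stones X_1, ..., X_K ~ nu.  Returns one array of stone
    locations per realization; N(A) is the number of stones in A."""
    return [sample_stones(sample_K()) for _ in range(size)]

# Illustrative assumptions: K ~ Poisson(5), stones uniform on E = [0, 1].
sample_K = lambda: rng.poisson(5)
sample_stones = lambda k: rng.uniform(0.0, 1.0, size=k)

for stones in sample_stc(sample_K, sample_stones, size=3):
    print("K =", stones.size, " N([0, 0.5)) =", int(np.sum(stones < 0.5)))
```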

Let $\mathcal{E}_+$ be the collection of positive $\mathcal{E}$-measurable functions. The probability law of $N$ is encoded in the Laplace functional

$$\mathbb{E}\,e^{-Nf} = \mathbb{E}\!\left(\prod_{i=1}^{K} e^{-f(X_i)}\right) = \mathbb{E}\!\left((\nu e^{-f})^{K}\right) = \psi(\nu e^{-f}), \qquad f \in \mathcal{E}_+,$$

where $\psi(t) = \mathbb{E}\,t^{K}$ is the probability generating function (pgf) of $K$ and $\nu f$ denotes $\int_E f(x)\,\nu(\mathrm{d}x)$. The mean and variance are given by

$$\mathbb{E}(Nf) = c\,\nu f$$

and

$$\mathbb{V}\mathrm{ar}(Nf) = c\,\nu f^{2} + (\delta^{2} - c)(\nu f)^{2}.$$

The covariance for arbitrary $f, g \in \mathcal{E}_+$ is given by

$$\mathbb{C}\mathrm{ov}(Nf, Ng) = c\,\nu(fg) + (\delta^{2} - c)\,\nu f\,\nu g.$$
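
These moment formulas are easy to verify numerically. The following Monte Carlo check is a sketch under the same illustrative assumptions as above ($K \sim \mathrm{Poisson}(c)$ with $c = 5$, so $\delta^2 = c$; $\nu$ uniform on $[0, 1]$; $f(x) = x$ and $g(x) = x^2$, so $\nu f = 1/2$, $\nu f^2 = 1/3$, $\nu(fg) = 1/4$).

```python
import numpy as np

rng = np.random.default_rng(1)
c, n_sim = 5.0, 200_000          # mean of K; for Poisson, delta^2 = c

Nf = np.empty(n_sim)
Ng = np.empty(n_sim)
for i in range(n_sim):
    stones = rng.uniform(0.0, 1.0, size=rng.poisson(c))
    Nf[i] = np.sum(stones)        # Nf with f(x) = x
    Ng[i] = np.sum(stones ** 2)   # Ng with g(x) = x^2

# E(Nf) = c*nu f, Var(Nf) = c*nu f^2 + (delta^2 - c)(nu f)^2,
# Cov(Nf, Ng) = c*nu(fg) + (delta^2 - c)*nu f*nu g; here delta^2 - c = 0.
print(Nf.mean(), "vs", c * 0.5)                 # ~2.5
print(Nf.var(), "vs", c / 3)                    # ~1.667
print(np.cov(Nf, Ng)[0, 1], "vs", c * 0.25)     # ~1.25
```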

When $K$ is Poisson, negative binomial, or binomial, it is said to be Poisson-type (PT). The joint distribution of the collection $(NA, NB)$ for disjoint $A, B \in \mathcal{E}$ with $A \cup B = E$ is, for $x, y \in \mathbb{N}_0$ and $n = x + y$,

$$\mathbb{P}(NA = x, NB = y) = \mathbb{P}(NA = x \mid K = n)\,\mathbb{P}(K = n) = \binom{n}{x}\nu(A)^{x}\nu(B)^{y}\,\mathbb{P}(K = n).$$
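
As a worked special case (not taken from the source, but a direct consequence of the formula above): if $K$ is Poisson with mean $c$, then, using $\nu(A) + \nu(B) = 1$,

$$\mathbb{P}(NA = x, NB = y) = \binom{x+y}{x}\nu(A)^{x}\nu(B)^{y}\, e^{-c}\frac{c^{x+y}}{(x+y)!} = e^{-c\nu(A)}\frac{(c\nu(A))^{x}}{x!}\cdot e^{-c\nu(B)}\frac{(c\nu(B))^{y}}{y!},$$

so $NA$ and $NB$ are independent Poisson counts; this is the splitting property referred to below.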

The following result extends the construction of a random measure $N$ to the case when the collection $\mathbf{X}$ is expanded to $(\mathbf{X}, \mathbf{Y}) = ((X_1, Y_1), (X_2, Y_2), \dots)$, where $Y_i$ is a random transformation of $X_i$. Heuristically, $Y_i$ represents some properties (marks) of $X_i$. We assume that the conditional law of $Y_i$ follows some transition kernel $Q$ according to $\mathbb{P}(Y_i \in B \mid X_i = x) = Q(x, B)$.

Theorem: Marked STC

Consider the random measure $N$ and the transition probability kernel $Q$ from $(E, \mathcal{E})$ into $(F, \mathcal{F})$. Assume that, given the collection $\mathbf{X}$, the variables $\mathbf{Y} = (Y_1, Y_2, \dots)$ are conditionally independent with $Y_i \sim Q(X_i, \cdot)$. Then $M = \sum_{i=1}^{K} \delta_{(X_i, Y_i)}$ is a random measure on $(E \times F, \mathcal{E} \otimes \mathcal{F})$. Here $\nu \times Q$ is understood as $(\nu \times Q)(\mathrm{d}x, \mathrm{d}y) = \nu(\mathrm{d}x)\,Q(x, \mathrm{d}y)$. Moreover, for any $f \in (\mathcal{E} \otimes \mathcal{F})_+$ we have that $\mathbb{E}\,e^{-Mf} = \psi(\nu e^{-g})$, where $\psi$ is the pgf of $K$ and $g \in \mathcal{E}_+$ is defined as

$$e^{-g(x)} = \int_F Q(x, \mathrm{d}y)\, e^{-f(x, y)}.$$
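
A minimal simulation sketch of the marked construction follows (Python/NumPy); the kernel $Q(x, \cdot) = \mathrm{Normal}(x, 0.1^2)$ and the negative binomial choice for $K$ are hypothetical illustrations, not prescribed by the source.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_marked_stc(sample_K, sample_stones, sample_mark, size=1):
    """Marked STC (sketch): throw K stones X_i ~ nu, then, conditionally on
    the stones, draw marks Y_i ~ Q(X_i, .) independently.  Each realization
    is an array of (stone, mark) pairs, i.e. a point pattern on E x F."""
    out = []
    for _ in range(size):
        x = sample_stones(sample_K())
        y = np.array([sample_mark(xi) for xi in x])
        out.append(np.column_stack([x, y]) if x.size else np.empty((0, 2)))
    return out

# Hypothetical choices: K ~ NegBin(4, 0.5), stones uniform on [0, 1],
# marks Y_i | X_i = x ~ Normal(x, 0.1^2) playing the role of Q(x, .).
sample_K = lambda: rng.negative_binomial(4, 0.5)
sample_stones = lambda k: rng.uniform(0.0, 1.0, size=k)
sample_mark = lambda x: rng.normal(loc=x, scale=0.1)

for m in sample_marked_stc(sample_K, sample_stones, sample_mark, size=2):
    print(m.shape)   # (number of marked stones, 2)
```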

The following corollary is an immediate consequence.

Corollary: Restricted STC

The quantity $N_A = \sum_{i=1}^{K} \mathbb{I}_A(X_i)\,\delta_{X_i}$ is a well-defined random measure on the measurable subspace $(A, \mathcal{E}_A)$ where $\mathcal{E}_A = \{A \cap B : B \in \mathcal{E}\}$ and $\nu_A(B) = \nu(A \cap B)/\nu(A)$. Moreover, for any $f \in (\mathcal{E}_A)_+$, we have that $\mathbb{E}\,e^{-N_A f} = \psi(1 - b + b\,\nu_A e^{-f})$ where $b = \nu(A)$.

Note that $\psi(1 - b + b\,\nu_A e^{-f}) = \psi\big(\nu(\mathbb{I}_A e^{-f}) + \nu(A^c)\big)$, where we use $\nu(\mathbb{I}_A e^{-f}) = b\,\nu_A e^{-f}$ and $\nu(A^c) = 1 - b$.
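
The corollary can be checked numerically. The sketch below uses illustrative assumptions: $K$ negative binomial with pgf $\psi(s) = \big(p/(1-(1-p)s)\big)^{r}$, $\nu$ uniform on $[0, 1]$, $A = [0, b)$, and the constant function $f = -\log t$ on $A$, so that $\mathbb{E}\,e^{-N_A f} = \mathbb{E}\,t^{N(A)}$ should equal $\psi(1 - b + bt)$.

```python
import numpy as np

rng = np.random.default_rng(3)
r, p, b, t = 4, 0.5, 0.3, 0.7        # K ~ NegBin(r, p), nu(A) = b, pgf argument t
n_sim = 200_000

# Restriction (thinning): count only the stones that land in A = [0, b).
counts_in_A = np.array([
    np.sum(rng.uniform(0.0, 1.0, size=rng.negative_binomial(r, p)) < b)
    for _ in range(n_sim)
])

psi = lambda s: (p / (1.0 - (1.0 - p) * s)) ** r     # pgf of K ~ NegBin(r, p)
print(np.mean(t ** counts_in_A), "vs", psi(1.0 - b + b * t))   # should agree
```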

Collecting bones

The probability law of the random measure $N$ is determined by its Laplace functional and hence by the generating function of $K$.

Definition: Bone

Let $K_A = N(A)$ be the counting variable of $N$ restricted to $A \in \mathcal{E}$ with $b = \nu(A) > 0$. When $K_A$ and $K$ share the same family of laws subject to a rescaling $h_A(\theta)$ of the parameter $\theta$, then $K$ is called a bone distribution. The bone condition for the pgf $\psi_\theta$ of $K$ is given by

$$\psi_\theta(1 - b + bt) = \psi_{h_A(\theta)}(t).$$
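
As a worked check of the bone condition (using the standard Poisson pgf, not a formula quoted from the source): for $K$ Poisson with canonical parameter $\theta$,

$$\psi_\theta(t) = e^{\theta(t-1)} \quad\Longrightarrow\quad \psi_\theta(1 - b + bt) = e^{\theta(bt - b)} = e^{b\theta(t-1)} = \psi_{b\theta}(t),$$

so the Poisson distribution satisfies the bone condition with parameter rescaling $h_A(\theta) = b\,\theta = \nu(A)\,\theta$.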

Equipped with the notion of a bone distribution and the bone condition, the main result for the existence and uniqueness of Poisson-type (PT) random counting measures is given as follows.

Theorem: existence and uniqueness of PT random measures

Assume that $K$ with pgf $\psi_\theta$ belongs to the canonical non-negative power series (NNPS) family of distributions and $\{0, 1\} \subseteq \operatorname{supp}(K)$. Consider the random measure $N$ on the space $(E, \mathcal{E})$ and assume that $\nu$ is diffuse. Then for any $A \in \mathcal{E}$ with $b = \nu(A) \in (0, 1)$ there exists a mapping $h_A : \Theta \to \Theta$ such that the restricted random measure $N_A$ is $\mathrm{STC}(\kappa_{h_A(\theta)}, \nu_A)$, that is,

$$\mathbb{E}\,e^{-N_A f} = \psi_{h_A(\theta)}(\nu_A e^{-f}), \qquad f \in (\mathcal{E}_A)_+,$$

iff $K$ is Poisson, negative binomial, or binomial (Poisson-type).

The proof of this theorem is based on a generalized additive Cauchy equation and its solutions. The theorem states that, out of all NNPS distributions, only the PT distributions have the property that their restrictions share the same family of distributions as $K$, that is, they are closed under thinning. The PT random measures are the Poisson random measure, negative binomial random measure, and binomial random measure. The Poisson random measure is additive with independence on disjoint sets, whereas the negative binomial has positive covariance and the binomial has negative covariance. The binomial process is a limiting case of the binomial random measure where $p \to 1$.
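
The analogous bone-condition checks for the other two members can be worked out from their standard pgfs (the parameterizations below are the usual ones and are stated here only for illustration). For the binomial with pgf $\psi_{n,p}(t) = (1 - p + pt)^{n}$,

$$\psi_{n,p}(1 - b + bt) = \big(1 - pb + pbt\big)^{n} = \psi_{n,\,pb}(t),$$

so thinning rescales $p \mapsto pb$ with $n$ fixed. For the negative binomial with pgf $\psi_{r,p}(t) = \big(p/(1 - (1-p)t)\big)^{r}$,

$$\psi_{r,p}(1 - b + bt) = \left(\frac{p}{p + (1-p)b - (1-p)bt}\right)^{r} = \left(\frac{p'}{1 - (1 - p')t}\right)^{r} = \psi_{r,\,p'}(t), \qquad p' = \frac{p}{p + (1-p)b},$$

so thinning rescales $p \mapsto p'$ with $r$ fixed, consistent with closure under thinning.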

Distributional self-similarity applications

The "bone" condition on the pgf $\psi_\theta$ of $K$ encodes a distributional self-similarity property whereby all counts in restrictions (thinnings) to subspaces (encoded by the pgf $\psi_{h_A(\theta)}$) are in the same family as that of $K$ through a rescaling of the canonical parameter. These ideas appear closely connected to those of self-decomposability and stability of discrete random variables.[7] Binomial thinning is a foundational model for count time series.[8][9] The Poisson random measure has the well-known splitting property, is prototypical to the class of additive (completely random) random measures, and is related to the structure of Lévy processes, the jumps of Kolmogorov equations, and the excursions of Brownian motion.[10] Hence the self-similarity property of the PT family is fundamental to multiple areas. The PT family members are "primitives" or prototypical random measures from which many random measures and processes can be constructed.

Related Research Articles

<span class="mw-page-title-main">Chi-squared distribution</span> Probability distribution and special case of gamma distribution

In probability theory and statistics, the chi-squared distribution with degrees of freedom is the distribution of a sum of the squares of independent standard normal random variables.

In mathematical analysis, Hölder's inequality, named after Otto Hölder, is a fundamental inequality between integrals and an indispensable tool for the study of Lp spaces.

<span class="mw-page-title-main">Beta distribution</span> Probability distribution

In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1] or in terms of two positive parameters, denoted by alpha (α) and beta (β), that appear as exponents of the variable and its complement to 1, respectively, and control the shape of the distribution.

<span class="mw-page-title-main">Gamma distribution</span> Probability distribution

In probability theory and statistics, the gamma distribution is a versatile two-parameter family of continuous probability distributions. The exponential distribution, Erlang distribution, and chi-squared distribution are special cases of the gamma distribution. There are two equivalent parameterizations in common use:

  1. With a shape parameter α and a scale parameter θ
  2. With a shape parameter and a rate parameter

In probability and statistics, an exponential family is a parametric set of probability distributions of a certain form, specified below. This special form is chosen for mathematical convenience, including the enabling of the user to calculate expectations, covariances using differentiation based on some useful algebraic properties, as well as for generality, as exponential families are in a sense very natural sets of distributions to consider. The term exponential class is sometimes used in place of "exponential family", or the older term Koopman–Darmois family. Sometimes loosely referred to as "the" exponential family, this class of distributions is distinct because they all possess a variety of desirable properties, most importantly the existence of a sufficient statistic.

In probability theory, a Lévy process, named after the French mathematician Paul Lévy, is a stochastic process with independent, stationary increments: it represents the motion of a point whose successive displacements are random, in which displacements in pairwise disjoint time intervals are independent, and displacements in different time intervals of the same length have identical probability distributions. A Lévy process may thus be viewed as the continuous-time analog of a random walk.

<span class="mw-page-title-main">Rice distribution</span> Probability distribution

In probability theory, the Rice distribution or Rician distribution is the probability distribution of the magnitude of a circularly-symmetric bivariate normal random variable, possibly with non-zero mean (noncentral). It was named after Stephen O. Rice (1907–1986).

Coherent states have been introduced in a physical context, first as quasi-classical states in quantum mechanics, then as the backbone of quantum optics and they are described in that spirit in the article Coherent states. However, they have generated a huge variety of generalizations, which have led to a tremendous amount of literature in mathematical physics. In this article, we sketch the main directions of research on this line. For further details, we refer to several existing surveys.

In probability and statistics, a natural exponential family (NEF) is a class of probability distributions that is a special case of an exponential family (EF).

In operator theory, a branch of mathematics, a positive-definite kernel is a generalization of a positive-definite function or a positive-definite matrix. It was first introduced by James Mercer in the early 20th century, in the context of solving integral operator equations. Since then, positive-definite functions and their various analogues and generalizations have arisen in diverse parts of mathematics. They occur naturally in Fourier analysis, probability theory, operator theory, complex function-theory, moment problems, integral equations, boundary-value problems for partial differential equations, machine learning, embedding problem, information theory, and other areas.

In mathematics — specifically, in stochastic analysis — the infinitesimal generator of a Feller process is a Fourier multiplier operator that encodes a great deal of information about the process.

In fluid dynamics, the Oseen equations describe the flow of a viscous and incompressible fluid at small Reynolds numbers, as formulated by Carl Wilhelm Oseen in 1910. Oseen flow is an improved description of these flows, as compared to Stokes flow, with the (partial) inclusion of convective acceleration.

In probability theory, a Markov kernel is a map that in the general theory of Markov processes plays the role that the transition matrix does in the theory of Markov processes with a finite state space.

<span class="mw-page-title-main">Bending of plates</span> Deformation of slabs under load

Bending of plates, or plate bending, refers to the deflection of a plate perpendicular to the plane of the plate under the action of external forces and moments. The amount of deflection can be determined by solving the differential equations of an appropriate plate theory. The stresses in the plate can be calculated from these deflections. Once the stresses are known, failure theories can be used to determine whether a plate will fail under a given load.

In mathematical physics, the Belinfante–Rosenfeld tensor is a modification of the stress–energy tensor that is constructed from the canonical stress–energy tensor and the spin current so as to be symmetric yet still conserved.

Coherent states have been introduced in a physical context, first as quasi-classical states in quantum mechanics, then as the backbone of quantum optics and they are described in that spirit in the article Coherent states. However, they have generated a huge variety of generalizations, which have led to a tremendous amount of literature in mathematical physics. In this article, we sketch the main directions of research on this line. For further details, we refer to several existing surveys.

<span class="mw-page-title-main">Loop representation in gauge theories and quantum gravity</span> Description of gauge theories using loop operators

Attempts have been made to describe gauge theories in terms of extended objects such as Wilson loops and holonomies. The loop representation is a quantum hamiltonian representation of gauge theories in terms of loops. The aim of the loop representation in the context of Yang–Mills theories is to avoid the redundancy introduced by Gauss gauge symmetries allowing to work directly in the space of physical states. The idea is well known in the context of lattice Yang–Mills theory. Attempts to explore the continuous loop representation was made by Gambini and Trias for canonical Yang–Mills theory, however there were difficulties as they represented singular objects. As we shall see the loop formalism goes far beyond a simple gauge invariant description, in fact it is the natural geometrical framework to treat gauge theories and quantum gravity in terms of their fundamental physical excitations.

Exponential Tilting (ET), Exponential Twisting, or Exponential Change of Measure (ECM) is a distribution shifting technique used in many parts of mathematics. The different exponential tiltings of a random variable is known as the natural exponential family of .

In mathematics, the Poisson boundary is a probability space associated to a random walk. It is an object designed to encode the asymptotic behaviour of the random walk, i.e. how trajectories diverge when the number of steps goes to infinity. Despite being called a boundary it is in general a purely measure-theoretical object and not a boundary in the topological sense. However, in the case where the random walk is on a topological space the Poisson boundary can be related to the Martin boundary, which is an analytic construction yielding a genuine topological boundary. Both boundaries are related to harmonic functions on the space via generalisations of the Poisson formula.

Distributional data analysis is a branch of nonparametric statistics that is related to functional data analysis. It is concerned with random objects that are probability distributions, i.e., the statistical analysis of samples of random distributions where each atom of a sample is a distribution. One of the main challenges in distributional data analysis is that although the space of probability distributions is a convex space, it is not a vector space.

References

  1. Bastian, Caleb; Rempala, Gregory. "Throwing stones and collecting bones: Looking for Poisson-like random measures." Mathematical Methods in the Applied Sciences, 2020. doi:10.1002/mma.6224
  2. Katz, L. "Unified treatment of a broad class of discrete probability distributions." In Classical and Contagious Discrete Distributions, pp. 175–182. Pergamon Press, Oxford, 1965.
  3. Panjer, Harry H. "Recursive Evaluation of a Family of Compound Distributions." ASTIN Bulletin. 1981;12(1):22–26.
  4. Conway, R. W.; Maxwell, W. L. "A Queuing Model with State Dependent Service Rates." Journal of Industrial Engineering. 1962;12.
  5. Cinlar, Erhan. Probability and Stochastics. Springer-Verlag New York; 2011.
  6. Kallenberg, Olav. Random Measures, Theory and Applications. Springer; 2017.
  7. Steutel, F. W.; van Harn, K. "Discrete analogues of self-decomposability and stability." The Annals of Probability. 1979:893–899.
  8. Al-Osh, M. A.; Alzaid, A. A. "First-order integer-valued autoregressive (INAR(1)) process." Journal of Time Series Analysis. 1987;8(3):261–275.
  9. Scotto, Manuel G.; Weiß, Christian H.; Gouveia, Sónia. "Thinning models in the analysis of integer-valued time series: a review." Statistical Modelling. 2015;15(6):590–618.
  10. Cinlar, Erhan. Probability and Stochastics. Springer-Verlag New York; 2011.