Generalized inverse Gaussian distribution

Generalized inverse Gaussian
[Figure: probability density function ("GIG distribution pdf.svg")]
Parameters: a > 0, b > 0, p real
Support: x > 0
PDF: \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})}\, x^{p-1} e^{-(ax + b/x)/2}
Mean: \frac{\sqrt{b}\, K_{p+1}(\sqrt{ab})}{\sqrt{a}\, K_p(\sqrt{ab})}
Mode: \frac{(p-1) + \sqrt{(p-1)^2 + ab}}{a}
Variance: \frac{b}{a}\left[\frac{K_{p+2}(\sqrt{ab})}{K_p(\sqrt{ab})} - \left(\frac{K_{p+1}(\sqrt{ab})}{K_p(\sqrt{ab})}\right)^2\right]
MGF: \left(\frac{a}{a-2t}\right)^{p/2} \frac{K_p(\sqrt{b(a-2t)})}{K_p(\sqrt{ab})}, \quad t < a/2
CF: \left(\frac{a}{a-2it}\right)^{p/2} \frac{K_p(\sqrt{b(a-2it)})}{K_p(\sqrt{ab})}

In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

    f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})}\, x^{p-1} e^{-(ax + b/x)/2}, \qquad x > 0,

where K_p is a modified Bessel function of the second kind, a > 0, b > 0 and p is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, etc. This distribution was first proposed by Étienne Halphen. [1] [2] [3] It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes. [4]
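
For concreteness, here is a minimal Python sketch that evaluates this density and checks its normalization numerically; the function name gig_pdf and the parameter values a = 2, b = 3, p = 0.5 are purely illustrative.

    import numpy as np
    from scipy.special import kv        # modified Bessel function of the second kind, K_p
    from scipy.integrate import quad

    def gig_pdf(x, a, b, p):
        """Density of GIG(a, b, p) in the (a, b, p) parametrization above, for x > 0."""
        norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
        return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

    # Sanity check: the density should integrate to (approximately) 1 over (0, inf).
    total, _ = quad(lambda t: gig_pdf(t, a=2.0, b=3.0, p=0.5), 0, np.inf)
    print(total)   # ≈ 1.0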

Properties

Alternative parametrization

By setting θ = √(ab) and η = √(b/a), we can alternatively express the GIG distribution as

    f(x) = \frac{1}{2\eta K_p(\theta)} \left(\frac{x}{\eta}\right)^{p-1} e^{-\frac{\theta}{2}\left(\frac{x}{\eta} + \frac{\eta}{x}\right)},

where θ is the concentration parameter while η is the scaling parameter.
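
The two parametrizations are straightforward to convert between in code. The sketch below uses illustrative values and assumes that scipy.stats.geninvgauss follows essentially this (θ, η) form, with its shape parameter b playing the role of θ and its scale playing the role of η; that mapping is an assumption worth verifying against your SciPy version.

    import numpy as np
    from scipy.special import kv
    from scipy.stats import geninvgauss

    a, b, p = 2.0, 3.0, 0.5                      # illustrative (a, b, p) parameters
    theta, eta = np.sqrt(a * b), np.sqrt(b / a)  # concentration and scale

    def gig_pdf(x, a, b, p):
        return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

    x = np.linspace(0.5, 5.0, 5)
    print(gig_pdf(x, a, b, p))
    # Assumed mapping: geninvgauss(p, b=theta, scale=eta) matches GIG(a, b, p).
    print(geninvgauss.pdf(x, p, theta, scale=eta))   # should agree with the row above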

Summation

Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible. [5]

Entropy

The entropy of the generalized inverse Gaussian distribution is given as[citation needed]

    H = \frac{1}{2} \log\left(\frac{b}{a}\right) + \log\left(2 K_p(\sqrt{ab})\right) - (p-1) \frac{\left[\frac{d}{d\nu} K_\nu(\sqrt{ab})\right]_{\nu=p}}{K_p(\sqrt{ab})} + \frac{\sqrt{ab}}{2 K_p(\sqrt{ab})} \left(K_{p+1}(\sqrt{ab}) + K_{p-1}(\sqrt{ab})\right)

where \left[\frac{d}{d\nu} K_\nu(\sqrt{ab})\right]_{\nu=p} is a derivative of the modified Bessel function of the second kind with respect to the order ν, evaluated at ν = p.
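
Because the order derivative of K_ν has no elementary closed form, a numerical check is useful. The sketch below evaluates the entropy expression with a central finite difference for that derivative and compares it against a Monte Carlo estimate of −E[log f(X)]; the parameter values, the step size h, and the geninvgauss parameter mapping are the same kinds of assumptions as in the earlier sketches.

    import numpy as np
    from scipy.special import kv
    from scipy.stats import geninvgauss

    def gig_entropy(a, b, p, h=1e-5):
        """Entropy formula above; dK_nu/dnu at nu = p via a central finite difference."""
        w = np.sqrt(a * b)
        kp = kv(p, w)
        dk_dnu = (kv(p + h, w) - kv(p - h, w)) / (2 * h)
        return (0.5 * np.log(b / a) + np.log(2 * kp)
                - (p - 1) * dk_dnu / kp
                + w * (kv(p + 1, w) + kv(p - 1, w)) / (2 * kp))

    a, b, p = 2.0, 3.0, 0.5                      # illustrative values
    print(gig_entropy(a, b, p))

    # Monte Carlo cross-check: -E[log f(X)] with X ~ GIG(a, b, p).
    theta, eta = np.sqrt(a * b), np.sqrt(b / a)
    x = geninvgauss.rvs(p, theta, scale=eta, size=200_000, random_state=0)
    logf = (p / 2) * np.log(a / b) - np.log(2 * kv(p, theta)) + (p - 1) * np.log(x) - (a * x + b / x) / 2
    print(-logf.mean())                          # should be close to the closed form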

Characteristic function

The characteristic function of a random variable X ~ GIG(a, b, p) is given as (for a derivation of the characteristic function, see supplementary materials of [6])

    E\left(e^{itX}\right) = \left(\frac{a}{a - 2it}\right)^{p/2} \frac{K_p\left(\sqrt{(a - 2it)b}\right)}{K_p\left(\sqrt{ab}\right)}

for t ∈ ℝ, where i denotes the imaginary unit.
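
A quick sanity check is to compare this closed form against a Monte Carlo estimate of E[exp(itX)]. In the sketch below the closed form is evaluated with scipy.special.kv at a complex argument, which is assumed to be supported; the parameter values are illustrative.

    import numpy as np
    from scipy.special import kv
    from scipy.stats import geninvgauss

    def gig_cf(t, a, b, p):
        """Closed-form characteristic function above (kv is assumed to accept complex arguments)."""
        z = a - 2j * t
        return (a / z) ** (p / 2) * kv(p, np.sqrt(z * b)) / kv(p, np.sqrt(a * b))

    a, b, p, t = 2.0, 3.0, 0.5, 1.3              # illustrative values
    print(gig_cf(t, a, b, p))

    # Monte Carlo cross-check: E[exp(itX)] with X ~ GIG(a, b, p).
    x = geninvgauss.rvs(p, np.sqrt(a * b), scale=np.sqrt(b / a), size=200_000, random_state=0)
    print(np.exp(1j * t * x).mean())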

Special cases

The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for p = −1/2 and b = 0, respectively. [7] Specifically, an inverse Gaussian distribution of the form

    f(x; \mu, \lambda) = \left[\frac{\lambda}{2\pi x^3}\right]^{1/2} \exp\left(-\frac{\lambda (x - \mu)^2}{2\mu^2 x}\right)

is a GIG with a = λ/μ², b = λ, and p = −1/2. A gamma distribution of the form

    g(x; \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\beta x}

is a GIG with a = 2β, b = 0, and p = α.

Other special cases include the inverse-gamma distribution, for a = 0. [7]
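
These reductions can be checked numerically. The sketch below compares the GIG density with a hand-coded inverse Gaussian density under the stated parameter mapping, and approximates the gamma case by taking b very small (b = 0 cannot be substituted directly into the normalizing constant); all parameter values are illustrative.

    import numpy as np
    from scipy.special import kv
    from scipy.stats import gamma

    def gig_pdf(x, a, b, p):
        return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

    x = np.linspace(0.2, 5.0, 5)

    # Inverse Gaussian IG(mu, lam) versus GIG(a = lam/mu**2, b = lam, p = -1/2).
    mu, lam = 1.5, 2.0
    ig = np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu) ** 2 / (2 * mu**2 * x))
    print(ig)
    print(gig_pdf(x, a=lam / mu**2, b=lam, p=-0.5))     # should match the row above

    # Gamma(alpha, rate beta) as the b -> 0 limit of GIG(a = 2*beta, b, p = alpha).
    alpha, beta = 2.5, 1.2
    print(gamma.pdf(x, alpha, scale=1 / beta))
    print(gig_pdf(x, a=2 * beta, b=1e-10, p=alpha))     # small b approximates the limit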

Conjugate prior for Gaussian

The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture. [8] [9] Let the prior distribution for some hidden variable, say z, be GIG:

    P(z \mid a, b, p) = \operatorname{GIG}(z \mid a, b, p)

and let there be T observed data points, X = x_1, \ldots, x_T, with normal likelihood function, conditioned on z:

    P(X \mid z, \alpha, \beta) = \prod_{i=1}^{T} N(x_i \mid \alpha + \beta z, z)

where N(x \mid \mu, v) is the normal distribution with mean \mu and variance v. Then the posterior for z, given the data, is also GIG:

    P(z \mid X, a, b, p, \alpha, \beta) = \operatorname{GIG}\!\left(z \mid a + T\beta^2,\; b + S,\; p - \tfrac{T}{2}\right)

where S = \sum_{i=1}^{T} (x_i - \alpha)^2. [note 1]
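
In code, the posterior update reduces to three one-line parameter updates. A minimal sketch with simulated data (the function name posterior_gig_params and all numerical values are illustrative):

    import numpy as np

    def posterior_gig_params(x, a, b, p, alpha, beta):
        """Posterior GIG parameters (a', b', p') for z given data x, under the mixture model above."""
        T = len(x)
        S = np.sum((x - alpha) ** 2)
        return a + T * beta**2, b + S, p - T / 2

    rng = np.random.default_rng(0)
    alpha, beta, z_true = 0.5, 1.0, 2.0
    x = rng.normal(alpha + beta * z_true, np.sqrt(z_true), size=50)   # simulated observations
    print(posterior_gig_params(x, a=1.0, b=1.0, p=1.0, alpha=alpha, beta=beta))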

Sichel distribution

The Sichel distribution results when the GIG is used as the mixing distribution for the Poisson parameter λ. [10] [11]
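
A Sichel variate can therefore be simulated in two stages: draw λ from the GIG mixing distribution, then draw a Poisson count with that rate. A sketch using scipy.stats.geninvgauss (with the same assumed parameter mapping as in the earlier sketches) and illustrative parameter values:

    import numpy as np
    from scipy.stats import geninvgauss, poisson

    # Two-stage sampling from a Sichel (Poisson-GIG) mixture.
    a, b, p = 2.0, 3.0, 0.5                       # illustrative GIG parameters
    rng = np.random.default_rng(0)
    lam = geninvgauss.rvs(p, np.sqrt(a * b), scale=np.sqrt(b / a), size=100_000, random_state=rng)
    n = poisson.rvs(lam, random_state=rng)
    print(np.bincount(n)[:10] / n.size)           # empirical Sichel probabilities for n = 0..9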

Notes

  1. Due to the conjugacy, these details can be derived without solving integrals, by noting that
    P(z \mid X, a, b, p, \alpha, \beta) \propto P(X \mid z, \alpha, \beta)\, P(z \mid a, b, p).
    Omitting all factors independent of z, the right-hand side can be simplified to give an un-normalized GIG distribution in z (written out below), from which the posterior parameters can be identified.
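
Carrying this out explicitly (a short derivation consistent with the densities as written above):

    P(z \mid X) \;\propto\; z^{p-1} e^{-\frac{1}{2}\left(a z + \frac{b}{z}\right)} \prod_{i=1}^{T} z^{-1/2} \exp\!\left(-\frac{(x_i - \alpha - \beta z)^2}{2 z}\right)
    \;\propto\; z^{\,p - \frac{T}{2} - 1} \exp\!\left(-\frac{1}{2}\left[(a + T\beta^2)\, z + \frac{b + S}{z}\right]\right), \qquad S = \sum_{i=1}^{T} (x_i - \alpha)^2,

which is an un-normalized GIG(a + Tβ², b + S, p − T/2) density in z; the cross terms e^{β(x_i − α)} do not depend on z and are absorbed into the proportionality constant.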

Related Research Articles

<span class="mw-page-title-main">Exponential distribution</span> Probability distribution

In probability theory and statistics, the exponential distribution or negative exponential distribution is the probability distribution of the distance between events in a Poisson point process, i.e., a process in which events occur continuously and independently at a constant average rate; the distance parameter could be any meaningful mono-dimensional measure of the process, such as time between production errors, or length along a roll of fabric in the weaving manufacturing process. It is a particular case of the gamma distribution. It is the continuous analogue of the geometric distribution, and it has the key property of being memoryless. In addition to being used for the analysis of Poisson point processes it is found in various other contexts.

<span class="mw-page-title-main">Stress–energy tensor</span> Tensor describing energy momentum density in spacetime

The stress–energy tensor, sometimes called the stress–energy–momentum tensor or the energy–momentum tensor, is a tensor physical quantity that describes the density and flux of energy and momentum in spacetime, generalizing the stress tensor of Newtonian physics. It is an attribute of matter, radiation, and non-gravitational force fields. This density and flux of energy and momentum are the sources of the gravitational field in the Einstein field equations of general relativity, just as mass density is the source of such a field in Newtonian gravity.

<span class="mw-page-title-main">Student's t-distribution</span> Probability distribution

In probability theory and statistics, Student's t distribution is a continuous probability distribution that generalizes the standard normal distribution. Like the latter, it is symmetric around zero and bell-shaped.

<span class="mw-page-title-main">Chi-squared distribution</span> Probability distribution and special case of gamma distribution

In probability theory and statistics, the chi-squared distribution with degrees of freedom is the distribution of a sum of the squares of independent standard normal random variables.

In the mathematical field of differential geometry, the Riemann curvature tensor or Riemann–Christoffel tensor is the most common way used to express the curvature of Riemannian manifolds. It assigns a tensor to each point of a Riemannian manifold. It is a local invariant of Riemannian metrics which measures the failure of the second covariant derivatives to commute. A Riemannian manifold has zero curvature if and only if it is flat, i.e. locally isometric to the Euclidean space. The curvature tensor can also be defined for any pseudo-Riemannian manifold, or indeed any manifold equipped with an affine connection.

<span class="mw-page-title-main">Beta distribution</span> Probability distribution

In probability theory and statistics, the beta distribution is a family of continuous probability distributions defined on the interval [0, 1] or in terms of two positive parameters, denoted by alpha (α) and beta (β), that appear as exponents of the variable and its complement to 1, respectively, and control the shape of the distribution.

<span class="mw-page-title-main">Gamma distribution</span> Probability distribution

In probability theory and statistics, the gamma distribution is a versatile two-parameter family of continuous probability distributions. The exponential distribution, Erlang distribution, and chi-squared distribution are special cases of the gamma distribution. There are two equivalent parameterizations in common use:

  1. With a shape parameter k and a scale parameter θ
  2. With a shape parameter and an inverse scale parameter , called a rate parameter.
<span class="mw-page-title-main">Stable distribution</span> Distribution of variables which satisfies a stability property under linear combinations

In probability theory, a distribution is said to be stable if a linear combination of two independent random variables with this distribution has the same distribution, up to location and scale parameters. A random variable is said to be stable if its distribution is stable. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it.

<span class="mw-page-title-main">Pearson distribution</span> Family of continuous probability distributions

The Pearson distribution is a family of continuous probability distributions. It was first published by Karl Pearson in 1895 and subsequently extended by him in 1901 and 1916 in a series of articles on biostatistics.

In general relativity, a geodesic generalizes the notion of a "straight line" to curved spacetime. Importantly, the world line of a particle free from all external, non-gravitational forces is a particular type of geodesic. In other words, a freely moving or falling particle always moves along a geodesic.

The generalised hyperbolic distribution (GH) is a continuous probability distribution defined as the normal variance-mean mixture where the mixing distribution is the generalized inverse Gaussian distribution (GIG). Its probability density function is given in terms of modified Bessel function of the second kind, denoted by . It was introduced by Ole Barndorff-Nielsen, who studied it in the context of physics of wind-blown sand.

<span class="mw-page-title-main">Maxwell's equations in curved spacetime</span> Electromagnetism in general relativity

In physics, Maxwell's equations in curved spacetime govern the dynamics of the electromagnetic field in curved spacetime or where one uses an arbitrary coordinate system. These equations can be viewed as a generalization of the vacuum Maxwell's equations which are normally formulated in the local coordinates of flat spacetime. But because general relativity dictates that the presence of electromagnetic fields induce curvature in spacetime, Maxwell's equations in flat spacetime should be viewed as a convenient approximation.

<span class="mw-page-title-main">Inverse Gaussian distribution</span> Family of continuous probability distributions

In probability theory, the inverse Gaussian distribution is a two-parameter family of continuous probability distributions with support on (0,∞).

A ratio distribution is a probability distribution constructed as the distribution of the ratio of random variables having two other known distributions. Given two random variables X and Y, the distribution of the random variable Z that is formed as the ratio Z = X/Y is a ratio distribution.

<span class="mw-page-title-main">Normal-inverse-gamma distribution</span>

In probability theory and statistics, the normal-inverse-gamma distribution is a four-parameter family of multivariate continuous probability distributions. It is the conjugate prior of a normal distribution with unknown mean and variance.

The generalized normal distribution (GND) or generalized Gaussian distribution (GGD) is either of two families of parametric continuous probability distributions on the real line. Both families add a shape parameter to the normal distribution. To distinguish the two families, they are referred to below as "symmetric" and "asymmetric"; however, this is not a standard nomenclature.

In statistics, the generalized Marcum Q-function of order is defined as

In statistics and probability theory, the nonparametric skew is a statistic occasionally used with random variables that take real values. It is a measure of the skewness of a random variable's distribution—that is, the distribution's tendency to "lean" to one side or the other of the mean. Its calculation does not require any knowledge of the form of the underlying distribution—hence the name nonparametric. It has some desirable properties: it is zero for any symmetric distribution; it is unaffected by a scale shift; and it reveals either left- or right-skewness equally well. In some statistical samples it has been shown to be less powerful than the usual measures of skewness in detecting departures of the population from normality.

In statistics, the matrix t-distribution is the generalization of the multivariate t-distribution from vectors to matrices.

<span class="mw-page-title-main">Dual graviton</span> Hypothetical particle found in supergravity

In theoretical physics, the dual graviton is a hypothetical elementary particle that is a dual of the graviton under electric-magnetic duality, as an S-duality, predicted by some formulations of eleven-dimensional supergravity.

References

  1. Seshadri, V. (1997). "Halphen's laws". In Kotz, S.; Read, C. B.; Banks, D. L. (eds.). Encyclopedia of Statistical Sciences, Update Volume 1. New York: Wiley. pp. 302–306.
  2. Perreault, L.; Bobée, B.; Rasmussen, P. F. (1999). "Halphen Distribution System. I: Mathematical and Statistical Properties". Journal of Hydrologic Engineering. 4 (3): 189. doi:10.1061/(ASCE)1084-0699(1999)4:3(189).
  3. Étienne Halphen was the grandson of the mathematician Georges Henri Halphen.
  4. Jørgensen, Bent (1982). Statistical Properties of the Generalized Inverse Gaussian Distribution. Lecture Notes in Statistics. Vol. 9. New York–Berlin: Springer-Verlag. ISBN 0-387-90665-7. MR 0648107.
  5. Barndorff-Nielsen, O.; Halgreen, Christian (1977). "Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete. 38: 309–311. doi:10.1007/BF00533162.
  6. Pal, Subhadip; Gaskins, Jeremy (23 May 2022). "Modified Pólya-Gamma data augmentation for Bayesian analysis of directional data". Journal of Statistical Computation and Simulation. 92 (16): 3430–3451. doi:10.1080/00949655.2022.2067853. ISSN 0094-9655. S2CID 249022546.
  7. Johnson, Norman L.; Kotz, Samuel; Balakrishnan, N. (1994), Continuous Univariate Distributions. Vol. 1, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics (2nd ed.), New York: John Wiley & Sons, pp. 284–285, ISBN 978-0-471-58495-7, MR 1299979.
  8. Karlis, Dimitris (2002). "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution". Statistics & Probability Letters. 57 (1): 43–52. doi:10.1016/S0167-7152(02)00040-8.
  9. Barndorff-Nielsen, O. E. (1997). "Normal Inverse Gaussian Distributions and stochastic volatility modelling". Scand. J. Statist. 24 (1): 1–13. doi:10.1111/1467-9469.00045.
  10. Sichel, Herbert S. (1975). "On a distribution law for word frequencies". Journal of the American Statistical Association. 70 (351a): 542–547. doi:10.1080/01621459.1975.10482469.
  11. Stein, Gillian Z.; Zucchini, Walter; Juritz, June M. (1987). "Parameter estimation for the Sichel distribution and its multivariate extension". Journal of the American Statistical Association. 82 (399): 938–944. doi:10.1080/01621459.1987.10478520.
