| Parameters | $\lambda$ (real), $\alpha$ (real), $\beta$ asymmetry parameter (real), $\delta$ scale parameter (real), $\mu$ location (real); $\gamma = \sqrt{\alpha^2 - \beta^2}$ |
|---|---|
| Support | $x \in (-\infty, +\infty)$ |
| PDF | $\dfrac{(\gamma/\delta)^\lambda}{\sqrt{2\pi}\,K_\lambda(\delta\gamma)}\; e^{\beta(x-\mu)}\; \dfrac{K_{\lambda-1/2}\!\left(\alpha\sqrt{\delta^2+(x-\mu)^2}\right)}{\left(\sqrt{\delta^2+(x-\mu)^2}/\alpha\right)^{1/2-\lambda}}$ |
| Mean | $\mu + \dfrac{\delta\beta\,K_{\lambda+1}(\delta\gamma)}{\gamma\,K_\lambda(\delta\gamma)}$ |
| Variance | $\dfrac{\delta\,K_{\lambda+1}(\delta\gamma)}{\gamma\,K_\lambda(\delta\gamma)} + \dfrac{\beta^2\delta^2}{\gamma^2}\left(\dfrac{K_{\lambda+2}(\delta\gamma)}{K_\lambda(\delta\gamma)} - \dfrac{K_{\lambda+1}^2(\delta\gamma)}{K_\lambda^2(\delta\gamma)}\right)$ |
| MGF | $e^{\mu z}\left(\dfrac{\gamma^2}{\alpha^2-(\beta+z)^2}\right)^{\lambda/2} \dfrac{K_\lambda\!\left(\delta\sqrt{\alpha^2-(\beta+z)^2}\right)}{K_\lambda(\delta\gamma)}$ |
The generalised hyperbolic distribution (GH) is a continuous probability distribution defined as the normal variance-mean mixture whose mixing distribution is the generalized inverse Gaussian distribution (GIG). Its probability density function (see the box) is given in terms of the modified Bessel function of the second kind, denoted by $K_\lambda$. [1] It was introduced by Ole Barndorff-Nielsen, who studied it in the context of the physics of wind-blown sand. [2]
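The mixture construction gives a direct sampling recipe: draw the mixing variable $W$ from the GIG law, then set $X = \mu + \beta W + \sqrt{W}\,Z$ with $Z$ standard normal. The sketch below (an illustration, not from the cited sources) uses the normal-inverse Gaussian special case $\lambda = -1/2$, for which the GIG mixing law reduces to an ordinary inverse Gaussian distribution that can be sampled with the Michael-Schucany-Haas transform; the function and parameter names are my own.

```python
import math
import random

def sample_inverse_gaussian(mean, shape, rng):
    """Draw from IG(mean, shape) via the Michael-Schucany-Haas method."""
    nu = rng.gauss(0.0, 1.0)
    y = nu * nu
    x = (mean + (mean * mean * y) / (2.0 * shape)
         - (mean / (2.0 * shape))
           * math.sqrt(4.0 * mean * shape * y + mean * mean * y * y))
    if rng.random() <= mean / (mean + x):
        return x
    return mean * mean / x

def sample_nig(alpha, beta, delta, mu, rng):
    """Sample the NIG subclass (GH with lambda = -1/2) as a normal
    variance-mean mixture: X = mu + beta*W + sqrt(W)*Z."""
    g = math.sqrt(alpha * alpha - beta * beta)
    # For lambda = -1/2 the GIG mixing law is IG(delta/gamma, delta^2).
    w = sample_inverse_gaussian(delta / g, delta * delta, rng)
    return mu + beta * w + math.sqrt(w) * rng.gauss(0.0, 1.0)

rng = random.Random(42)
samples = [sample_nig(alpha=2.0, beta=0.5, delta=1.0, mu=0.0, rng=rng)
           for _ in range(50_000)]
# Theoretical NIG mean is mu + delta*beta/gamma; the sample mean
# should land close to it for this many draws.
gamma = math.sqrt(2.0 ** 2 - 0.5 ** 2)
sample_mean = sum(samples) / len(samples)
```

The same two-step scheme works for any GH member once a sampler for the corresponding GIG mixing distribution is available.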
This class is closed under affine transformations. [1]
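Concretely, the affine closure can be written as follows (stated in the parameterization of the box above):

```latex
X \sim \mathrm{GH}(\lambda, \alpha, \beta, \delta, \mu), \quad Y = aX + b,\ a \neq 0
\;\Longrightarrow\;
Y \sim \mathrm{GH}\!\left(\lambda,\ \tfrac{\alpha}{|a|},\ \tfrac{\beta}{a},\ \delta\,|a|,\ a\mu + b\right)
```

The index $\lambda$ is invariant; only the shape, scale, and location parameters rescale.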
Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible; because the GH distribution arises as a normal variance-mean mixture with GIG mixing distribution, they showed that the GH distribution is infinitely divisible as well. [3]
An important consequence of infinite divisibility is the connection to Lévy processes: at any fixed time, a Lévy process has an infinitely divisible distribution. Many families of well-known infinitely divisible distributions are convolution-closed, meaning that if the distribution of a Lévy process at one point in time belongs to the family, then its distribution at every point in time belongs to the same family. For example, a Poisson process is Poisson-distributed at all times, and a Brownian motion is normally distributed at all times. However, a Lévy process that is generalized hyperbolic at one point in time may fail to be generalized hyperbolic at another. In fact, the generalized Laplace distributions and the normal-inverse Gaussian distributions are the only subclasses of the generalized hyperbolic distributions that are closed under convolution. [4]
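For the NIG subclass, the convolution closure takes an explicit form: two independent NIG variables with the same tail parameters add within the family, with the scale and location parameters summing.

```latex
\mathrm{NIG}(\alpha, \beta, \delta_1, \mu_1) * \mathrm{NIG}(\alpha, \beta, \delta_2, \mu_2)
= \mathrm{NIG}(\alpha, \beta, \delta_1 + \delta_2, \mu_1 + \mu_2)
```

Note that $\alpha$ and $\beta$ must agree between the two summands; for generic GH parameters no such rule holds, which is the failure of closure described above.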
As the name suggests, it is of a very general form, being the superclass of, among others, the Student's t-distribution, the Laplace distribution, the hyperbolic distribution, the normal-inverse Gaussian distribution, and the variance-gamma distribution.
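These special cases correspond to fixing or taking limits of the parameters; the commonly cited identifications, sketched in the parameterization of the box above, are:

```latex
\begin{aligned}
\text{hyperbolic:} \quad & \lambda = 1 \\
\text{normal-inverse Gaussian:} \quad & \lambda = -\tfrac{1}{2} \\
\text{variance-gamma:} \quad & \lambda > 0,\ \delta \to 0 \\
\text{Student's } t \text{ (}\nu\text{ d.o.f.):} \quad & \lambda = -\tfrac{\nu}{2},\ \alpha = \beta = 0,\ \delta = \sqrt{\nu}
\end{aligned}
```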
It is mainly applied in settings that require sufficient probability mass far from the center, which it can model thanks to its semi-heavy tails, a property the normal distribution does not possess. For this reason the generalised hyperbolic distribution is often used in economics, particularly in modelling financial markets and in risk management.