Generalized normal distribution

The generalized normal distribution (GND) or generalized Gaussian distribution (GGD) is either of two families of parametric continuous probability distributions on the real line. Both families add a shape parameter to the normal distribution. To distinguish the two families, they are referred to below as "symmetric" and "asymmetric"; however, this is not a standard nomenclature.

Symmetric version

Symmetric Generalized Normal

Parameters: location μ (real); scale α (positive, real); shape β (positive, real)
Support: x ∈ (−∞, +∞)
PDF: f(x) = β / (2α Γ(1/β)) exp(−(|x − μ| / α)^β), where Γ denotes the gamma function
CDF: F(x) = 1/2 + sgn(x − μ) γ(1/β, (|x − μ| / α)^β) / (2 Γ(1/β)), where β is a shape parameter, α is a scale parameter and γ is the unnormalized incomplete lower gamma function
Quantile: Q(p) = μ + sgn(p − 1/2) α [G⁻¹(2|p − 1/2|; 1/β)]^{1/β}, where G⁻¹(·; 1/β) is the quantile function of the gamma distribution with shape 1/β and unit scale [1]
Mean: μ
Median: μ
Mode: μ
Variance: α² Γ(3/β) / Γ(1/β)
Skewness: 0
Excess kurtosis: Γ(5/β) Γ(1/β) / Γ(3/β)² − 3
Entropy: 1/β − ln(β / (2α Γ(1/β))) [2]

The symmetric generalized normal distribution, also known as the exponential power distribution or the generalized error distribution, is a parametric family of symmetric distributions. It includes all normal and Laplace distributions, and as limiting cases it includes all continuous uniform distributions on bounded intervals of the real line.

This family includes the normal distribution when β = 2 (with mean μ and variance α²/2) and it includes the Laplace distribution when β = 1. As β → ∞, the density converges pointwise to a uniform density on (μ − α, μ + α).

This family allows for tails that are either heavier than normal (when β < 2) or lighter than normal (when β > 2). It is a useful way to parametrize a continuum of symmetric, platykurtic densities spanning from the normal (β = 2) to the uniform density (β → ∞), and a continuum of symmetric, leptokurtic densities spanning from the Laplace (β = 1) to the normal density (β = 2). The shape parameter β also controls the peakedness in addition to the tails.
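The density is straightforward to evaluate directly. The following sketch (plain Python; the name gennorm_pdf is chosen here for illustration) checks that β = 2 reduces to a normal density with variance α²/2 and β = 1 to a Laplace density:

```python
import math

def gennorm_pdf(x, beta, mu=0.0, alpha=1.0):
    """Density of the symmetric generalized normal distribution:
    f(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)**beta)."""
    coef = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coef * math.exp(-((abs(x - mu) / alpha) ** beta))

x = 0.7

# beta = 2: normal with mean mu and variance alpha**2 / 2
normal = math.exp(-x * x) / math.sqrt(math.pi)   # N(0, 1/2) density at x
assert math.isclose(gennorm_pdf(x, beta=2.0), normal)

# beta = 1: Laplace with location mu and scale alpha
laplace = 0.5 * math.exp(-abs(x))
assert math.isclose(gennorm_pdf(x, beta=1.0), laplace)
```

SciPy exposes this family, with the same shape convention, as scipy.stats.gennorm.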

Parameter estimation

Parameter estimation via maximum likelihood and the method of moments has been studied. [3] The estimates do not have a closed form and must be obtained numerically. Estimators that do not require numerical calculation have also been proposed. [4]

The generalized normal log-likelihood function has infinitely many continuous derivatives (i.e. it belongs to the class C^∞ of smooth functions) only if β is a positive, even integer. Otherwise, the function has ⌊β⌋ continuous derivatives. As a result, the standard results for consistency and asymptotic normality of maximum likelihood estimates of β only apply when β ≥ 2.

Maximum likelihood estimator

It is possible to fit the generalized normal distribution adopting an approximate maximum likelihood method. [5] [6] With μ initially set to the sample first moment m₁, the shape β is estimated by using a Newton–Raphson iterative procedure, starting from an initial guess of β = β₀,

β₀ = m₁ / √m₂,

where

m₁ = (1/N) Σᵢ |xᵢ|

is the first statistical moment of the absolute values and

m₂ = (1/N) Σᵢ xᵢ²

is the second statistical moment. The iteration is

β_{i+1} = β_i − g(β_i) / g′(β_i),

where g(β) is the score function for β (the derivative of the log-likelihood with respect to β) and g′(β) is its derivative; both expressions involve the digamma function ψ and the trigamma function ψ′.

Given a value for β, it is possible to estimate μ by finding the minimum of

Σᵢ |xᵢ − μ|^β

over μ. Finally, α is evaluated as

α = ( (β/N) Σᵢ |xᵢ − μ|^β )^{1/β}.

For β < 2, the sample median is a more appropriate estimator of μ. Once μ is estimated, β and α can be estimated as described above. [7]
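The Newton–Raphson scheme above requires the full score function. A simpler alternative, in the spirit of the moment-based estimators of [3] [4], matches the ratio of absolute moments r(β) = (E|X|)² / E[X²] = Γ(2/β)² / (Γ(1/β) Γ(3/β)), which increases monotonically in β. A minimal sketch (function names are illustrative, not taken from the cited papers):

```python
import math

def ggd_ratio(beta):
    """r(beta) = (E|X|)^2 / E[X^2] for a zero-mean generalized Gaussian."""
    return math.gamma(2.0 / beta) ** 2 / (math.gamma(1.0 / beta) * math.gamma(3.0 / beta))

def fit_ggd_moments(xs, lo=0.05, hi=20.0, iters=80):
    """Estimate (mu, alpha, beta) by matching the absolute-moment ratio.

    mu is taken as the sample mean; beta solves r(beta) = m1^2 / m2 by
    bisection (r is increasing in beta); alpha then follows from E[X^2].
    """
    n = len(xs)
    mu = sum(xs) / n
    m1 = sum(abs(x - mu) for x in xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    target = m1 * m1 / m2
    for _ in range(iters):                     # bisection on beta
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    alpha = math.sqrt(m2 * math.gamma(1.0 / beta) / math.gamma(3.0 / beta))
    return mu, alpha, beta
```

For a Gaussian sample the ratio approaches 2/π and the fit returns β ≈ 2; for a Laplace sample it approaches 1/2 and the fit returns β ≈ 1.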

Applications

The symmetric generalized normal distribution has been used in modeling when the concentration of values around the mean and the tail behavior are of particular interest. [8] [9] Other families of distributions can be used if the focus is on other deviations from normality. If the symmetry of the distribution is the main interest, the skew normal family or the asymmetric version of the generalized normal family discussed below can be used. If the tail behavior is the main interest, the Student's t family can be used, which approximates the normal distribution as the degrees of freedom grow to infinity. The t distribution, unlike this generalized normal distribution, obtains heavier-than-normal tails without acquiring a cusp at the origin. The symmetric generalized normal distribution finds uses in plasma physics under the name of the Langdon distribution, resulting from inverse bremsstrahlung. [10]

In a linear regression problem modeled as y = Xb + ε, with errors following a generalized normal distribution of shape β = p, the MLE of b will be the minimizer of ‖y − Xb‖ₚᵖ, where the p-norm is used.
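As a toy illustration of the link between the shape parameter and p-norm estimation (names chosen here for illustration), the one-parameter objective Σᵢ |xᵢ − m|^p is minimized by the sample mean when p = 2 and by a sample median when p = 1:

```python
def pnorm_location(xs, p, iters=200):
    """Minimize sum(|x - m|**p) over m by ternary search.

    The objective is convex for p >= 1, so ternary search on
    [min(xs), max(xs)] converges to a global minimizer.
    """
    loss = lambda m: sum(abs(x - m) ** p for x in xs)
    lo, hi = min(xs), max(xs)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if loss(m1) < loss(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

data = [0.0, 1.0, 2.0, 10.0]
print(pnorm_location(data, p=2))   # the mean, 3.25
print(pnorm_location(data, p=1))   # a point in the median interval [1, 2]
```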

Properties

Moments

Let X follow a zero-mean generalized Gaussian distribution with shape β and scale parameter α. The moments E|X|^k exist and are finite for any k greater than −1. For any non-negative integer k, the plain central moments are [2]

E[X^k] = 0 for odd k, and E[X^k] = α^k Γ((k + 1)/β) / Γ(1/β) for even k.
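The moment formula can be sanity-checked numerically by integrating x^k against the density on a grid (helper names are chosen here for illustration; a plain trapezoidal rule suffices for light-tailed shapes):

```python
import math

def gnd_pdf(x, beta, alpha=1.0):
    """Zero-mean symmetric generalized normal density."""
    return beta / (2.0 * alpha * math.gamma(1.0 / beta)) * math.exp(-((abs(x) / alpha) ** beta))

def numeric_moment(k, beta, alpha=1.0, lim=30.0, n=100_000):
    """Trapezoidal approximation of E[X^k] on [-lim, lim]."""
    h = 2.0 * lim / n
    total = 0.0
    for i in range(n + 1):
        x = -lim + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * (x ** k) * gnd_pdf(x, beta, alpha)
    return total * h

def exact_moment(k, beta, alpha=1.0):
    """Closed-form central moment: zero for odd k."""
    return 0.0 if k % 2 else alpha ** k * math.gamma((k + 1.0) / beta) / math.gamma(1.0 / beta)
```

For example, k = 2 with β = 2 recovers the variance α²/2 of the corresponding normal density, and k = 4 with β = 1 gives 24α⁴, the fourth moment of the Laplace distribution.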

Connection to Stable Count Distribution

From the viewpoint of the stable count distribution, the shape parameter β can be regarded as Lévy's stability parameter. This distribution can be decomposed into an integral of a kernel density, where the kernel is either a Laplace distribution or a Gaussian distribution, mixed over a scale variable whose mixing density is the stable count distribution (for the Laplace kernel) or the stable vol distribution (for the Gaussian kernel).

Connection to Positive-Definite Functions

The probability density function of the symmetric generalized normal distribution is a positive-definite function for 0 < β ≤ 2. [11] [12]

Infinite divisibility

The symmetric generalized Gaussian distribution is an infinitely divisible distribution if and only if 0 < β ≤ 1 or β = 2. [13]

Generalizations

The multivariate generalized normal distribution, i.e. the product of exponential power distributions with the same β and α parameters, is the only probability density that can be written in the form p(x) = g(‖x‖_β) and has independent marginals. [14] The result for the special case of the multivariate normal distribution is originally attributed to Maxwell. [15]

Asymmetric version

Asymmetric Generalized Normal

Parameters: location ξ (real); scale α (positive, real); shape κ (real)
Support: x ∈ (−∞, ξ + α/κ) for κ > 0; x ∈ (−∞, +∞) for κ = 0; x ∈ (ξ + α/κ, +∞) for κ < 0
PDF: φ(y) / (α − κ(x − ξ)), where y = −(1/κ) ln(1 − κ(x − ξ)/α) if κ ≠ 0 and y = (x − ξ)/α if κ = 0; φ is the standard normal pdf
CDF: Φ(y), where Φ is the standard normal CDF
Mean: ξ − (α/κ)(e^{κ²/2} − 1)
Median: ξ
Variance: (α²/κ²) e^{κ²} (e^{κ²} − 1)
Skewness: −sgn(κ) (e^{κ²} + 2) √(e^{κ²} − 1)
Excess kurtosis: e^{4κ²} + 2e^{3κ²} + 3e^{2κ²} − 6

The asymmetric generalized normal distribution is a family of continuous probability distributions in which the shape parameter can be used to introduce asymmetry or skewness. [16] [17] When the shape parameter is zero, the normal distribution results. Positive values of the shape parameter yield left-skewed distributions bounded to the right, and negative values of the shape parameter yield right-skewed distributions bounded to the left. Only when the shape parameter is zero is the density function for this distribution positive over the whole real line: in this case the distribution is a normal distribution, otherwise the distributions are shifted and possibly reversed log-normal distributions.
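The asymmetric density can be computed directly from the standard normal pdf φ applied to the log-transformed argument y. The following sketch (the names agn_pdf and agn_cdf are chosen here for illustration) also handles the bounded support:

```python
import math

def agn_pdf(x, xi=0.0, alpha=1.0, kappa=0.0):
    """Density of the asymmetric generalized normal distribution."""
    if kappa == 0.0:
        y = (x - xi) / alpha
    else:
        t = 1.0 - kappa * (x - xi) / alpha
        if t <= 0.0:
            return 0.0                      # outside the support
        y = -math.log(t) / kappa
    phi = math.exp(-0.5 * y * y) / math.sqrt(2.0 * math.pi)
    return phi / (alpha - kappa * (x - xi))

def agn_cdf(x, xi=0.0, alpha=1.0, kappa=0.0):
    """Distribution function Phi(y) of the asymmetric generalized normal."""
    if kappa == 0.0:
        y = (x - xi) / alpha
    else:
        t = 1.0 - kappa * (x - xi) / alpha
        if t <= 0.0:
            return 1.0 if kappa > 0 else 0.0  # beyond the bounded endpoint
        y = -math.log(t) / kappa
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))
```

Setting κ = 0 recovers the normal density, and the CDF equals 1/2 at x = ξ for every κ, reflecting that ξ is the median.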

Parameter estimation

Parameters can be estimated via maximum likelihood estimation or the method of moments. The parameter estimates do not have a closed form, so numerical calculations must be used to compute the estimates. Since the sample space (the set of real numbers where the density is non-zero) depends on the true value of the parameter, some standard results about the performance of parameter estimates will not automatically apply when working with this family.

Applications

The asymmetric generalized normal distribution can be used to model values that may be normally distributed, or that may be either right-skewed or left-skewed relative to the normal distribution. The skew normal distribution is another distribution that is useful for modeling deviations from normality due to skew. Other distributions used to model skewed data include the gamma, lognormal, and Weibull distributions, but these do not include the normal distributions as special cases.

Kullback-Leibler divergence between two PDFs

Kullback–Leibler divergence (KLD) is a method for computing the divergence, or similarity, between two probability density functions. [18]

Let p₁ and p₂ be two generalized Gaussian distributions with parameters (α₁, β₁) and (α₂, β₂), subject to the constraint μ₁ = μ₂ = 0. [19] Then this divergence is given by:

KLD(p₁ ‖ p₂) = ln( (β₁ α₂ Γ(1/β₂)) / (β₂ α₁ Γ(1/β₁)) ) + (α₁/α₂)^{β₂} Γ((β₂ + 1)/β₁) / Γ(1/β₁) − 1/β₁
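A closed form for this divergence between two zero-mean generalized Gaussians is given by Do and Vetterli [6]; the sketch below transcribes that form (treat it as an assumption-laden transcription, not authoritative) so it can be checked numerically:

```python
import math

def ggd_kld(a1, b1, a2, b2):
    """Closed-form KL divergence between two zero-mean generalized
    Gaussians with scales a1, a2 and shapes b1, b2 (Do-Vetterli form)."""
    g = math.gamma
    return (math.log((b1 * a2 * g(1.0 / b2)) / (b2 * a1 * g(1.0 / b1)))
            + (a1 / a2) ** b2 * g((b2 + 1.0) / b1) / g(1.0 / b1)
            - 1.0 / b1)
```

As expected of a divergence, identical parameters give zero (using Γ(1 + 1/β) = (1/β)Γ(1/β)), and distinct parameter pairs give positive values.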

The two generalized normal families described here, like the skew normal family, are parametric families that extend the normal distribution by adding a shape parameter. Due to the central role of the normal distribution in probability and statistics, many distributions can be characterized in terms of their relationship to the normal distribution. For example, the log-normal, folded normal, and inverse normal distributions are defined as transformations of a normally-distributed value, but unlike the generalized normal and skew-normal families, these do not include the normal distributions as special cases.

In fact, all distributions with finite variance are, in the limit, closely related to the normal distribution. The Student's t-distribution, the Irwin–Hall distribution and the Bates distribution also extend the normal distribution, and include the normal distribution in the limit. So there is no strong reason to prefer the "generalized" normal distribution of type 1 over, e.g., a combination of Student's t and a normalized extended Irwin–Hall: such a combination would include, for example, the triangular distribution (which cannot be modeled by the generalized Gaussian of type 1).

A symmetric distribution which can model both tail behavior (long and short) and center behavior (flat, triangular or Gaussian) completely independently could be derived, e.g., by using X = IH/chi.

The Tukey g- and h-distribution also allows for a deviation from normality, both through skewness and fat tails (Yan, Yuan; Genton, Marc G. "The Tukey g-and-h Distribution". Significance, Volume 16, Issue 3, June 2019, Pages 12–13, doi:10.1111/j.1740-9713.2019.01273.x).

See also

- Normal distribution
- Beta distribution
- Gumbel distribution
- Stable distribution
- Pearson distribution
- Generalized inverse Gaussian distribution
- Folded normal distribution
- Normal-inverse Gaussian distribution
- Ratio distribution
- Normal-gamma distribution
- Log-logistic distribution
- Shifted log-logistic distribution
- Generalized logistic distribution
- Skew normal distribution
- Normal-inverse-gamma distribution
- Generalized K-distribution
- Logit-normal distribution
- Geometric stable distribution
- Nonparametric skew

References

  1. Griffin, Maryclare. "Working with the Exponential Power Distribution Using gnorm". GitHub, gnorm package. Retrieved 26 June 2020.
  2. Nadarajah, Saralees (September 2005). "A generalized normal distribution". Journal of Applied Statistics. 32 (7): 685–694. Bibcode:2005JApSt..32..685N. doi:10.1080/02664760500079464. S2CID 121914682.
  3. Varanasi, M.K.; Aazhang, B. (October 1989). "Parametric generalized Gaussian density estimation". Journal of the Acoustical Society of America. 86 (4): 1404–1415. Bibcode:1989ASAJ...86.1404V. doi:10.1121/1.398700.
  4. Domínguez-Molina, J. Armando; González-Farías, Graciela; Rodríguez-Dagnino, Ramón M. "A practical procedure to estimate the shape parameter in the generalized Gaussian distribution" (PDF). Archived from the original (PDF) on 2007-09-28. Retrieved 2009-03-03.
  5. Varanasi, M.K.; Aazhang, B. (1989). "Parametric generalized Gaussian density estimation". J. Acoust. Soc. Am. 86 (4): 1404–1415. Bibcode:1989ASAJ...86.1404V. doi:10.1121/1.398700.
  6. Do, M.N.; Vetterli, M. (February 2002). "Wavelet-based Texture Retrieval Using Generalised Gaussian Density and Kullback-Leibler Distance". IEEE Transactions on Image Processing. 11 (2): 146–158. Bibcode:2002ITIP...11..146D. doi:10.1109/83.982822. PMID 18244620.
  7. Varanasi, Mahesh K.; Aazhang, Behnaam (1989-10-01). "Parametric generalized Gaussian density estimation". The Journal of the Acoustical Society of America. 86 (4): 1404–1415. Bibcode:1989ASAJ...86.1404V. doi:10.1121/1.398700. ISSN 0001-4966.
  8. Liang, Faming; Liu, Chuanhai; Wang, Naisyin (April 2007). "A robust sequential Bayesian method for identification of differentially expressed genes". Statistica Sinica. 17 (2): 571–597. Archived from the original on 2007-10-09. Retrieved 2009-03-03.
  9. Box, George E. P.; Tiao, George C. (1992). Bayesian Inference in Statistical Analysis. New York: Wiley. ISBN 978-0-471-57428-6.
  10. Milder, Avram L. (2021). Electron velocity distribution functions and Thomson scattering (PhD thesis). University of Rochester. hdl:1802/36536.
  11. Dytso, Alex; Bustin, Ronit; Poor, H. Vincent; Shamai, Shlomo (2018). "Analytical properties of generalized Gaussian distributions". Journal of Statistical Distributions and Applications. 5 (1): 6. doi:10.1186/s40488-018-0088-5.
  12. Bochner, Salomon (1937). "Stable laws of probability and completely monotone functions". Duke Mathematical Journal. 3 (4): 726–728. doi:10.1215/s0012-7094-37-00360-0.
  13. Dytso, Alex; Bustin, Ronit; Poor, H. Vincent; Shamai, Shlomo (2018). "Analytical properties of generalized Gaussian distributions". Journal of Statistical Distributions and Applications. 5 (1): 6. doi:10.1186/s40488-018-0088-5.
  14. Sinz, Fabian; Gerwinn, Sebastian; Bethge, Matthias (May 2009). "Characterization of the p-Generalized Normal Distribution". Journal of Multivariate Analysis. 100 (5): 817–820. doi:10.1016/j.jmva.2008.07.006.
  15. Kac, M. (1939). "On a characterization of the normal distribution". American Journal of Mathematics. 61 (3): 726–728. doi:10.2307/2371328. JSTOR 2371328.
  16. Hosking, J.R.M.; Wallis, J.R. (1997). Regional Frequency Analysis: An Approach Based on L-Moments. Cambridge University Press. ISBN 0-521-43045-3. Section A.8.
  17. Documentation for the lmomco R package.
  18. Kullback, S.; Leibler, R.A. (1951). "On information and sufficiency". The Annals of Mathematical Statistics. 22 (1): 79–86. doi:10.1214/aoms/1177729694.
  19. Quintero-Rincón, A.; Pereyra, M.; D'Giano, C.; Batatia, H.; Risk, M. (2017). "A visual EEG epilepsy detection method based on a wavelet statistical representation and the Kullback-Leibler divergence". IFMBE Proceedings. 60: 13–16. doi:10.1007/978-981-10-4086-3_4. hdl:11336/77054.