Generalized integer gamma distribution

In probability and statistics, the generalized integer gamma distribution (GIG) is the distribution of the sum of independent gamma distributed random variables, all with integer shape parameters and different rate parameters. This is a special case of the generalized chi-squared distribution. A related concept is the generalized near-integer gamma distribution (GNIG).

Definition

The random variable $X$ has a gamma distribution with shape parameter $r$ and rate parameter $\lambda$ if its probability density function is

$$f_X(x) = \frac{\lambda^r}{\Gamma(r)}\, e^{-\lambda x}\, x^{r-1}, \qquad x > 0;\ \lambda, r > 0,$$

and this fact is denoted by

$$X \sim \Gamma(r, \lambda).$$

Let $X_j \sim \Gamma(r_j, \lambda_j)$, $j = 1, \ldots, p$, be $p$ independent random variables, where all the $r_j$ are positive integers and all the $\lambda_j$ are different. In other words, each variable has an Erlang distribution, and the rate parameters differ across variables. Requiring the rate parameters to be distinct involves no loss of generality: any case where some of the $\lambda_j$ are equal is handled by first adding the corresponding variables, since that sum has a gamma distribution with the same rate parameter and a shape parameter equal to the sum of the shape parameters of the original distributions.

Then the random variable $Y$ defined by

$$Y = \sum_{j=1}^{p} X_j$$

has a GIG (generalized integer gamma) distribution of depth $p$, with shape parameters $r_1, \ldots, r_p$ and rate parameters $\lambda_1, \ldots, \lambda_p$. This fact is denoted by

$$Y \sim \mathrm{GIG}(r_1, \ldots, r_p;\; \lambda_1, \ldots, \lambda_p;\; p).$$

It is also a special case of the generalized chi-squared distribution.
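As a quick illustration of the definition, the sketch below is a minimal Monte Carlo check (the shape and rate values are arbitrary illustrative choices, not taken from the cited references): it draws a depth-3 GIG variable by summing independent integer-shape gamma variables and compares the sample mean with the sum of the Erlang means $r_j/\lambda_j$.

```python
# Minimal Monte Carlo sketch of the definition above; the shape/rate values
# are arbitrary illustrative choices, not taken from the cited references.
import numpy as np

rng = np.random.default_rng(0)

r = [2, 1, 3]          # integer shape parameters r_1, ..., r_p
lam = [1.0, 2.5, 4.0]  # distinct rate parameters lambda_1, ..., lambda_p

# numpy's gamma sampler is parameterized by scale, i.e. scale = 1 / rate.
samples = sum(rng.gamma(shape=rj, scale=1.0 / lj, size=100_000)
              for rj, lj in zip(r, lam))

# The mean of Y is the sum of the component Erlang means r_j / lambda_j.
print(samples.mean(), sum(rj / lj for rj, lj in zip(r, lam)))
```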

Properties

The probability density function and the cumulative distribution function of $Y$ are respectively given by [1] [2] [3]

$$f_Y(y) = K \sum_{j=1}^{p} P_j(y)\, e^{-\lambda_j y}, \qquad y > 0,$$

and

$$F_Y(y) = 1 - K \sum_{j=1}^{p} P_j^{*}(y)\, e^{-\lambda_j y}, \qquad y > 0,$$

where

$$K = \prod_{j=1}^{p} \lambda_j^{r_j}$$

and

$$P_j(y) = \sum_{k=1}^{r_j} c_{j,k}\, y^{k-1}, \qquad P_j^{*}(y) = \sum_{k=1}^{r_j} c_{j,k}\, (k-1)! \sum_{i=0}^{k-1} \frac{y^{i}}{i!\, \lambda_j^{\,k-i}},$$

with

$$c_{j,r_j} = \frac{1}{(r_j-1)!} \prod_{\substack{i=1 \\ i \neq j}}^{p} (\lambda_i - \lambda_j)^{-r_i}, \qquad j = 1, \ldots, p, \tag{1}$$

and

$$c_{j,r_j-k} = \frac{1}{k} \sum_{i=1}^{k} \frac{(r_j-k+i-1)!}{(r_j-k-1)!}\, R(i,j,p)\, c_{j,\,r_j-(k-i)}, \qquad k = 1, \ldots, r_j-1;\ j = 1, \ldots, p, \tag{2}$$

where

$$R(i,j,p) = \sum_{\substack{k=1 \\ k \neq j}}^{p} r_k\, (\lambda_j - \lambda_k)^{-i}, \qquad i = 1, \ldots, r_j-1. \tag{3}$$
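The recursion (1)-(3) is straightforward to implement. The sketch below is an assumed implementation written directly from the equations above (not code from the cited references); it computes the coefficients $c_{j,k}$, evaluates the density and distribution function, and, as a sanity check, reproduces the ordinary Erlang/gamma law when the depth is one.

```python
# Assumed implementation of the GIG pdf/cdf, written from equations (1)-(3)
# above; parameter values in the check are illustrative only.
import math
import numpy as np

def gig_coefficients(r, lam):
    """Coefficients c[j][k-1] = c_{j,k} from equations (1)-(3)."""
    p = len(r)
    c = [np.zeros(rj) for rj in r]
    for j in range(p):
        # Equation (1): leading coefficient c_{j, r_j}.
        prod = np.prod([(lam[i] - lam[j]) ** (-r[i]) for i in range(p) if i != j])
        c[j][r[j] - 1] = prod / math.factorial(r[j] - 1)
        # Equation (3): R(i, j, p).
        def R(i):
            return sum(r[k] * (lam[j] - lam[k]) ** (-i) for k in range(p) if k != j)
        # Equation (2): downward recursion for c_{j, r_j - k}, k = 1, ..., r_j - 1.
        for k in range(1, r[j]):
            s = sum(math.factorial(r[j] - k + i - 1) / math.factorial(r[j] - k - 1)
                    * R(i) * c[j][r[j] - (k - i) - 1] for i in range(1, k + 1))
            c[j][r[j] - k - 1] = s / k
    return c

def gig_pdf_cdf(y, r, lam):
    """GIG density and distribution function at a point y > 0."""
    p, c = len(r), gig_coefficients(r, lam)
    K = np.prod([lam[j] ** r[j] for j in range(p)])
    pdf = cdf_tail = 0.0
    for j in range(p):
        P = sum(c[j][k - 1] * y ** (k - 1) for k in range(1, r[j] + 1))
        Pstar = sum(c[j][k - 1] * math.factorial(k - 1)
                    * sum(y ** i / (math.factorial(i) * lam[j] ** (k - i))
                          for i in range(k))
                    for k in range(1, r[j] + 1))
        pdf += P * math.exp(-lam[j] * y)
        cdf_tail += Pstar * math.exp(-lam[j] * y)
    return K * pdf, 1.0 - K * cdf_tail

# Illustrative check: with a single component the GIG reduces to the Erlang law.
from scipy.stats import gamma
print(gig_pdf_cdf(1.3, [3], [2.0]))
print(gamma.pdf(1.3, a=3, scale=0.5), gamma.cdf(1.3, a=3, scale=0.5))
```

Note that the recursion requires all the $\lambda_j$ to be distinct, since the differences $\lambda_i - \lambda_j$ enter through negative powers.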

Alternative expressions are available in the literature on the generalized chi-squared distribution, a field for which computer algorithms have been available for some years.

Generalization

The GNIG (generalized near-integer gamma) distribution of depth $p+1$ is the distribution of the random variable [4]

$$Z = Y_1 + Y_2,$$

where $Y_1 \sim \mathrm{GIG}(r_1, \ldots, r_p;\; \lambda_1, \ldots, \lambda_p;\; p)$ and $Y_2 \sim \Gamma(r, \lambda)$ are two independent random variables, where $r$ is a positive non-integer real and where $\lambda \neq \lambda_j$ for all $j = 1, \ldots, p$.

Properties

The probability density function of $Z$ is given by

$$f_Z(z) = K \lambda^{r} \sum_{j=1}^{p} e^{-\lambda_j z} \sum_{k=1}^{r_j} c_{j,k}\, \frac{\Gamma(k)}{\Gamma(k+r)}\, z^{\,k+r-1}\, {}_1F_1\!\bigl(r,\, k+r,\, -(\lambda-\lambda_j)z\bigr), \qquad z > 0,$$

and the cumulative distribution function is given by

$$F_Z(z) = \frac{\lambda^{r} z^{r}}{\Gamma(r+1)}\, {}_1F_1\!\bigl(r,\, r+1,\, -\lambda z\bigr) \;-\; K \lambda^{r} \sum_{j=1}^{p} e^{-\lambda_j z} \sum_{k=1}^{r_j} c^{*}_{j,k} \sum_{i=0}^{k-1} \frac{z^{\,r+i}\, \lambda_j^{\,i}}{\Gamma(r+1+i)}\, {}_1F_1\!\bigl(r,\, r+1+i,\, -(\lambda-\lambda_j)z\bigr), \qquad z > 0,$$

where

$$c^{*}_{j,k} = \frac{c_{j,k}}{\lambda_j^{k}}\, \Gamma(k),$$

with the $c_{j,k}$ given by (1)-(3) above. In the above expressions ${}_1F_1(a, b; z)$ is the Kummer confluent hypergeometric function. This function usually has very good convergence properties and is nowadays easily handled by a number of software packages.
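A similar sketch (again an assumed implementation, not code from the cited references) evaluates the GNIG density using SciPy's hyp1f1 for the Kummer function; it reuses the gig_coefficients function from the previous example, and all parameter values are illustrative. The Monte Carlo comparison at the end is a crude density estimate near a single point.

```python
# Assumed implementation of the GNIG density above; reuses gig_coefficients
# from the previous sketch. Parameter values are illustrative only.
import math
import numpy as np
from scipy.special import hyp1f1  # Kummer confluent hypergeometric function 1F1

def gnig_pdf(z, r, lam, r_extra, lam_extra):
    """Density of Z = Y1 + Y2, Y1 ~ GIG(r; lam; p), Y2 ~ Gamma(r_extra, lam_extra)."""
    p, c = len(r), gig_coefficients(r, lam)
    K = np.prod([lam[j] ** r[j] for j in range(p)])
    total = 0.0
    for j in range(p):
        inner = sum(c[j][k - 1] * math.gamma(k) / math.gamma(k + r_extra)
                    * z ** (k + r_extra - 1)
                    * hyp1f1(r_extra, k + r_extra, -(lam_extra - lam[j]) * z)
                    for k in range(1, r[j] + 1))
        total += math.exp(-lam[j] * z) * inner
    return K * lam_extra ** r_extra * total

# Illustrative check against a crude Monte Carlo density estimate near z = 1.5:
# r_extra is a non-integer shape and lam_extra differs from every lam[j].
rng = np.random.default_rng(1)
r, lam, r_extra, lam_extra = [2, 1], [1.0, 3.0], 0.7, 2.0
draws = (sum(rng.gamma(rj, 1 / lj, 200_000) for rj, lj in zip(r, lam))
         + rng.gamma(r_extra, 1 / lam_extra, 200_000))
print(gnig_pdf(1.5, r, lam, r_extra, lam_extra),
      np.mean(np.abs(draws - 1.5) < 0.05) / 0.1)
```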

Applications

The GIG and GNIG distributions are the basis for the exact and near-exact distributions of a large number of likelihood ratio test statistics and related statistics used in multivariate analysis. [5] [6] [7] [8] [9] More precisely, this application usually concerns the exact and near-exact distributions of the negative logarithm of such statistics. If necessary, a simple transformation then yields the corresponding exact or near-exact distributions of the likelihood ratio test statistics themselves. [4] [10] [11]

The GIG distribution is also the basis for a number of wrapped distributions in the wrapped gamma family. [12]

As a special case of the generalized chi-squared distribution, the GIG distribution has many other applications; for example, in renewal theory [1] and in multi-antenna wireless communications. [13] [14] [15] [16]

References

  1. Amari, S. V. and Misra, R. B. (1997). "Closed-Form Expressions for Distribution of Sum of Exponential Random Variables". IEEE Transactions on Reliability, 46 (4), 519-522.
  2. Coelho, C. A. (1998). "The Generalized Integer Gamma distribution – a basis for distributions in Multivariate Statistics". Journal of Multivariate Analysis, 64, 86-102.
  3. Coelho, C. A. (1999). "Addendum to the paper 'The Generalized Integer Gamma distribution – a basis for distributions in Multivariate Analysis'". Journal of Multivariate Analysis, 69, 281-285.
  4. Coelho, C. A. (2004). "The Generalized Near-Integer Gamma distribution – a basis for 'near-exact' approximations to the distributions of statistics which are the product of an odd number of particular independent Beta random variables". Journal of Multivariate Analysis, 89 (2), 191-218. MR 2063631 Zbl 1047.62014 [WOS: 000221483200001]
  5. Bilodeau, M., Brenner, D. (1999). "Theory of Multivariate Statistics". Springer, New York [Ch. 11, sec. 11.4]
  6. Das, S., Dey, D. K. (2010). "On Bayesian inference for generalized multivariate gamma distribution". Statistics and Probability Letters, 80, 1492-1499.
  7. Karagiannidis, K., Sagias, N. C., Tsiftsis, T. A. (2006). "Closed-form statistics for the sum of squared Nakagami-m variates and its applications". IEEE Transactions on Communications, 54, 1353-1359.
  8. Paolella, M. S. (2007). "Intermediate Probability – A Computational Approach". J. Wiley & Sons, New York [Ch. 2, sec. 2.2]
  9. Timm, N. H. (2002). "Applied Multivariate Analysis". Springer, New York [Ch. 3, sec. 3.5]
  10. Coelho, C. A. (2006). "The exact and near-exact distributions of the product of independent Beta random variables whose second parameter is rational". Journal of Combinatorics, Information & System Sciences, 31 (1-4), 21-44. MR 2351709
  11. Coelho, C. A., Alberto, R. P. and Grilo, L. M. (2006). "A mixture of Generalized Integer Gamma distributions as the exact distribution of the product of an odd number of independent Beta random variables. Applications". Journal of Interdisciplinary Mathematics, 9 (2), 229-248. MR 2245158 Zbl 1117.62017
  12. Coelho, C. A. (2007). "The wrapped Gamma distribution and wrapped sums and linear combinations of independent Gamma and Laplace distributions". Journal of Statistical Theory and Practice, 1 (1), 1-29.
  13. Björnson, E., Hammarwall, D., Ottersten, B. (2009). "Exploiting Quantized Channel Norm Feedback through Conditional Statistics in Arbitrarily Correlated MIMO Systems". IEEE Transactions on Signal Processing, 57, 4027-4041.
  14. Kaiser, T., Zheng, F. (2010). "Ultra Wideband Systems with MIMO". J. Wiley & Sons, Chichester, U.K. [Ch. 6, sec. 6.6]
  15. Suraweera, H. A., Smith, P. J., Surobhi, N. A. (2008). "Exact outage probability of cooperative diversity with opportunistic spectrum access". IEEE International Conference on Communications, 2008, ICC Workshops '08, 79-86 (ISBN 978-1-4244-2052-0, doi:10.1109/ICCW.2008.20).
  16. Surobhi, N. A. (2010). "Outage performance of cooperative cognitive relay networks". MSc Thesis, School of Engineering and Science, Victoria University, Melbourne, Australia [Ch. 3, sec. 3.4].