Q-Gaussian distribution

q-Gaussian
Probability density function
[Plot of the q-Gaussian probability density function for several values of q]
Parameters: q < 3 (shape, real); β > 0 (real)
Support: x ∈ (−∞, +∞) for 1 ≤ q < 3; x ∈ [−1/√(β(1−q)), +1/√(β(1−q))] for q < 1
PDF: (√β / C_q) e_q(−β x²)
CDF: see text
Mean: 0 for q < 2, otherwise undefined
Median: 0
Mode: 0
Variance: 1/(β(5−3q)) for q < 5/3; ∞ for 5/3 ≤ q < 2; undefined for 2 ≤ q < 3
Skewness: 0 for q < 3/2
Excess kurtosis: 6(q−1)/(7−5q) for q < 7/5; ∞ for 7/5 ≤ q < 5/3; otherwise undefined

The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy.[1] The normal distribution is recovered in the limit q → 1.

The q-Gaussian has been applied to problems in the fields of statistical mechanics, geology, anatomy, astronomy, economics, finance, and machine learning.[citation needed] The distribution is often favored for its heavy tails in comparison to the Gaussian for 1 < q < 3. For q < 1 the q-Gaussian is the PDF of a bounded random variable, which makes it more suitable than the Gaussian for modeling the effect of external stochasticity in biology and other domains.[2] A generalized q-analog of the classical central limit theorem [3] was proposed in 2008, in which the independence constraint for the i.i.d. variables is relaxed to an extent defined by the q parameter, with independence recovered as q → 1. However, a proof of such a theorem is still lacking.[4]

In the heavy tail regions, the distribution is equivalent to the Student's t-distribution, with a direct mapping between q and the degrees of freedom. A practitioner using one of these distributions can therefore parameterize the same distribution in two different ways. The choice of the q-Gaussian form may arise if the system is non-extensive, or if there is no connection to small sample sizes.

Characterization

Probability density function

The standard q-Gaussian has the probability density function [3]

f(x) = \frac{\sqrt{\beta}}{C_q}\, e_q(-\beta x^2),

where

e_q(x) = [1 + (1-q)x]_+^{\frac{1}{1-q}}

is the q-exponential and the normalization factor C_q is given by

C_q = \frac{2\sqrt{\pi}\,\Gamma\!\left(\frac{1}{1-q}\right)}{(3-q)\,\sqrt{1-q}\,\Gamma\!\left(\frac{3-q}{2(1-q)}\right)} \quad \text{for } -\infty < q < 1,

C_q = \sqrt{\pi} \quad \text{for } q = 1,

C_q = \frac{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)} \quad \text{for } 1 < q < 3.
Note that for q < 1 the q-Gaussian distribution is the PDF of a bounded random variable.
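
A minimal NumPy/SciPy sketch of these definitions is given below; the helper names q_exponential, normalization_Cq and q_gaussian_pdf are illustrative, not taken from any standard library.

import numpy as np
from scipy.special import gamma

def q_exponential(x, q):
    # e_q(x) = [1 + (1 - q) x]_+ ** (1/(1 - q)), reducing to exp(x) as q -> 1.
    if q == 1:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

def normalization_Cq(q):
    # Normalization factor C_q of the standard q-Gaussian, valid for q < 3.
    if q < 1:
        return (2.0 * np.sqrt(np.pi) * gamma(1.0 / (1.0 - q))
                / ((3.0 - q) * np.sqrt(1.0 - q) * gamma((3.0 - q) / (2.0 * (1.0 - q)))))
    if q == 1:
        return np.sqrt(np.pi)
    return (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
            / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))

def q_gaussian_pdf(x, q, beta):
    # f(x) = sqrt(beta)/C_q * e_q(-beta x^2); zero outside the support when q < 1.
    return np.sqrt(beta) / normalization_Cq(q) * q_exponential(-beta * np.asarray(x) ** 2, q)

# At q = 1 this is the normal density with variance 1/(2 beta):
print(q_gaussian_pdf(0.0, q=1.0, beta=0.5))   # 1/sqrt(2*pi) ~ 0.3989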

Cumulative distribution function

For 1 < q < 3, the cumulative distribution function is [5]

F(x) = \frac{1}{2} + \frac{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}\, \sqrt{\beta}\, x \; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{q-1}; \tfrac{3}{2}; -(q-1)\beta x^2\right),

where {}_2F_1 is the hypergeometric function. Since the hypergeometric series is defined for |z| < 1 while the argument z = −(q−1)βx² is unbounded in x, the Pfaff transformation can be used to evaluate the function for all x.

For q < 1,

F(x) = \frac{1}{2} + \frac{(3-q)\,\sqrt{1-q}\,\Gamma\!\left(\frac{3-q}{2(1-q)}\right)}{2\sqrt{\pi}\,\Gamma\!\left(\frac{1}{1-q}\right)}\, \sqrt{\beta}\, x \; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{q-1}; \tfrac{3}{2}; (1-q)\beta x^2\right)

for x inside the support, with F = 0 below the support and F = 1 above it.
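
As an illustration (a sketch only; the function name q_gaussian_cdf is made up for this example), the expression for 1 < q < 3 can be evaluated with SciPy's hyp2f1, applying the Pfaff transformation ₂F₁(a, b; c; z) = (1 − z)^(−b) ₂F₁(c − a, b; c; z/(z − 1)) so that the argument stays in [0, 1):

import numpy as np
from scipy.special import gamma, hyp2f1

def q_gaussian_cdf(x, q, beta):
    # CDF of the standard q-Gaussian for 1 < q < 3.
    a, b, c = 0.5, 1.0 / (q - 1.0), 1.5
    z = -(q - 1.0) * beta * x ** 2
    # Pfaff transformation: the transformed argument z/(z - 1) lies in [0, 1).
    hyp = (1.0 - z) ** (-b) * hyp2f1(c - a, b, c, z / (z - 1.0))
    prefactor = (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))
                 / (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))))
    return 0.5 + prefactor * np.sqrt(beta) * x * hyp

# F(0) = 1/2, and F(x) approaches 1 for large x:
print(q_gaussian_cdf(0.0, q=1.5, beta=1.0), q_gaussian_cdf(50.0, q=1.5, beta=1.0))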
Entropy

Just as the normal distribution is the maximum information entropy distribution for fixed values of the first moment and second moment (with the fixed zeroth moment corresponding to the normalization condition), the q-Gaussian distribution is the maximum Tsallis entropy distribution for fixed values of these three moments.
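
In symbols (a sketch; exact conventions for the moment constraints vary across the literature, and some formulations use escort expectations), the functional being maximized is the Tsallis entropy

S_q[p] = \frac{1}{q-1}\left(1 - \int p(x)^q \, dx\right), \qquad \lim_{q \to 1} S_q[p] = -\int p(x)\,\ln p(x)\, dx,

and maximizing S_q subject to these normalization and moment constraints yields a density proportional to e_q(−β x²).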

Student's t-distribution

While it can be justified by an interesting alternative form of entropy, statistically it is a scaled reparametrization of the Student's t-distribution introduced by W. Gosset in 1908 to describe small-sample statistics. In Gosset's original presentation the degrees of freedom parameter ν was constrained to be a positive integer related to the sample size, but it is readily observed that Gosset's density function is valid for all real values of ν.[ citation needed ] The scaled reparametrization introduces the alternative parameters q and β which are related to ν.

Given a Student's t-distribution with ν degrees of freedom, the equivalent q-Gaussian has

q = \frac{\nu + 3}{\nu + 1} \quad \text{with} \quad \beta = \frac{1}{3-q},

with inverse

\nu = \frac{3-q}{q-1}, \qquad \beta = \frac{1}{3-q}.
Whenever β ≠ 1/(3−q), the function is simply a scaled version of Student's t-distribution.
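
A quick numerical sanity check of this correspondence (a sketch; the helper q_gaussian_pdf below simply re-implements the density from the definitions above):

import numpy as np
from scipy.special import gamma
from scipy.stats import t

def q_gaussian_pdf(x, q, beta):
    # Standard q-Gaussian density for 1 < q < 3.
    Cq = (np.sqrt(np.pi) * gamma((3 - q) / (2 * (q - 1)))
          / (np.sqrt(q - 1) * gamma(1 / (q - 1))))
    return np.sqrt(beta) / Cq * (1 + (q - 1) * beta * x ** 2) ** (-1 / (q - 1))

nu = 5.0
q = (nu + 3.0) / (nu + 1.0)     # q corresponding to nu degrees of freedom
beta = 1.0 / (3.0 - q)          # the beta for which no extra rescaling is needed
x = np.linspace(-4.0, 4.0, 9)
print(np.allclose(q_gaussian_pdf(x, q, beta), t.pdf(x, df=nu)))   # True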

It is sometimes argued that the distribution is a generalization of Student's t-distribution to negative and/or non-integer degrees of freedom. However, the theory of Student's t-distribution extends trivially to all real degrees of freedom, where the support of the distribution becomes compact rather than infinite in the case of ν < 0.[citation needed]

Three-parameter version

As with many distributions centered on zero, the q-Gaussian can be trivially extended to include a location parameter μ. The density is then defined by

f(x) = \frac{\sqrt{\beta}}{C_q}\, e_q\!\left(-\beta (x-\mu)^2\right).
Generating random deviates

The Box–Muller transform has been generalized to allow random sampling from q-Gaussians. [6] The standard Box–Muller technique generates pairs of independent normally distributed variables from equations of the following form:

Z_1 = \sqrt{-2 \ln U_1}\, \cos(2\pi U_2), \qquad Z_2 = \sqrt{-2 \ln U_1}\, \sin(2\pi U_2),

where U_1 and U_2 are independent uniform random variables on (0, 1).
The generalized Box–Muller technique can generate pairs of q-Gaussian deviates that are not independent. In practice, only a single deviate will be generated from a pair of uniformly distributed variables. The following formula will generate deviates from a standard q-Gaussian with specified parameter q and β = 1/(3−q):

Z = \sqrt{-2 \ln_{q'}(U_1)}\, \cos(2\pi U_2),

where \ln_q(x) = \frac{x^{1-q} - 1}{1-q} is the q-logarithm and

q' = \frac{1+q}{3-q}.
These deviates can be transformed to generate deviates from an arbitrary q-Gaussian with location μ and parameter β by

Z' = \mu + \frac{Z}{\sqrt{\beta\,(3-q)}}.
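
A short sketch of this procedure in Python follows (the function names q_log and q_gaussian_deviates are illustrative); the sample variance can be compared against the formula 1/(β(5 − 3q)) from the table above.

import numpy as np

def q_log(x, q):
    # ln_q(x) = (x**(1 - q) - 1)/(1 - q), reducing to ln(x) as q -> 1.
    if q == 1:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_gaussian_deviates(n, q, beta=1.0, mu=0.0, rng=None):
    # Generalized Box-Muller sampling of n q-Gaussian deviates (q < 3).
    rng = np.random.default_rng() if rng is None else rng
    u1, u2 = rng.random(n), rng.random(n)
    q_prime = (1.0 + q) / (3.0 - q)
    # Standard deviate with the requested q and beta = 1/(3 - q):
    z = np.sqrt(-2.0 * q_log(u1, q_prime)) * np.cos(2.0 * np.pi * u2)
    # Rescale and shift to an arbitrary beta and mu:
    return mu + z / np.sqrt(beta * (3.0 - q))

samples = q_gaussian_deviates(200_000, q=1.5, beta=2.0)
print(samples.var())   # should be close to 1/(beta*(5 - 3*q)) = 1.0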
Applications

Physics

It has been shown that the momentum distribution of cold atoms in dissipative optical lattices is a q-Gaussian. [7]

The q-Gaussian distribution is also obtained as the asymptotic probability density function of the position in the one-dimensional motion of a mass subject to two forces: a deterministic force determining an infinite potential well and a stochastic white noise force. Note that in the overdamped/small mass approximation the above-mentioned convergence fails for certain values of q, as recently shown. [8]
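
As a purely illustrative sketch (the specific drift below is an assumption chosen for concreteness, not necessarily the exact force used in [8]), an overdamped Langevin equation of this type for 0 < q < 1 is

dx_t = -\frac{2 x_t}{1 - x_t^2}\, dt + \sqrt{2(1-q)}\, dW_t ,

whose stationary Fokker–Planck solution is

p_{\mathrm{st}}(x) \propto \exp\!\left(\frac{\ln(1 - x^2)}{1-q}\right) = (1 - x^2)^{\frac{1}{1-q}} = e_q\!\left(-\frac{x^2}{1-q}\right),

i.e. a q-Gaussian with β = 1/(1 − q), supported on |x| < 1.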

Finance

Financial return distributions in the New York Stock Exchange, NASDAQ and elsewhere have been interpreted as q-Gaussians. [9] [10]

Notes

  1. Tsallis, C. (2009). "Nonadditive entropy and nonextensive statistical mechanics – an overview after 20 years". Braz. J. Phys. 39: 337–356.
  2. d'Onofrio, A. (ed.) (2013). Bounded Noises in Physics, Biology, and Engineering. Birkhäuser.
  3. Umarov, Sabir; Tsallis, Constantino; Steinberg, Stanly (2008). "On a q-Central Limit Theorem Consistent with Nonextensive Statistical Mechanics" (PDF). Milan J. Math. 76. Birkhäuser Verlag: 307–328. doi:10.1007/s00032-008-0087-y. S2CID 55967725. Retrieved 2011-07-27.
  4. Hilhorst, H.J. (2010). "Note on a q-modified central limit theorem". Journal of Statistical Mechanics: Theory and Experiment. 2010 (10): 10023. arXiv:1008.4259. Bibcode:2010JSMTE..10..023H. doi:10.1088/1742-5468/2010/10/P10023. S2CID 119316670.
  5. "TsallisQGaussianDistribution". Wolfram Language documentation. https://reference.wolframcloud.com/language/ref/TsallisQGaussianDistribution.html
  6. Thistleton, W.; Marsh, J.A.; Nelson, K.; Tsallis, C. (2007). "Generalized Box–Muller method for generating q-Gaussian random deviates". IEEE Transactions on Information Theory. 53: 4805.
  7. Douglas, P.; Bergamini, S.; Renzoni, F. (2006). "Tunable Tsallis Distributions in Dissipative Optical Lattices" (PDF). Physical Review Letters. 96 (11): 110601. Bibcode:2006PhRvL..96k0601D. doi:10.1103/PhysRevLett.96.110601. PMID 16605807.
  8. Domingo, Dario; d'Onofrio, Alberto; Flandoli, Franco (2017). "Boundedness vs unboundedness of a noise linked to Tsallis q-statistics: The role of the overdamped approximation". Journal of Mathematical Physics. 58 (3): 033301. AIP Publishing. arXiv:1709.08260. Bibcode:2017JMP....58c3301D. doi:10.1063/1.4977081. ISSN 0022-2488. S2CID 84178785.
  9. Borland, Lisa (2002). "Option Pricing Formulas Based on a Non-Gaussian Stock Price Model". Physical Review Letters. 89 (9): 098701. American Physical Society. arXiv:cond-mat/0204331. Bibcode:2002PhRvL..89i8701B. doi:10.1103/physrevlett.89.098701. ISSN 0031-9007. PMID 12190447. S2CID 5740827.
  10. Borland, L. (2004). "The pricing of stock options". In M. Gell-Mann and C. Tsallis (eds.), Nonextensive Entropy – Interdisciplinary Applications. Oxford University Press, New York.
