Multivariate Laplace distribution

In the mathematical theory of probability, multivariate Laplace distributions are extensions of the Laplace distribution and the asymmetric Laplace distribution to multiple variables. The marginal distributions of symmetric multivariate Laplace distribution variables are Laplace distributions. The marginal distributions of asymmetric multivariate Laplace distribution variables are asymmetric Laplace distributions. [1]

Symmetric multivariate Laplace distribution

Multivariate Laplace (symmetric)
Parameters: \mu \in \mathbb{R}^k (location); \Sigma \in \mathbb{R}^{k \times k} (covariance, positive-definite matrix)
Support: x \in \mu + \operatorname{span}(\Sigma) \subseteq \mathbb{R}^k
PDF: If \mu = 0,
f(x) = \frac{2}{(2\pi)^{k/2} |\Sigma|^{1/2}} \left( \frac{x' \Sigma^{-1} x}{2} \right)^{v/2} K_v\!\left( \sqrt{2 x' \Sigma^{-1} x} \right),
where v = (2-k)/2 and K_v is the modified Bessel function of the second kind.
Mean: \mu
Mode: \mu
Variance: \Sigma
Skewness: 0
CF: \exp(i \mu' t) \big/ \left( 1 + \tfrac{1}{2} t' \Sigma t \right)

A typical characterization of the symmetric multivariate Laplace distribution has the characteristic function:

\varphi(t; \mu, \Sigma) = \frac{\exp(i \mu' t)}{1 + \tfrac{1}{2} t' \Sigma t},

where \mu is the vector of means for each variable and \Sigma is the covariance matrix. [2]

Unlike the multivariate normal distribution, the variables are not independent even when the covariance matrix is diagonal, i.e., has zero covariance and correlation. [1] The symmetric multivariate Laplace distribution is elliptical. [1]

Probability density function

If \mu = 0, the probability density function (pdf) for a k-dimensional multivariate Laplace distribution becomes:

f_x(x_1, \ldots, x_k) = \frac{2}{(2\pi)^{k/2} |\Sigma|^{1/2}} \left( \frac{x' \Sigma^{-1} x}{2} \right)^{v/2} K_v\!\left( \sqrt{2 x' \Sigma^{-1} x} \right),

where:

v = \frac{2-k}{2}

and K_v is the modified Bessel function of the second kind. [1]
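This density is straightforward to transcribe numerically; the sketch below assumes SciPy's `kv` for the modified Bessel function of the second kind (the function name `sym_mvlaplace_pdf` is illustrative, not from the source):

```python
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind


def sym_mvlaplace_pdf(x, Sigma):
    """Density of the symmetric multivariate Laplace distribution (mu = 0)."""
    x = np.asarray(x, dtype=float)
    k = x.size
    v = (2.0 - k) / 2.0
    Sinv = np.linalg.inv(Sigma)
    q = x @ Sinv @ x  # quadratic form x' Sigma^{-1} x
    norm = 2.0 / ((2.0 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(Sigma)))
    return norm * (q / 2.0) ** (v / 2.0) * kv(v, np.sqrt(2.0 * q))
```

For k = 2 the Bessel order v is 0, which is why the bivariate special cases involve only K_0.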

In the correlated bivariate case, i.e., k = 2, with \mu_1 = \mu_2 = 0 the pdf reduces to:

f(x_1, x_2) = \frac{1}{\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} K_0\!\left( \sqrt{\frac{2 \left( \frac{x_1^2}{\sigma_1^2} - \frac{2 \rho x_1 x_2}{\sigma_1 \sigma_2} + \frac{x_2^2}{\sigma_2^2} \right)}{1 - \rho^2}} \right),

where \sigma_1 and \sigma_2 are the standard deviations of x_1 and x_2, respectively, and \rho is the correlation coefficient of x_1 and x_2. [1]

For the uncorrelated bivariate Laplace case, that is k = 2, \mu_1 = \mu_2 = \rho = 0 and \sigma_1 = \sigma_2 = 1, the pdf becomes:

f(x_1, x_2) = \frac{1}{\pi} K_0\!\left( \sqrt{2 \left( x_1^2 + x_2^2 \right)} \right). [1]
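As a quick numerical sanity check, the correlated bivariate density with \rho = 0 and \sigma_1 = \sigma_2 = 1 should reduce to the uncorrelated form; a sketch assuming SciPy's `k0` (the helper name is illustrative):

```python
import numpy as np
from scipy.special import k0  # modified Bessel function K_0


def bivariate_laplace_pdf(x1, x2, s1, s2, rho):
    """Correlated bivariate symmetric Laplace density (mu_1 = mu_2 = 0)."""
    q = (x1**2 / s1**2 - 2 * rho * x1 * x2 / (s1 * s2) + x2**2 / s2**2) / (1 - rho**2)
    return k0(np.sqrt(2 * q)) / (np.pi * s1 * s2 * np.sqrt(1 - rho**2))
```

With rho = 0 and unit standard deviations this evaluates to K_0(\sqrt{2(x_1^2 + x_2^2)})/\pi, exactly the uncorrelated formula.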

Asymmetric multivariate Laplace distribution

Multivariate Laplace (asymmetric)
Parameters: \mu \in \mathbb{R}^k (location); \Sigma \in \mathbb{R}^{k \times k} (covariance, positive-definite matrix)
Support: x \in \mu + \operatorname{span}(\Sigma) \subseteq \mathbb{R}^k
PDF: f(x) = \frac{2 e^{x' \Sigma^{-1} \mu}}{(2\pi)^{k/2} |\Sigma|^{1/2}} \left( \frac{x' \Sigma^{-1} x}{2 + \mu' \Sigma^{-1} \mu} \right)^{v/2} K_v\!\left( \sqrt{(2 + \mu' \Sigma^{-1} \mu)(x' \Sigma^{-1} x)} \right),
where v = (2-k)/2 and K_v is the modified Bessel function of the second kind.
Mean: \mu
Variance: \Sigma + \mu \mu'
Skewness: non-zero unless \mu = 0
CF: 1 \big/ \left( 1 + \tfrac{1}{2} t' \Sigma t - i \mu' t \right)

A typical characterization of the asymmetric multivariate Laplace distribution has the characteristic function:

\varphi(t; \mu, \Sigma) = \frac{1}{1 + \tfrac{1}{2} t' \Sigma t - i \mu' t}. [1]

As with the symmetric multivariate Laplace distribution, the asymmetric multivariate Laplace distribution has mean \mu, but the covariance becomes \Sigma + \mu \mu'. [3] The asymmetric multivariate Laplace distribution is not elliptical unless \mu = 0, in which case the distribution reduces to the symmetric multivariate Laplace distribution with \mu = 0. [1]

The probability density function (pdf) for a k-dimensional asymmetric multivariate Laplace distribution is:

f_x(x_1, \ldots, x_k) = \frac{2 e^{x' \Sigma^{-1} \mu}}{(2\pi)^{k/2} |\Sigma|^{1/2}} \left( \frac{x' \Sigma^{-1} x}{2 + \mu' \Sigma^{-1} \mu} \right)^{v/2} K_v\!\left( \sqrt{(2 + \mu' \Sigma^{-1} \mu)(x' \Sigma^{-1} x)} \right),

where:

v = \frac{2-k}{2}

and K_v is the modified Bessel function of the second kind. [1]
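The asymmetric density can be transcribed the same way as the symmetric one; the sketch below assumes SciPy's `kv` and an illustrative function name, and by construction it reduces to the symmetric density when \mu = 0:

```python
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind


def asym_mvlaplace_pdf(x, mu, Sigma):
    """Density of the asymmetric multivariate Laplace distribution."""
    x = np.asarray(x, dtype=float)
    mu = np.asarray(mu, dtype=float)
    k = x.size
    v = (2.0 - k) / 2.0
    Sinv = np.linalg.inv(Sigma)
    q = x @ Sinv @ x          # x' Sigma^{-1} x
    m = 2.0 + mu @ Sinv @ mu  # 2 + mu' Sigma^{-1} mu
    norm = (2.0 * np.exp(x @ Sinv @ mu)
            / ((2.0 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(Sigma))))
    return norm * (q / m) ** (v / 2.0) * kv(v, np.sqrt(m * q))
```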

The asymmetric Laplace distribution, including the special case of \mu = 0, is an example of a geometric stable distribution. [3] It represents the limiting distribution for a sum of independent, identically distributed random variables with finite variance and covariance where the number of elements to be summed is itself an independent random variable distributed according to a geometric distribution. [1] Such geometric sums can arise in practical applications within biology, economics and insurance. [1] The distribution may also be applicable in broader situations to model multivariate data with heavier tails than a normal distribution but finite moments. [1]

The relationship between the exponential distribution and the Laplace distribution allows for a simple method for simulating bivariate asymmetric Laplace variables (including for the case of \mu = 0). Simulate a bivariate normal random vector Y from a distribution with mean 0 and covariance matrix \Sigma. Independently simulate an exponential random variable W from an Exp(1) distribution. Then X = \mu W + \sqrt{W} Y will be distributed (asymmetric) bivariate Laplace with mean \mu and covariance matrix \Sigma. [1]
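This recipe can be sketched with NumPy; the sampler below works for any dimension, and its empirical mean and covariance can be checked against \mu and \Sigma + \mu \mu' (the function name, example parameters and seed are illustrative):

```python
import numpy as np


def sample_asym_laplace(mu, Sigma, n, rng):
    """Draw n asymmetric multivariate Laplace variates via the mixture
    X = mu * W + sqrt(W) * Y with W ~ Exp(1) and Y ~ N(0, Sigma)."""
    mu = np.asarray(mu, dtype=float)
    W = rng.exponential(1.0, size=n)  # exponential mixing weights
    Y = rng.multivariate_normal(np.zeros(len(mu)), Sigma, size=n)
    return mu * W[:, None] + np.sqrt(W)[:, None] * Y


rng = np.random.default_rng(0)
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
X = sample_asym_laplace(mu, Sigma, 200_000, rng)
# Empirical mean approaches mu; empirical covariance approaches Sigma + mu mu'.
```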


References

  1. Kotz, Samuel; Kozubowski, Tomasz J.; Podgorski, Krzysztof (2001). The Laplace Distribution and Generalizations. Birkhäuser. pp. 229–245. ISBN 0817641661.
  2. Fragiadakis, Konstantinos; Meintanis, Simos G. (March 2011). "Goodness-of-fit tests for multivariate Laplace distributions". Mathematical and Computer Modelling. 53 (5–6): 769–779. doi:10.1016/j.mcm.2010.10.014.
  3. Kozubowski, Tomasz J.; Podgorski, Krzysztof; Rychlik, Igor (2010). "Multivariate Generalized Laplace Distributions and Related Random Fields" (PDF). University of Gothenburg. Retrieved 2017-05-28.