Free independence

In the mathematical theory of free probability, the notion of free independence was introduced by Dan Voiculescu.[1] The definition of free independence is parallel to the classical definition of independence, except that the role of Cartesian products of measure spaces (corresponding to tensor products of their function algebras) is played by the notion of a free product of (non-commutative) probability spaces.

In the context of Voiculescu's free probability theory, many classical probability theorems or phenomena have free probability analogs: the same theorem or phenomenon holds (perhaps with slight modifications) if the classical notion of independence is replaced by free independence. Examples include the free central limit theorem, free convolution, and free stochastic calculus.
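For instance (a standard illustration, not part of the cited text), the free central limit theorem states that for freely independent, identically distributed, centered self-adjoint variables of unit variance, the normalized sums converge in moments to Wigner's semicircle law rather than to the Gaussian:

$$\frac{x_1 + x_2 + \cdots + x_n}{\sqrt{n}} \;\longrightarrow\; s, \qquad d\mu_s(t) = \frac{1}{2\pi}\sqrt{4 - t^2}\,dt \quad \text{on } [-2,2].$$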

Let $(A,\varphi)$ be a non-commutative probability space, i.e. a unital algebra $A$ over $\mathbb{C}$ equipped with a unital linear functional $\varphi : A \to \mathbb{C}$. As an example, one could take, for a probability measure $\mu$,

$$A = L^\infty(\mathbb{R},\mu), \qquad \varphi(f) = \int f(t)\, d\mu(t).$$

Another example may be $A = M_N(\mathbb{C})$, the algebra of $N \times N$ matrices with the functional given by the normalized trace $\varphi = \tfrac{1}{N}\operatorname{Tr}$. Even more generally, $A$ could be a von Neumann algebra and $\varphi$ a state on $A$. A final example is the group algebra $A = \mathbb{C}\Gamma$ of a (discrete) group $\Gamma$ with the functional given by the group trace $\varphi(g) = \delta_{g,e}$ for $g \in \Gamma$.
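As an illustrative sketch of the matrix example (the matrix size and the random matrices below are arbitrary choices, not taken from the text), the normalized trace can be computed directly and checked to be a unital linear functional:

    import numpy as np

    N = 4
    rng = np.random.default_rng(0)

    def phi(a):
        # Normalized trace (1/N) Tr, the state on the algebra of N x N matrices.
        return np.trace(a) / a.shape[0]

    a = rng.normal(size=(N, N))
    b = rng.normal(size=(N, N))

    print(phi(np.eye(N)))                                    # 1.0, so phi is unital
    print(np.isclose(phi(2 * a + b), 2 * phi(a) + phi(b)))   # True, so phi is linear
    print(phi(a @ a))                                        # a second moment of the "random variable" a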

Let $\{A_i : i \in I\}$ be a family of unital subalgebras of $A$.

Definition. The family $\{A_i : i \in I\}$ is called freely independent if $\varphi(x_1 x_2 \cdots x_n) = 0$ whenever $\varphi(x_j) = 0$ for each $j$, $x_j \in A_{i(j)}$, and no two neighboring elements come from the same subalgebra: $i(1) \neq i(2),\, i(2) \neq i(3),\, \ldots,\, i(n-1) \neq i(n)$.

If $X_i$, $i \in I$, is a family of elements of $A$ (these can be thought of as random variables in $A$), they are called freely independent if the algebras $A_i$ generated by $1$ and $X_i$ are freely independent.
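For instance (a standard consequence of the definition, included here for illustration rather than taken from the text above), if $x$ and $y$ are freely independent, applying the definition to alternating products of the centered elements $x - \varphi(x)1$ and $y - \varphi(y)1$ yields

$$\varphi(xy) = \varphi(x)\,\varphi(y), \qquad \varphi(xyxy) = \varphi(x^2)\varphi(y)^2 + \varphi(x)^2\varphi(y^2) - \varphi(x)^2\varphi(y)^2,$$

so mixed moments are determined by the individual moments of $x$ and $y$, but by a rule different from classical (tensor) independence, for which $\varphi(xyxy) = \varphi(x^2y^2) = \varphi(x^2)\varphi(y^2)$.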

Examples of free independence

If $G = G_1 * G_2$ is a free product of groups, then in the group algebra $\mathbb{C}G$ with the group trace, the subalgebras $\mathbb{C}G_1$ and $\mathbb{C}G_2$ are freely independent: freeness of the groups translates exactly into the moment condition in the definition above. Another fundamental source of examples is Voiculescu's asymptotic freeness theorem: independent large random matrices (for instance, independent Gaussian unitary ensemble matrices, or matrices conjugated by independent Haar-distributed unitaries) become freely independent with respect to the normalized trace as the dimension tends to infinity.
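The random-matrix example can be probed numerically. The sketch below (a heuristic illustration; the dimension, the Gaussian unitary ensemble, and the particular mixed moments tested are choices made here, not taken from the text) checks that two independent GUE matrices approximately satisfy the mixed-moment relations predicted by free independence:

    import numpy as np

    rng = np.random.default_rng(0)

    def gue(n, rng):
        # GUE matrix scaled so that its spectrum stays bounded as n grows
        # and the normalized trace of its square is close to 1.
        g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        return (g + g.conj().T) / (2 * np.sqrt(n))

    def phi(a):
        # Normalized trace on the n x n matrices.
        return (np.trace(a) / a.shape[0]).real

    n = 1000
    x, y = gue(n, rng), gue(n, rng)

    # Free independence predicts phi(x^2 y^2) ~ phi(x^2) phi(y^2) ...
    print(phi(x @ x @ y @ y), phi(x @ x) * phi(y @ y))
    # ... while phi(xyxy) ~ phi(x^2)phi(y)^2 + phi(x)^2 phi(y^2) - phi(x)^2 phi(y)^2,
    # which is close to 0 here because phi(x) and phi(y) are close to 0.
    print(phi(x @ y @ x @ y))

For large n the first two printed values nearly agree (both close to 1), while the last is close to 0, matching the free formula for the fourth mixed moment.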

References

  1. D. Voiculescu, K. Dykema, A. Nica, "Free Random Variables", CRM Monograph Series, American Mathematical Society, Providence, RI, 1992
