Free convolution is the free probability analog of the classical notion of convolution of probability measures. Due to the non-commutative nature of free probability theory, one has to talk separately about additive and multiplicative free convolution, which arise from addition and multiplication of free random variables (see below; in the classical case, what would be the analog of free multiplicative convolution can be reduced to additive convolution by passing to logarithms of random variables). These operations have some interpretations in terms of empirical spectral measures of random matrices. [1]
Free probability is a mathematical theory that studies non-commutative random variables. The "freeness" or free independence property is the analogue of the classical notion of independence, and it is connected with free products. This theory was initiated by Dan Voiculescu around 1986 in order to attack the free group factors isomorphism problem, an important unsolved problem in the theory of operator algebras. Given a free group on some number of generators, we can consider the von Neumann algebra generated by the group algebra, which is a type II1 factor. The isomorphism problem asks whether these are isomorphic for different numbers of generators. It is not even known if any two free group factors are isomorphic. This is similar to Tarski's free group problem, which asks whether two different non-abelian finitely generated free groups have the same elementary theory.
In mathematics, convolution is an operation on two functions that produces a third function expressing how the shape of one is modified by the other. The term convolution refers both to the result function and to the process of computing it. Convolution is similar to cross-correlation. For real-valued functions of a continuous or discrete variable, it differs from cross-correlation only in that either f(x) or g(x) is reflected about the y-axis; thus it is a cross-correlation of f(x) and g(−x), or of f(−x) and g(x). For continuous functions, the cross-correlation operator is the adjoint of the convolution operator.
The notion of free convolution was introduced by Voiculescu. [2] [3]
Let μ and ν be two probability measures on the real line, and assume that X is a random variable in a non-commutative probability space with law μ and Y is a random variable in the same non-commutative probability space with law ν. Assume finally that X and Y are freely independent. Then the free additive convolution μ ⊞ ν is the law of X + Y. Random matrices interpretation: if A and B are independent n by n Hermitian (resp. real symmetric) random matrices such that at least one of them is invariant, in law, under conjugation by any unitary (resp. orthogonal) matrix, and such that the empirical spectral measures of A and B tend respectively to μ and ν as n tends to infinity, then the empirical spectral measure of A + B tends to μ ⊞ ν. [4]
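The random matrices interpretation can be checked numerically. The following sketch (not part of the original formulation; the matrix size and variances are chosen arbitrarily for illustration) sums two independent GUE-type matrices, whose empirical spectral measures approximate semicircle laws of variances 1 and 2:

```python
# Numerical sketch of free additive convolution via random matrices.
import numpy as np

rng = np.random.default_rng(0)
n = 400

def gue(n, sigma2):
    """Random Hermitian (GUE-type) matrix, invariant in law under unitary
    conjugation, whose empirical spectral measure approximates a semicircle
    law of variance sigma2 for large n."""
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    h = (g + g.conj().T) / 2
    return h * np.sqrt(sigma2 / n)

A = gue(n, 1.0)  # spectral measure ~ semicircle, variance 1
B = gue(n, 2.0)  # spectral measure ~ semicircle, variance 2

eig_sum = np.linalg.eigvalsh(A + B)
m2 = float(np.mean(eig_sum**2))  # second moment of the empirical spectral measure
print(m2)
```

Since the free additive convolution of semicircle laws is again a semicircle law with the variances adding, the empirical second moment of the eigenvalues of A + B should be close to 1 + 2 = 3 for large n.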
In the mathematical theory of free probability, the notion of free independence was introduced by Dan Voiculescu. The definition of free independence is parallel to the classical definition of independence, except that the role of Cartesian products of measure spaces is played by the notion of a free product of (non-commutative) probability spaces.
In many cases, it is possible to compute the probability measure μ ⊞ ν explicitly by using complex-analytic techniques and the R-transform of the measures μ and ν.
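Concretely, the R-transform is built from the Cauchy transform, and its key property is that it linearizes free additive convolution (standard facts, summarized here as a sketch):

```latex
% Cauchy transform and R-transform of a compactly supported measure \mu:
G_\mu(z) = \int_{\mathbb{R}} \frac{\mathrm{d}\mu(t)}{z - t},
\qquad
R_\mu(z) = G_\mu^{\langle -1 \rangle}(z) - \frac{1}{z},
% and the R-transform linearizes free additive convolution:
\qquad
R_{\mu \boxplus \nu}(z) = R_\mu(z) + R_\nu(z).
```

For example, a semicircle law of variance σ² has R(z) = σ²z, so the free additive convolution of two semicircle laws of variances σ₁² and σ₂² is again a semicircle law, of variance σ₁² + σ₂².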
The rectangular free additive convolution (with ratio c) has also been defined in the non-commutative probability framework by Benaych-Georges [5] and admits the following random matrices interpretation. For c ∈ [0, 1], if A and B are independent n by p complex (resp. real) random matrices such that at least one of them is invariant, in law, under multiplication on the left and on the right by any unitary (resp. orthogonal) matrix, and such that the empirical singular value distributions of A and B tend respectively to μ and ν as n and p tend to infinity in such a way that n/p tends to c, then the empirical singular value distribution of A + B tends to μ ⊞_c ν. [6]
In many cases, it is possible to compute the probability measure μ ⊞_c ν explicitly by using complex-analytic techniques and the rectangular R-transform with ratio c of the measures μ and ν.
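The rectangular setup can likewise be illustrated numerically. In this sketch (sizes and scalings are arbitrary choices, not from the original text), two independent rectangular Gaussian matrices are bi-orthogonally invariant in law, and the cross terms in (A + B)(A + B)ᵀ vanish on average, so the mean squared singular value of the sum is close to the sum of the individual mean squared singular values:

```python
# Numerical sketch of the rectangular random matrices setup.
import numpy as np

rng = np.random.default_rng(1)
n, p = 300, 600  # aspect ratio n/p -> c = 1/2

# Independent n-by-p Gaussian matrices, invariant in law under left and
# right multiplication by orthogonal matrices.
A = rng.normal(size=(n, p)) / np.sqrt(p)                 # mean squared singular value ~ 1
B = np.sqrt(2.0) * rng.normal(size=(n, p)) / np.sqrt(p)  # mean squared singular value ~ 2

s = np.linalg.svd(A + B, compute_uv=False)
m2 = float(np.mean(s**2))
print(m2)  # close to 1 + 2 = 3 for large n, p
```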
Let μ and ν be two probability measures on the interval [0, +∞), and assume that X is a random variable in a non-commutative probability space with law μ and Y is a random variable in the same non-commutative probability space with law ν. Assume finally that X and Y are freely independent. Then the free multiplicative convolution μ ⊠ ν is the law of X^{1/2}YX^{1/2} (or, equivalently, the law of Y^{1/2}XY^{1/2}). Random matrices interpretation: if A and B are independent n by n non-negative Hermitian (resp. real symmetric) random matrices such that at least one of them is invariant, in law, under conjugation by any unitary (resp. orthogonal) matrix, and such that the empirical spectral measures of A and B tend respectively to μ and ν as n tends to infinity, then the empirical spectral measure of AB tends to μ ⊠ ν. [7]
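One consequence of freeness that is easy to test numerically is that first moments multiply: the mean of μ ⊠ ν is the product of the means of μ and ν. The sketch below (matrix size and scales chosen arbitrarily for illustration) uses Wishart-type matrices, which are non-negative definite and orthogonally invariant in law:

```python
# Numerical sketch of free multiplicative convolution via random matrices.
import numpy as np

rng = np.random.default_rng(2)
n = 300

def wishart(n, scale):
    """Non-negative definite random matrix, invariant in law under
    orthogonal conjugation, with mean eigenvalue ~ scale."""
    g = rng.normal(size=(n, n))
    return scale * (g @ g.T) / n

A = wishart(n, 1.0)  # mean eigenvalue ~ 1
B = wishart(n, 2.0)  # mean eigenvalue ~ 2

# AB has the same spectrum as A^{1/2} B A^{1/2}; its mean eigenvalue is
# tr(AB)/n, which should be close to 1 * 2 = 2 by freeness.
m1 = float(np.trace(A @ B) / n)
print(m1)
```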
A similar definition can be made in the case of laws supported on the unit circle, with an orthogonal or unitary random matrices interpretation.
Explicit computations of multiplicative free convolution can be carried out using complex-analytic techniques and the S-transform.
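The S-transform plays for ⊠ the role that the R-transform plays for ⊞: it turns free multiplicative convolution into multiplication (standard facts, summarized here as a sketch):

```latex
% Moment-generating-type transform and S-transform of a measure \mu:
\psi_\mu(z) = \int \frac{tz}{1 - tz}\,\mathrm{d}\mu(t),
\qquad
S_\mu(z) = \frac{1+z}{z}\,\psi_\mu^{\langle -1 \rangle}(z),
% and the S-transform multiplies under free multiplicative convolution:
\qquad
S_{\mu \boxtimes \nu}(z) = S_\mu(z)\, S_\nu(z).
```

For example, a point mass δ_a at a > 0 has S_{δ_a}(z) = 1/a, so μ ⊠ δ_a is simply the pushforward of μ under the dilation t ↦ at.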
Through its applications to random matrices, free convolution has strong connections with other work on G-estimation by Girko.
Its applications in wireless communications, finance and biology have provided a useful framework when the number of observations is of the same order as the dimensions of the system.