Gaussian probability space

In probability theory, particularly in the Malliavin calculus, a Gaussian probability space is a probability space together with a Hilbert space of mean-zero, real-valued Gaussian random variables. Important examples include the classical and abstract Wiener spaces equipped with a suitable collection of Gaussian random variables. [1] [2]

Definition

A Gaussian probability space (Ω, F, P, H, F_⊥) consists of

  - a (complete) probability space (Ω, F, P),
  - a closed subspace H ⊂ L²(Ω, F, P), called the Gaussian space, such that every X ∈ H is a mean-zero, real-valued Gaussian random variable,
  - the σ-algebra F_H generated by H, called the Gaussian σ-algebra,
  - a σ-algebra F_⊥, called the transverse σ-algebra, that is independent of F_H and satisfies F = σ(F_H ∨ F_⊥). [3]
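A minimal finite-dimensional sketch (an illustration, not taken from the cited sources): take Ω = ℝ² with the standard Gaussian measure and let H be the span of the two coordinate projections, so every element of H is a mean-zero Gaussian variable with variance ‖a‖².

```python
import numpy as np

# Sketch of a finite-dimensional Gaussian probability space:
# Omega = R^2 with the standard Gaussian measure; the Gaussian space H
# consists of the linear forms a1*X1 + a2*X2 of the iid N(0,1) coordinates.
rng = np.random.default_rng(0)
omega = rng.standard_normal((200_000, 2))  # samples of (X1, X2)

def gaussian_space_element(a):
    """An element of H: the linear form <a, (X1, X2)>, evaluated samplewise."""
    return omega @ np.asarray(a, dtype=float)

Y = gaussian_space_element([0.6, -0.8])
# Y is mean-zero Gaussian with variance ||a||^2 = 0.36 + 0.64 = 1.
assert abs(Y.mean()) < 0.02
assert abs(Y.var() - 1.0) < 0.02
```

Here the full σ-algebra is generated by H, so this toy space is in fact irreducible and no nontrivial transverse σ-algebra is needed.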

Irreducibility

A Gaussian probability space is called irreducible if F = F_H, that is, if the Gaussian space generates the whole σ-algebra. Such spaces are denoted as (Ω, F, P, H). Non-irreducible spaces are used to work on subspaces or to extend a given probability space. [3] Irreducible Gaussian probability spaces are classified by the dimension of the Gaussian space H. [4]

Subspaces

A subspace of a Gaussian probability space (Ω, F, P, H, F_⊥) consists of

  - a closed subspace H' ⊂ H,
  - a transverse σ-algebra F'_⊥ for H' with F'_⊥ ⊃ F_⊥.

Example:

Let (Ω, F, P, H) be an irreducible Gaussian probability space and H' ⊂ H a closed subspace. Let V := (H')^⊥ be the orthogonal complement of H' in H. Since the elements of H are jointly Gaussian, orthogonality in L² implies independence between V and H', so σ(V) is independent of σ(H'). Define the transverse σ-algebra via F'_⊥ := σ(V).
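That orthogonality inside a Gaussian space implies independence can be checked numerically. The following sketch (an illustration with hypothetical choices, not part of the cited construction) uses U = X₁ + X₂ and V = X₁ − X₂, which are orthogonal in L² and jointly Gaussian, hence independent.

```python
import numpy as np

# U = X1 + X2 and V = X1 - X2 satisfy E[U V] = E[X1^2] - E[X2^2] = 0,
# and since (U, V) is jointly Gaussian, uncorrelated implies independent.
rng = np.random.default_rng(1)
x = rng.standard_normal((500_000, 2))
u, v = x[:, 0] + x[:, 1], x[:, 0] - x[:, 1]

cov_uv = np.mean(u * v)  # ~0: U and V are orthogonal in L^2
# For independent variables E[U^2 V^2] = E[U^2] E[V^2]; the gap is ~0.
factorization_gap = np.mean(u**2 * v**2) - np.mean(u**2) * np.mean(v**2)
assert abs(cov_uv) < 0.05
assert abs(factorization_gap) < 0.1
```

Note that this implication is special to Gaussian spaces: for non-Gaussian variables, zero covariance does not imply independence.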

Remark

For H' = H we have V = {0}, so F'_⊥ is the trivial σ-algebra {∅, Ω}.

Fundamental algebra

Given a Gaussian probability space (Ω, F, P, H), one defines the algebra of cylindrical random variables

    A := { F = f(X₁, …, X_n) : f ∈ P(ℝⁿ), X_i ∈ H, n ∈ ℕ }

where f ∈ P(ℝⁿ) is a polynomial in n variables, and calls A the fundamental algebra. For every p ∈ [1, ∞) it is true that A ⊂ L^p(Ω, F, P), since polynomials of Gaussian random variables have finite moments of all orders.

For an irreducible Gaussian probability space the fundamental algebra A is a dense set in L^p(Ω, F, P) for all p ∈ [1, ∞). [4]
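As a concrete sanity check (an illustrative sketch), the cylindrical variable F = X² − 1 with X in the Gaussian space is a polynomial of a Gaussian and therefore lies in every L^p; its moments can be verified by Monte Carlo against the exact values E[F] = 0 and E[F²] = E[X⁴] − 2E[X²] + 1 = 2.

```python
import numpy as np

# A cylindrical random variable F = f(X) with polynomial f(t) = t^2 - 1
# and X a standard Gaussian in H. Polynomials of Gaussians have finite
# moments of all orders, so F belongs to every L^p.
rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
f = x**2 - 1

assert abs(f.mean()) < 0.01             # E[F] = E[X^2] - 1 = 0
assert abs(np.mean(f**2) - 2.0) < 0.05  # E[F^2] = 3 - 2 + 1 = 2
```

This particular F is the second Hermite polynomial evaluated at X, the kind of variable used in chaos expansions built from the fundamental algebra.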

Numerical and Segal model

An irreducible Gaussian probability space for which an orthonormal basis of H has been chosen is called a numerical model. Two numerical models are isomorphic if their Gaussian spaces have the same dimension. [4]

Given a separable Hilbert space G, there always exists a canonical irreducible Gaussian probability space Seg(G), called the Segal model, with G as its Gaussian space. In this setting, one usually writes W(g) for the Gaussian random variable in the Segal model associated with an element g ∈ G. This is the notation of an isonormal Gaussian process, and typically the Gaussian space is defined through one: W is linear and E[W(g)W(h)] = ⟨g, h⟩_G for g, h ∈ G. One can then choose an arbitrary separable Hilbert space G and obtain the Gaussian space as {W(g) : g ∈ G}. [5]
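A discretized sketch of an isonormal Gaussian process on G = L²([0, 1]) (an illustrative white-noise construction with hypothetical grid choices, not the sources' exact formulation): approximate W(g) by Σᵢ g(tᵢ) ΔBᵢ with independent Gaussian increments and check that E[W(g)W(h)] recovers the inner product ⟨g, h⟩.

```python
import numpy as np

# Discretized isonormal Gaussian process on G = L^2([0, 1]):
# W(g) ~ sum_i g(t_i) dB_i with independent dB_i ~ N(0, dt),
# so that E[W(g) W(h)] ~ <g, h> in L^2.
rng = np.random.default_rng(3)
n_grid, n_samples = 200, 50_000
dt = 1.0 / n_grid
t = (np.arange(n_grid) + 0.5) * dt  # midpoints of the grid cells

g = np.sin(2 * np.pi * t)  # <g, g> = 1/2
h = np.cos(2 * np.pi * t)  # <g, h> = 0
dB = rng.standard_normal((n_samples, n_grid)) * np.sqrt(dt)

Wg, Wh = dB @ g, dB @ h  # samples of W(g) and W(h)
assert abs(np.mean(Wg * Wh)) < 0.02      # ~ <g, h> = 0
assert abs(np.mean(Wg**2) - 0.5) < 0.02  # ~ ||g||^2 = 1/2
```

Linearity of the map g ↦ W(g) holds exactly in this discretization, since W(g + h) = dB @ (g + h) = W(g) + W(h) samplewise.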


References

  1. Malliavin, Paul (1997). Stochastic analysis. Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-15074-6. ISBN 3-540-57024-1.
  2. Nualart, David (2013). The Malliavin calculus and related topics. New York: Springer. p. 3. doi:10.1007/978-1-4757-2437-0.
  3. Malliavin, Paul (1997). Stochastic analysis. Berlin, Heidelberg: Springer. pp. 4–5. doi:10.1007/978-3-642-15074-6. ISBN 3-540-57024-1.
  4. Malliavin, Paul (1997). Stochastic analysis. Berlin, Heidelberg: Springer. pp. 13–14. doi:10.1007/978-3-642-15074-6. ISBN 3-540-57024-1.
  5. Malliavin, Paul (1997). Stochastic analysis. Berlin, Heidelberg: Springer. p. 16. doi:10.1007/978-3-642-15074-6. ISBN 3-540-57024-1.