Brownian sheet

In mathematics, a Brownian sheet or multiparametric Brownian motion is a multiparametric generalization of Brownian motion to a Gaussian random field. This means we generalize the "time" parameter $t$ of a Brownian motion $B_t$ from $\mathbb{R}_+$ to $\mathbb{R}_+^n$.

The exact dimension $n$ of the space of the new time parameter varies between authors. We follow John B. Walsh and define the $(n,d)$-Brownian sheet, while some authors define the Brownian sheet only for $n=2$, which we call the $(2,d)$-Brownian sheet. [1]

This definition is due to Nikolai Chentsov; a slightly different version is due to Paul Lévy.

(n,d)-Brownian sheet

A $d$-dimensional Gaussian process $B=\bigl(B_t : t\in\mathbb{R}_+^n\bigr)$ is called an $(n,d)$-Brownian sheet if

  - it has zero mean, i.e. $\mathbb{E}[B_t]=0$ for all $t=(t_1,\dots,t_n)\in\mathbb{R}_+^n$,
  - its covariance function satisfies

$$\operatorname{cov}\bigl(B_s^{(i)},B_t^{(j)}\bigr)=\begin{cases}\prod_{l=1}^{n}\min(s_l,t_l) & \text{if } i=j,\\ 0 & \text{else,}\end{cases}$$

for $1\leq i,j\leq d$. [2]
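
The definition can be illustrated numerically. The following Python sketch is not taken from the cited sources (the function name brownian_sheet and all parameter choices are illustrative): it approximates a $(2,1)$-Brownian sheet on a regular grid by giving each grid cell an independent centred Gaussian increment whose variance equals the cell area, so that the value at a grid point is the sum of the increments over the rectangle between the origin and that point, which matches the covariance $\prod_{l=1}^{2}\min(s_l,t_l)$ at the grid points.

    import numpy as np

    def brownian_sheet(n_steps=200, T=1.0, seed=None):
        """Approximate a (2,1)-Brownian sheet on a regular grid over [0, T]^2."""
        rng = np.random.default_rng(seed)
        h = T / n_steps
        # one independent centred Gaussian increment per grid cell,
        # with variance equal to the cell area h^2
        increments = rng.normal(0.0, h, size=(n_steps, n_steps))
        # summing the increments over [0, s] x [0, t] gives the value at (s, t)
        sheet = increments.cumsum(axis=0).cumsum(axis=1)
        # prepend the zero boundary: the sheet vanishes on the coordinate axes
        return np.pad(sheet, ((1, 0), (1, 0)))

    B = brownian_sheet(n_steps=200, seed=42)
    print(B.shape)  # (201, 201); B[0, :] and B[:, 0] are identically zero

Refining the grid (larger n_steps) improves the approximation of the continuous-parameter process.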

Properties

From the definition it follows that

$$B_{(0,\dots,0)}=0$$

almost surely.
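
Indeed, setting $s=t=(0,\dots,0)$ in the covariance formula gives the one-line argument

$$\operatorname{Var}\bigl(B^{(i)}_{(0,\dots,0)}\bigr)=\prod_{l=1}^{n}\min(0,0)=0,\qquad 1\leq i\leq d,$$

so each component of $B_{(0,\dots,0)}$ is a centred Gaussian variable with variance zero and hence vanishes almost surely.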

Examples

  - The $(1,1)$-Brownian sheet is the standard one-dimensional Brownian motion.
  - The $(1,d)$-Brownian sheet is the Brownian motion in $\mathbb{R}^d$.
  - The $(2,d)$-Brownian sheet is the two-parameter case that some authors refer to simply as the Brownian sheet.

Lévy's definition of the multiparametric Brownian motion

In Lévy's definition one replaces the covariance condition above with the following condition:

$$\mathbb{E}\bigl[B_s B_t\bigr]=\tfrac{1}{2}\bigl(|t|+|s|-|t-s|\bigr),$$

where $|\cdot|$ is the Euclidean metric on $\mathbb{R}^n$. [3]
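
For $n=1$ and $s,t\geq 0$ this reduces to the familiar Brownian covariance,

$$\tfrac{1}{2}\bigl(|t|+|s|-|t-s|\bigr)=\min(s,t),$$

so both definitions recover standard Brownian motion for a one-dimensional time parameter; they differ only for $n\geq 2$.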

Existence of abstract Wiener measure

Consider a space $\Theta$ of continuous functions $f:\mathbb{R}_+^n\to\mathbb{R}$ satisfying a suitable growth condition at infinity. Equipped with a corresponding weighted supremum norm, this space becomes a separable Banach space.

Notice that this space contains, as a dense subspace, the space of continuous functions vanishing at infinity equipped with the uniform norm, since one can bound the uniform norm from above by the $L^1$-norm of the Fourier transform through the Fourier inversion theorem.
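
The bound alluded to here is presumably the standard Sobolev-type estimate obtained from the Fourier inversion formula together with the Cauchy–Schwarz inequality; the following is a sketch under that assumption (with the convention $\hat f(\xi)=\int_{\mathbb{R}^n}f(x)e^{-ix\cdot\xi}\,dx$), not a statement taken from the cited references. For $k>n/2$,

$$\|f\|_{\infty}\;\leq\;(2\pi)^{-n}\,\|\hat f\|_{L^1}\;\leq\;(2\pi)^{-n}\Bigl(\int_{\mathbb{R}^n}\bigl(1+|\xi|^2\bigr)^{-k}\,d\xi\Bigr)^{1/2}\,\|f\|_{H^k},\qquad \|f\|_{H^k}^2:=\int_{\mathbb{R}^n}\bigl(1+|\xi|^2\bigr)^{k}\bigl|\hat f(\xi)\bigr|^2\,d\xi,$$

which is the kind of estimate that lets a Sobolev space of sufficiently high order embed continuously into the continuous functions vanishing at infinity.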

Let $\mathcal{S}'$ be the space of tempered distributions. One can then show that there exists a suitable separable Hilbert space (and Sobolev space) $H\subseteq\mathcal{S}'$ that is continuously embedded as a dense subspace in the space of continuous functions vanishing at infinity, and thus also in $\Theta$, and that there exists a probability measure $\mu$ on $\Theta$ such that the triple $(H,\Theta,\mu)$ is an abstract Wiener space.

A path $\theta\in\Theta$ is $\mu$-almost surely

  - Hölder continuous of order $\alpha$ for every $\alpha\in(0,1/2)$,
  - nowhere Hölder continuous of any order $\alpha>1/2$. [4]

This handles the existence of a Brownian sheet in the case $d=1$. For higher-dimensional $d$, the construction is similar.
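
As a usage note for the higher-dimensional case, keeping to the illustrative grid sketch from the definition section (so again an assumption, not part of the cited construction): since the components of an $(n,d)$-Brownian sheet are independent, a $(2,d)$-sheet can be simulated by stacking $d$ independent copies of the scalar simulation.

    # assumes the illustrative brownian_sheet function from the sketch above
    import numpy as np

    d = 3
    sheets = np.stack([brownian_sheet(n_steps=200, seed=k) for k in range(d)], axis=-1)
    print(sheets.shape)  # (201, 201, 3): one independent scalar sheet per component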

References

  1. Walsh, John B. (1986). An Introduction to Stochastic Partial Differential Equations. Springer Berlin Heidelberg. p. 269. ISBN 978-3-540-39781-6.
  2. Khoshnevisan, Davar; Xiao, Yimin (2004). "Images of the Brownian Sheet". arXiv:math/0409491.
  3. Ossiander, Mina; Pyke, Ronald (1985). "Lévy's Brownian motion as a set-indexed process and a related central limit theorem". Stochastic Processes and their Applications. 21 (1): 133–145. doi:10.1016/0304-4149(85)90382-5.
  4. Stroock, Daniel (2011). Probability Theory: An Analytic View (2nd ed.). Cambridge University Press. pp. 349–352.