Brownian bridge

[Figure: Brownian motion pinned at both ends, illustrating a Brownian bridge.]

A Brownian bridge is a continuous-time Gaussian process B(t) whose probability distribution is the conditional probability distribution of a standard Wiener process W(t) (a mathematical model of Brownian motion) subject to the condition (when standardized) that W(T) = 0, so that the process is pinned to the same value at both t = 0 and t = T. More precisely:

B(t) := \bigl(W(t) \mid W(T) = 0\bigr), \qquad t \in [0, T].


The expected value of the bridge at any t in the interval [0, T] is zero, with variance t(T − t)/T, implying that the most uncertainty is in the middle of the bridge, with zero uncertainty at the nodes. The covariance of B(s) and B(t) is min(s, t) − st/T, or s(T − t)/T if s < t. The increments in a Brownian bridge are not independent.
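
A short derivation sketch (not spelled out in the source), using the representation B(t) = W(t) − (t/T) W(T) given in the next section together with Cov(W(s), W(t)) = min(s, t):

\operatorname{Cov}\bigl(B(s), B(t)\bigr) = \operatorname{Cov}\!\left(W(s) - \tfrac{s}{T} W(T),\; W(t) - \tfrac{t}{T} W(T)\right) = \min(s, t) - \tfrac{st}{T} - \tfrac{st}{T} + \tfrac{st}{T} = \min(s, t) - \frac{st}{T},

and setting s = t recovers the variance t(T − t)/T.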

Relation to other stochastic processes

If W(t) is a standard Wiener process (i.e., for t ≥ 0, W(t) is normally distributed with expected value 0 and variance t, and the increments are stationary and independent), then

B(t) = W(t) - \frac{t}{T}\, W(T)

is a Brownian bridge for t ∈ [0, T]. It is independent of W(T).[1]
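
As a minimal simulation sketch (not from the source; the horizon, grid size and path count below are arbitrary choices), the transformation above can be applied to simulated Wiener paths and checked against the variance formula t(T − t)/T:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, n_paths = 2.0, 1000, 20000            # horizon, grid steps, sample paths (arbitrary)
dt = T / n
t = np.linspace(0.0, T, n + 1)

# Wiener paths: cumulative sums of independent N(0, dt) increments, starting at 0.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

# Brownian bridge via B(t) = W(t) - (t/T) W(T); pinned to 0 at t = 0 and t = T.
B = W - (t / T) * W[:, [-1]]

print(B[:, 0].max(), B[:, -1].max())        # both exactly 0
print(B[:, n // 2].var(), T / 4)            # empirical vs. theoretical variance at t = T/2
```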

Conversely, if B(t) is a Brownian bridge on [0, 1] and Z is a standard normal random variable independent of B, then the process

W(t) = B(t) + t Z

is a Wiener process for t ∈ [0, 1]. More generally, a Wiener process W(t) for t ∈ [0, T] can be decomposed into

W(t) = \sqrt{T}\, B\!\left(\frac{t}{T}\right) + \frac{t}{\sqrt{T}}\, Z,

where B is a standard Brownian bridge on [0, 1] and Z = W(T)/\sqrt{T} is a standard normal random variable independent of B.

Another representation of the Brownian bridge based on the Brownian motion is, for t ∈ [0, T],

B(t) = \frac{T - t}{\sqrt{T}}\, W\!\left(\frac{t}{T - t}\right).

Conversely, for t ∈ [0, ∞),

W(t) = \frac{T + t}{T}\, B\!\left(\frac{tT}{T + t}\right).

The Brownian bridge may also be represented as a Fourier series with stochastic coefficients, as

B(t) = \sum_{k=1}^{\infty} Z_k \frac{\sqrt{2T}}{k\pi} \sin\!\left(\frac{k\pi t}{T}\right),

where Z_1, Z_2, … are independent, identically distributed standard normal random variables (see the Karhunen–Loève theorem).
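
The following sketch (an illustration, not from the source; the truncation level is an arbitrary choice) samples a path by truncating the series:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_terms = 1.0, 500                        # horizon and series truncation (arbitrary)
t = np.linspace(0.0, T, 1001)

# Truncated series: B(t) ~ sum_k Z_k * sqrt(2T)/(k*pi) * sin(k*pi*t/T)
k = np.arange(1, n_terms + 1)
Z = rng.standard_normal(n_terms)
B = (Z * np.sqrt(2.0 * T) / (k * np.pi)) @ np.sin(np.outer(k, t) * np.pi / T)

print(B[0], B[-1])                           # both ~0: every term vanishes at the endpoints
```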

A Brownian bridge arises as the limit process in Donsker's theorem in the area of empirical processes. It is also used in the Kolmogorov–Smirnov test in the area of statistical inference.

Let K := \sup_{t \in [0,T]} |B(t)| / \sqrt{T}; then the cumulative distribution function of K is given by[2]

\Pr(K \le x) = 1 - 2\sum_{k=1}^{\infty} (-1)^{k-1} e^{-2 k^2 x^2} = \frac{\sqrt{2\pi}}{x} \sum_{k=1}^{\infty} e^{-(2k-1)^2 \pi^2 / (8 x^2)}.
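
This is the Kolmogorov distribution used in the Kolmogorov–Smirnov test. A small numerical sketch (an illustration, with an arbitrary truncation of the alternating series):

```python
import numpy as np

def kolmogorov_cdf(x: float, n_terms: int = 100) -> float:
    """P(K <= x) via the alternating series 1 - 2*sum_k (-1)^(k-1) exp(-2 k^2 x^2)."""
    if x <= 0.0:
        return 0.0
    k = np.arange(1, n_terms + 1)
    return float(1.0 - 2.0 * np.sum((-1.0) ** (k - 1) * np.exp(-2.0 * k**2 * x**2)))

# The familiar ~1.36 critical value of the Kolmogorov-Smirnov test at the 5% level.
print(kolmogorov_cdf(1.36))                  # ~0.9505
```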

Intuitive remarks

A standard Wiener process satisfies W(0) = 0 and is therefore "tied down" to the origin, but other points are not restricted. In a Brownian bridge process, on the other hand, not only is B(0) = 0 but we also require that B(T) = 0, that is, the process is "tied down" at t = T as well. Just as a literal bridge is supported by pylons at both ends, a Brownian bridge is required to satisfy conditions at both ends of the interval [0, T]. (In a slight generalization, one sometimes requires B(t1) = a and B(t2) = b, where t1, t2, a and b are known constants.)

Suppose we have generated a number of points W(0), W(1), W(2), W(3), etc. of a Wiener process path by computer simulation. It is now desired to fill in additional points in the interval [0, T], that is, to interpolate between the already generated points W(0) and W(T). The solution is to use a Brownian bridge that is required to go through the values W(0) and W(T), as in the sketch below.
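
A minimal sketch of this interpolation step (not the source's code; the endpoint values and refinement grid are illustrative), sampling new points one at a time from the conditional law given in the general case below:

```python
import numpy as np

def bridge_fill(t0, w0, t1, w1, ts, rng):
    """Sample W at increasing times ts in (t0, t1), given W(t0) = w0 and W(t1) = w1.

    Each point is drawn from the Brownian-bridge conditional law between the most
    recently sampled point on the left and the fixed right endpoint (Markov property).
    """
    out = []
    for t in ts:
        mean = w0 + (t - t0) / (t1 - t0) * (w1 - w0)
        var = (t - t0) * (t1 - t) / (t1 - t0)
        t0, w0 = t, mean + np.sqrt(var) * rng.standard_normal()
        out.append(w0)
    return np.array(out)

rng = np.random.default_rng(2)
# Refine between two already simulated values, e.g. W(0) = 0.0 and W(1) = 0.3.
print(bridge_fill(0.0, 0.0, 1.0, 0.3, np.linspace(0.1, 0.9, 9), rng))
```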

General case

For the general case when W(t1) = a and W(t2) = b, the distribution of B at time t ∈ (t1, t2) is normal, with mean

a + \frac{t - t_1}{t_2 - t_1}\,(b - a)

and variance

\frac{(t - t_1)(t_2 - t)}{t_2 - t_1},

and the covariance between B(s) and B(t), with s < t, is

\frac{(s - t_1)(t_2 - t)}{t_2 - t_1}.
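
As a quick consistency check (not spelled out in the source), setting t_1 = 0, t_2 = T and a = b = 0 recovers the pinned bridge of the introduction:

0 + \frac{t - 0}{T - 0}(0 - 0) = 0, \qquad \frac{(t - 0)(T - t)}{T - 0} = \frac{t(T - t)}{T}, \qquad \frac{(s - 0)(T - t)}{T - 0} = \frac{s(T - t)}{T}.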

References

  1. Mansuy, R.; Yor, M. (2008). Aspects of Brownian Motion. Springer. p. 2.
  2. Marsaglia, G.; Tsang, W. W.; Wang, J. (2003). "Evaluating Kolmogorov's Distribution". Journal of Statistical Software. 8 (18): 1–4. doi:10.18637/jss.v008.i18.