Dimension function

In mathematics, the notion of an (exact) dimension function (also known as a gauge function) is a tool in the study of fractals and other subsets of metric spaces. Dimension functions are a generalisation of the simple "diameter to the dimension" power law used in the construction of s-dimensional Hausdorff measure.

Motivation: s-dimensional Hausdorff measure

Consider a metric space (X, d) and a subset E of X. Given a number s ≥ 0, the s-dimensional Hausdorff measure of E, denoted μs(E), is defined by

\[
\mu^{s}(E) = \lim_{\delta \to 0} \mu^{s}_{\delta}(E),
\]

where

\[
\mu^{s}_{\delta}(E) = \inf \Big\{ \sum_{i \in I} \operatorname{diam}(C_i)^{s} \;:\; E \subseteq \bigcup_{i \in I} C_i,\ \operatorname{diam}(C_i) \leq \delta \Big\},
\]

the infimum being taken over all countable covers (C_i)_{i ∈ I} of E by sets of diameter at most δ.

μδs(E) can be thought of as an approximation to the "true" s-dimensional area/volume of E given by calculating the minimal s-dimensional area/volume of a covering of E by sets of diameter at most δ.
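To see the power law at work, consider the unit interval [0, 1] ⊆ R (a standard computation, included here purely for illustration). Covering [0, 1] by the N intervals of length 1/N gives

\[
\mu^{s}_{1/N}([0,1]) \leq \sum_{i=1}^{N} \Big(\frac{1}{N}\Big)^{s} = N^{1-s},
\]

which tends to 0 as N → ∞ whenever s > 1, so μs([0,1]) = 0 for every s > 1. For s = 1, every countable cover has total length at least 1, and indeed μ1([0,1]) = 1, while for s < 1 one can check that μs([0,1]) = +∞.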

The quantity μs(E) is non-increasing as a function of s. In fact, for all values of s except possibly one, μs(E) is either 0 or +∞; this exceptional value is called the Hausdorff dimension of E, here denoted dimH(E). Intuitively speaking, μs(E) = +∞ for s < dimH(E) for the same reason that the 1-dimensional linear length of a 2-dimensional disc in the Euclidean plane is +∞; likewise, μs(E) = 0 for s > dimH(E) for the same reason that the 3-dimensional volume of a disc in the Euclidean plane is zero.
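Hausdorff dimension is awkward to compute numerically, but for self-similar sets such as the middle-thirds Cantor set it coincides with the simpler box-counting dimension. The following minimal sketch (the function names are invented here for illustration) estimates the Cantor set's dimension log 2/log 3 ≈ 0.6309:

```python
import numpy as np

def cantor_left_endpoints(level: int) -> np.ndarray:
    """Left endpoints of the 2**level intervals at stage `level` of the
    middle-thirds Cantor construction; all of them lie in the Cantor set."""
    pts = np.array([0.0])
    for _ in range(level):
        pts = np.concatenate([pts / 3.0, pts / 3.0 + 2.0 / 3.0])
    return pts

def box_dimension_estimate(pts: np.ndarray, max_k: int) -> float:
    """Count the boxes of side 3**-k hit by the points for k = 1..max_k,
    then fit the slope of log N(eps) against log(1/eps)."""
    ks = np.arange(1, max_k + 1)
    counts = [np.unique(np.floor(pts * 3.0 ** k)).size for k in ks]
    slope, _ = np.polyfit(ks * np.log(3.0), np.log(counts), 1)
    return slope

pts = cantor_left_endpoints(12)
print(box_dimension_estimate(pts, 12))  # ~0.6309, i.e. log 2 / log 3
```

Using boxes of side 3^(-k) keeps the count exact (2^k boxes are hit at scale k), so the fitted slope recovers log 2/log 3 essentially exactly.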

The idea of a dimension function is to use functions of the diameter more general than the powers diam(C)s, and to look for the same property: the resulting Hausdorff-type measure of E being finite and non-zero.

Definition

Let (X, d) be a metric space and E ⊆ X. Let h : [0, +∞) → [0, +∞] be a function. Define μh(E) by

\[
\mu^{h}(E) = \lim_{\delta \to 0} \mu^{h}_{\delta}(E),
\]

where

\[
\mu^{h}_{\delta}(E) = \inf \Big\{ \sum_{i \in I} h\big(\operatorname{diam}(C_i)\big) \;:\; E \subseteq \bigcup_{i \in I} C_i,\ \operatorname{diam}(C_i) \leq \delta \Big\}.
\]

Then h is called an (exact) dimension function (or gauge function) for E if μh(E) is finite and strictly positive. There are many conventions as to the properties that h should have: Rogers (1998), for example, requires that h should be monotonically increasing for t ≥ 0, strictly positive for t > 0, and continuous on the right for all t ≥ 0.
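The power functions recover the motivating construction: taking h(t) = t^s for a fixed s ≥ 0 gives

\[
\mu^{h}(E) = \mu^{s}(E),
\]

so h(t) = t^s is an exact dimension function for E precisely when 0 < μs(E) < +∞. For instance, for the middle-thirds Cantor set C one may take h(t) = t^{log 2/log 3}; it is a classical fact that μh(C) = 1.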

Packing dimension

Packing dimension is constructed in a very similar way to Hausdorff dimension, except that one "packs" E from inside with pairwise disjoint balls of diameter at most δ. Just as before, one can consider functions h : [0, +∞)  [0, +∞] more general than h(δ) = δs and call h an exact dimension function for E if the h-packing measure of E is finite and strictly positive.
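One common two-step construction of the h-packing measure (conventions differ slightly between authors; this is a sketch of the usual one) first forms the packing pre-measure

\[
P^{h}_{\delta}(E) = \sup \Big\{ \sum_{i \in I} h\big(\operatorname{diam}(B_i)\big) \;:\; \{B_i\}_{i \in I} \text{ pairwise disjoint balls with centres in } E,\ \operatorname{diam}(B_i) \leq \delta \Big\},
\qquad
P^{h}_{0}(E) = \lim_{\delta \to 0} P^{h}_{\delta}(E);
\]

since P^h_0 is not countably subadditive, one then sets

\[
P^{h}(E) = \inf \Big\{ \sum_{j} P^{h}_{0}(E_j) \;:\; E \subseteq \bigcup_{j} E_j \Big\}.
\]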

Example

Almost surely, a sample path X of Brownian motion in the Euclidean plane has Hausdorff dimension equal to 2, but the 2-dimensional Hausdorff measure μ2(X) is zero. The exact dimension function h is given by the logarithmic correction

\[
h(r) = r^{2} \cdot \log\frac{1}{r} \cdot \log\log\log\frac{1}{r}.
\]

That is, with probability one, 0 < μh(X) < +∞ for a Brownian path X in R2. This is consistent with μ2(X) = 0: since r2/h(r) → 0 as r → 0+, finiteness of μh(X) forces μ2(X) = 0. For Brownian motion in Euclidean n-space Rn with n ≥ 3, the exact dimension function is

\[
h(r) = r^{2} \cdot \log\log\frac{1}{r}.
\]

References

Rogers, C. A. (1998). Hausdorff Measures. Cambridge Mathematical Library. Cambridge University Press.