Topological entropy

In mathematics, the topological entropy of a topological dynamical system is a nonnegative extended real number that measures the complexity of the system. Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew. Their definition was modelled after the definition of the Kolmogorov–Sinai, or metric, entropy. Later, Dinaburg and Rufus Bowen gave a different, weaker definition reminiscent of the Hausdorff dimension. The second definition clarified the meaning of the topological entropy: for a system given by an iterated function, the topological entropy represents the exponential growth rate of the number of distinguishable orbits of the iterates. An important variational principle relates the notions of topological and measure-theoretic entropy.

Definition

A topological dynamical system consists of a Hausdorff topological space X (usually assumed to be compact) and a continuous self-map f : X → X. Its topological entropy is a nonnegative extended real number that can be defined in various ways, which are known to be equivalent.

Definition of Adler, Konheim, and McAndrew

Let X be a compact Hausdorff topological space. For any finite open cover C of X, let H(C) be the logarithm (usually to base 2) of the smallest number of elements of C that cover X. [1] For two covers C and D, let C ∨ D be their (minimal) common refinement, which consists of all the non-empty intersections of a set from C with a set from D, and similarly for multiple covers.

For any continuous map f : X → X, the following limit exists:

H(f, C) = lim_{n → ∞} (1/n) H(C ∨ f^{-1}C ∨ ⋯ ∨ f^{-(n-1)}C).

Then the topological entropy of f, denoted h(f), is defined to be the supremum of H(f,C) over all possible finite covers C of X.
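The quantities in this definition can be computed directly on a toy finite space. The sketch below is illustrative only: the four-point space X and the covers C and D are arbitrary choices, and the minimal subcover is found by brute force (minimal set cover is expensive in general).

```python
import math
from itertools import combinations

def join(cover1, cover2):
    """Common refinement C ∨ D: all non-empty pairwise intersections."""
    return [a & b for a in cover1 for b in cover2 if a & b]

def H(cover, space):
    """log2 of the size of the smallest subcover of `space` (brute force)."""
    for r in range(1, len(cover) + 1):
        for sub in combinations(cover, r):
            if set().union(*sub) >= space:
                return math.log2(r)
    raise ValueError("not a cover of the space")

# A toy 4-point space with two covers (sets chosen purely for illustration).
X = set(range(4))
C = [{0, 1}, {1, 2}, {2, 3}, {3, 0}]
D = [{0, 1, 2}, {2, 3, 0}]
print(H(C, X), H(join(C, D), X))  # both 1.0: two sets suffice in each case
```

For a dynamical system one would take the second cover to be f^{-1}C and iterate the join, but the combinatorics of H and ∨ are already visible here.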

Interpretation

The parts of C may be viewed as symbols that (partially) describe the position of a point x in X: all points x ∈ C_i are assigned the symbol C_i. Imagine that the position of x is (imperfectly) measured by a certain device and that each part of C corresponds to one possible outcome of the measurement. The quantity H(C ∨ f^{-1}C ∨ ⋯ ∨ f^{-(n-1)}C) then represents the logarithm of the minimal number of "words" of length n needed to encode the points of X according to the behavior of their first n − 1 iterates under f, or, put differently, the logarithm of the total number of "scenarios" of the behavior of these iterates, as "seen" by the partition C. Thus the topological entropy is the average (per iteration) amount of information needed to describe long iterations of the map f.
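This symbolic picture can be made concrete for the doubling map x ↦ 2x (mod 1), whose entropy is log 2, using the two-part partition [0, 1/2), [1/2, 1) in place of a cover. The map, partition, and grid of starting points below are illustrative choices, not part of the general definition.

```python
import math

def itinerary(x, n):
    """Record which part of the partition {[0, 1/2), [1/2, 1)} each of the
    first n iterates of the doubling map x -> 2x (mod 1) falls into."""
    word = []
    for _ in range(n):
        word.append(0 if x < 0.5 else 1)
        x = (2.0 * x) % 1.0
    return tuple(word)

# Count the distinct "scenarios" of length n seen from a fine grid of points.
n = 8
words = {itinerary(i / 4096, n) for i in range(4096)}
print(math.log2(len(words)) / n)  # -> 1.0 bit per iteration, i.e. entropy log 2
```

All 2^n binary words occur as itineraries, so the per-iteration information is exactly one bit, matching h = log 2.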

Definition of Bowen and Dinaburg

This definition [2] [3] [4] uses a metric on X (actually, a uniform structure would suffice). It is a narrower definition than that of Adler, Konheim, and McAndrew, [5] as it requires an additional metric structure on the topological space (but is independent of the choice of metric generating the given topology). However, in practice, the Bowen–Dinaburg topological entropy is usually much easier to calculate.

Let (X, d) be a compact metric space and f : X → X a continuous map. For each natural number n, a new metric d_n is defined on X by the formula

d_n(x, y) = max{ d(f^i(x), f^i(y)) : 0 ≤ i < n }.

Given any ε > 0 and n ≥ 1, two points of X are ε-close with respect to this metric if their first n iterates are ε-close. This metric allows one to distinguish in a neighborhood of an orbit the points that move away from each other during the iteration from the points that travel together. A subset E of X is said to be (n, ε)-separated if each pair of distinct points of E is at least ε apart in the metric dn. Denote by N(n, ε) the maximum cardinality of an (n, ε)-separated set. The topological entropy of the map f is defined by

h(f) = lim_{ε → 0} limsup_{n → ∞} (1/n) log N(n, ε).
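A minimal numerical sketch of this definition, again for the doubling map x ↦ 2x (mod 1) on the circle (entropy log 2): a greedy pass over a grid builds an (n, ε)-separated set. The grid size, n, and ε are ad-hoc choices; at finite n with fixed ε the estimate overshoots log 2, and convergence requires n → ∞ before ε → 0.

```python
import math

def doubling_map(x):
    """x -> 2x (mod 1) on the circle; its topological entropy is log 2."""
    return (2.0 * x) % 1.0

def circle_dist(a, b):
    """Distance on the circle R/Z."""
    d = abs(a - b)
    return min(d, 1.0 - d)

def d_n(f, dist, x, y, n):
    """Bowen's metric d_n: largest distance along the first n iterates."""
    best = 0.0
    for _ in range(n):
        best = max(best, dist(x, y))
        x, y = f(x), f(y)
    return best

def separated_count(f, dist, points, n, eps):
    """Greedily build an (n, eps)-separated set; its size lower-bounds N(n, eps).
    Checking recently accepted points first makes rejections fail fast."""
    sep = []
    for p in points:
        if all(d_n(f, dist, p, q, n) >= eps for q in reversed(sep)):
            sep.append(p)
    return len(sep)

# Estimate (1/n) log N(n, eps) on a grid of starting points.
grid = [i / 2048 for i in range(2048)]
n, eps = 7, 0.1
N = separated_count(doubling_map, circle_dist, grid, n, eps)
est = math.log(N) / n
print(N, est)  # overestimates log 2 ~ 0.693 at this small n; improves as n grows
```

Nearby points are accepted as separated once some iterate pushes them at least ε apart, which for the doubling map happens at rate 2 per step; this is exactly the stretching that the entropy quantifies.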

Interpretation

Since X is compact, N(n, ε) is finite and represents the number of distinguishable orbit segments of length n, assuming that we cannot distinguish points within ε of one another. A straightforward argument shows that the limit defining h(f) always exists in the extended real line (but could be infinite). This limit may be interpreted as the measure of the average exponential growth of the number of distinguishable orbit segments. In this sense, it measures the complexity of the topological dynamical system (X, f). Rufus Bowen extended this definition of topological entropy in a way which permits X to be non-compact under the assumption that the map f is uniformly continuous.

Properties

Topological entropy is an invariant of topological conjugacy: if f and g are conjugate via a homeomorphism, then h(f) = h(g). For every m ≥ 1 one has h(f^m) = m·h(f), and if f is a homeomorphism then h(f^{-1}) = h(f). By the variational principle, h(f) equals the supremum of the measure-theoretic entropies h_μ(f) taken over all f-invariant Borel probability measures μ on X. [6]

Examples

Let σ denote the full shift on k symbols. Its topological entropy is log k. The measure-theoretic entropy of the Bernoulli (1/k, …, 1/k)-measure is also log k. Hence it is a measure of maximal entropy. Furthermore, it can be shown that no other measures of maximal entropy exist.
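For the full shift on k symbols, the value log k can be read off from word counts: there are exactly k^n admissible words of length n. The brute-force enumeration below confirms this, and, as a second illustration (not from the text above), also counts words for the golden-mean shift, the subshift forbidding two consecutive 1s, whose counts are Fibonacci numbers.

```python
import math
from itertools import product

def count_words(k, n, allowed=None):
    """Count admissible length-n words over {0,...,k-1}; `allowed` is an
    optional predicate on adjacent symbols (a nearest-neighbour subshift rule)."""
    total = 0
    for w in product(range(k), repeat=n):
        if allowed is None or all(allowed(a, b) for a, b in zip(w, w[1:])):
            total += 1
    return total

k, n = 2, 10
# Full shift: k**n words, so the growth rate (1/n) log(count) is exactly log k.
print(math.log(count_words(k, n)) / n)  # = log 2 ~ 0.6931

# Golden-mean shift (no two consecutive 1s): counts are Fibonacci numbers, and
# the growth rate tends to log((1 + sqrt(5))/2) ~ 0.4812 as n grows.
no_11 = lambda a, b: not (a == 1 and b == 1)
print(math.log(count_words(k, n, no_11)) / n)
```

The exponential growth rate of admissible words coincides with the Bowen–Dinaburg entropy for shift spaces, which is what makes such counts a convenient computational shortcut.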

Notes

  1. Since X is compact, H(C) is always finite, even for an infinite cover C. The use of arbitrary covers yields the same value of entropy.
  2. Bowen, Rufus (1971). "Entropy for Group Endomorphisms and Homogeneous Spaces". Transactions of the American Mathematical Society. 153: 401–414. doi:10.1090/S0002-9947-1971-0274707-X. ISSN 0002-9947.
  3. Bowen, Rufus (1971). "Periodic Points and Measures for Axiom A Diffeomorphisms". Transactions of the American Mathematical Society. 154: 377–397. doi:10.2307/1995452. ISSN 0002-9947. JSTOR 1995452.
  4. Dinaburg, Efim (1970). "Relationship between Topological Entropy and Metric Entropy". Doklady Akademii Nauk SSSR. 170: 19.
  5. Adler, R. L.; Konheim, A. G.; McAndrew, M. H. (1965). "Topological Entropy". Transactions of the American Mathematical Society. 114 (2): 309. doi:10.1090/S0002-9947-1965-0175106-9. ISSN 0002-9947.


References

This article incorporates material from Topological Entropy on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.