Flow (mathematics)

[Figure: Flow in phase space specified by the differential equation of a pendulum. On the horizontal axis, the pendulum position; on the vertical axis, its velocity.]

In mathematics, a flow formalizes the idea of the motion of particles in a fluid. Flows are ubiquitous in science, including engineering and physics. The notion of flow is basic to the study of ordinary differential equations. Informally, a flow may be viewed as a continuous motion of points over time. More formally, a flow is a group action of the real numbers on a set.

The idea of a vector flow, that is, the flow determined by a vector field, occurs in the areas of differential topology, Riemannian geometry and Lie groups. Specific examples of vector flows include the geodesic flow, the Hamiltonian flow, the Ricci flow, the mean curvature flow, and Anosov flows. Flows may also be defined for systems of random variables and stochastic processes, and occur in the study of ergodic dynamical systems. The most celebrated of these is perhaps the Bernoulli flow.

Formal definition

A flow on a set X is a group action of the additive group of real numbers on X. More explicitly, a flow is a mapping

φ : X × ℝ → X

such that, for all x ∈ X and all real numbers s and t,

φ(x, 0) = x,
φ(φ(x, t), s) = φ(x, s + t).

It is customary to write φ^t(x) instead of φ(x, t), so that the equations above can be expressed as φ^0 = Id (the identity function) and φ^s ∘ φ^t = φ^{s+t} (group law). Then, for all t ∈ ℝ, the mapping φ^t : X → X is a bijection with inverse φ^{−t} : X → X. This follows from the above definition, and the real parameter t may be taken as a generalized functional power, as in function iteration.
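
As a concrete illustration (not taken from the article), the following minimal Python sketch checks the two defining properties for the flow φ(x, t) = x·e^t, which is the flow of the equation ẋ = x on the real line.

```python
# A minimal sketch: the map phi(x, t) = x * exp(t), the flow of x' = x on R,
# satisfies the defining properties of a flow.
import math

def phi(x, t):
    """Example flow on X = R: phi(x, t) = x * e^t (chosen for illustration)."""
    return x * math.exp(t)

x, s, t = 1.7, 0.3, -1.2
assert abs(phi(x, 0.0) - x) < 1e-12                    # identity: phi^0 = id
assert abs(phi(phi(x, t), s) - phi(x, s + t)) < 1e-12  # group law
assert abs(phi(phi(x, t), -t) - x) < 1e-12             # phi^{-t} inverts phi^t
```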

Flows are usually required to be compatible with structures furnished on the set X. In particular, if X is equipped with a topology, then φ is usually required to be continuous. If X is equipped with a differentiable structure, then φ is usually required to be differentiable. In these cases the flow forms a one-parameter group of homeomorphisms and diffeomorphisms, respectively.

In certain situations one might also consider local flows, which are defined only on some subset

dom(φ) = {(x, t) : t ∈ [a_x, b_x]} ⊆ X × ℝ,

called the flow domain of φ. This is often the case with the flows of vector fields.

Alternative notations

It is very common in many fields, including engineering, physics and the study of differential equations, to use a notation that makes the flow implicit. Thus, x(t) is written for φ^t(x_0), and one might say that the variable x depends on the time t and the initial condition x = x_0. Examples are given below.

In the case of the flow of a vector field V on a smooth manifold X, the flow is often denoted in such a way that its generator is made explicit. For example,

Φ_V : X × ℝ → X,   Φ_V(x, t) = Φ_V^t(x) = e^{tV}(x).

Orbits

Given x in X, the set O(x) := {φ(x, t) : t ∈ ℝ} is called the orbit of x under φ. Informally, it may be regarded as the trajectory of a particle that was initially positioned at x. If the flow is generated by a vector field, then its orbits are the images of its integral curves.

Examples

Algebraic equation

Let f : ℝ → X be a time-dependent trajectory which is a bijective function. Then a flow can be defined by

φ(x, t) = f(t + f^{−1}(x)).
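
For instance, with the hypothetical choice f(t) = t³, which is a bijection of the real line (chosen here for illustration, not from the article), the formula above can be checked numerically:

```python
# A minimal sketch, assuming the bijection f(t) = t**3 on R.
# The induced flow is phi(x, t) = f(t + f^{-1}(x)).
import math

def f(t):
    return t ** 3

def f_inv(y):
    # real cube root, preserving the sign of y
    return math.copysign(abs(y) ** (1.0 / 3.0), y)

def phi(x, t):
    return f(t + f_inv(x))

x, s, t = 8.0, 0.5, 1.5
assert abs(phi(x, 0.0) - x) < 1e-9                    # identity
assert abs(phi(phi(x, t), s) - phi(x, s + t)) < 1e-9  # group law
```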

Autonomous systems of ordinary differential equations

Let F : ℝ^n → ℝ^n be a (time-independent) vector field and x : ℝ → ℝ^n the solution of the initial value problem

ẋ(t) = F(x(t)),   x(0) = x_0.

Then φ(x_0, t) = x(t) is the flow of the vector field F. It is a well-defined local flow provided that the vector field F is Lipschitz continuous. Then φ is also Lipschitz continuous wherever defined. In general it may be hard to show that the flow φ is globally defined, but one simple criterion is that the vector field F is compactly supported.
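
As a rough numerical sketch (assuming SciPy is available and choosing, purely for illustration, the rotation field F(x₁, x₂) = (−x₂, x₁)), the flow can be approximated by integrating the initial value problem, and the group law verified up to integration error:

```python
# A minimal numerical sketch of the flow of an autonomous vector field.
import numpy as np
from scipy.integrate import solve_ivp

def F(t, x):                      # autonomous: no actual dependence on t
    return np.array([-x[1], x[0]])

def phi(x0, t):
    """Approximate flow of F: integrate x' = F(x) from 0 to t starting at x0."""
    if t == 0.0:
        return np.asarray(x0, dtype=float)
    sol = solve_ivp(F, (0.0, t), x0, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

x0 = np.array([1.0, 0.0])
s, t = 0.7, 1.1
# Group law phi(phi(x0, t), s) = phi(x0, s + t), up to integration error.
print(phi(phi(x0, t), s), phi(x0, s + t))
```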

Time-dependent ordinary differential equations

In the case of time-dependent vector fields F : ℝ^n × ℝ → ℝ^n, one denotes φ^{t, t_0}(x_0) = x(t + t_0), where x : ℝ → ℝ^n is the solution of

ẋ(t) = F(x(t), t),   x(t_0) = x_0.

Then φ^{t, t_0}(x_0) is the time-dependent flow of F. It is not a "flow" by the definition above, but it can easily be seen as one by rearranging its arguments. Namely, the mapping

φ : (ℝ^n × ℝ) × ℝ → ℝ^n × ℝ,   φ((x_0, t_0), t) := (φ^{t + t_0, t_0}(x_0), t + t_0)

indeed satisfies the group law for the last variable:

φ(φ((x_0, t_0), t), s) = φ((φ^{t + t_0, t_0}(x_0), t + t_0), s)
                       = (φ^{s + t + t_0, t + t_0}(φ^{t + t_0, t_0}(x_0)), s + t + t_0)
                       = (φ^{s + t + t_0, t_0}(x_0), s + t + t_0)
                       = φ((x_0, t_0), s + t).

One can see time-dependent flows of vector fields as special cases of time-independent ones by the following trick. Define

G(x, t) := (F(x, t), 1),   y(t) := (x(t + t_0), t + t_0).

Then y(t) is the solution of the "time-independent" initial value problem

ẏ(t) = G(y(t)),   y(0) = (x_0, t_0),

if and only if x(t) is the solution of the original time-dependent initial value problem. Furthermore, the mapping φ is exactly the flow of the "time-independent" vector field G.
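
A short numerical sketch of this autonomization trick, using the illustrative (assumed) time-dependent field F(x, t) = −t·x on the real line, for which the exact solution is known:

```python
# A minimal sketch: the augmented field G(x, t) = (F(x, t), 1) is
# time-independent on R^2, so its flow can be computed as in the
# autonomous case.
import numpy as np
from scipy.integrate import solve_ivp

def F(x, t):
    return -t * x          # example time-dependent vector field (assumed)

def G(tau, y):             # y = (x, t); autonomous augmented system
    x, t = y
    return [F(x, t), 1.0]

x0, t0 = 2.0, 0.5
sol = solve_ivp(G, (0.0, 1.0), [x0, t0], rtol=1e-10, atol=1e-12)
x_end, t_end = sol.y[:, -1]
print(x_end, t_end)        # t_end == t0 + 1, x_end == x(t0 + 1)
# Exact solution of x' = -t x, x(t0) = x0:  x(t) = x0 * exp((t0**2 - t**2) / 2)
print(x0 * np.exp((t0**2 - t_end**2) / 2))
```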

Flows of vector fields on manifolds

The flows of time-independent and time-dependent vector fields are defined on smooth manifolds exactly as they are defined on the Euclidean space and their local behavior is the same. However, the global topological structure of a smooth manifold is strongly manifest in what kind of global vector fields it can support, and flows of vector fields on smooth manifolds are indeed an important tool in differential topology. The bulk of studies in dynamical systems are conducted on smooth manifolds, which are thought of as "parameter spaces" in applications.

Formally: Let M be a differentiable manifold. Let T_p M denote the tangent space of a point p ∈ M. Let TM be the complete tangent manifold; that is, TM = ∪_{p ∈ M} T_p M. Let

f : ℝ × M → TM

be a time-dependent vector field on M; that is, f is a smooth map such that for each t ∈ ℝ and p ∈ M, one has f(t, p) ∈ T_p M; that is, the map x ↦ f(t, x) maps each point to an element of its own tangent space. For a suitable interval I ⊆ ℝ containing 0, the flow of f is a function φ : I × M → M that satisfies

φ(0, x_0) = x_0                               for all x_0 ∈ M,
(d/dt) φ(t, x_0) |_{t = τ} = f(τ, φ(τ, x_0))  for all x_0 ∈ M and τ ∈ I.
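
A toy example of a flow on a manifold (not from the article) is the rigid rotation of the circle S¹ generated by the constant vector field V(θ) = 1; in the angle chart the group law holds modulo 2π:

```python
# A minimal sketch: the flow of V(theta) = 1 on the circle S^1,
# written in the angle chart theta in [0, 2*pi).
import math

TWO_PI = 2.0 * math.pi

def phi(theta, t):
    return (theta + t) % TWO_PI

theta, s, t = 1.0, 5.0, 3.0
# The group law holds on the manifold (i.e. modulo 2*pi), even though the
# chart coordinate "wraps around"; the orbit of every point is the whole circle.
assert math.isclose(phi(phi(theta, t), s), phi(theta, s + t))
```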

Solutions of heat equation

Let Ω be a subdomain (bounded or not) of ℝ^n (with n an integer). Denote by Γ its boundary (assumed smooth). Consider the following heat equation on Ω × (0, T), for T > 0:

u_t − Δu = 0   in Ω × (0, T),
u = 0          on Γ × (0, T),

with the initial condition u(0) = u^0 in Ω.

The equation u = 0 on Γ × (0, T) corresponds to the homogeneous Dirichlet boundary condition. The mathematical setting for this problem can be the semigroup approach. To use this tool, we introduce the unbounded operator Δ_D defined on L^2(Ω) by its domain

D(Δ_D) = H^2(Ω) ∩ H_0^1(Ω)

(see the classical Sobolev spaces with H^k(Ω) = W^{k, 2}(Ω), where H_0^1(Ω) is the closure of the infinitely differentiable functions with compact support in Ω for the H^1(Ω) norm).

For any v ∈ D(Δ_D), we have

Δ_D v = Δv = Σ_{i=1}^{n} ∂²v / ∂x_i².
With this operator, the heat equation becomes u′(t) = Δ_D u(t) with u(0) = u^0. Thus, the flow corresponding to this equation is (see notations above)

φ(u^0, t) = e^{t Δ_D} u^0,

where e^{t Δ_D} is the (analytic) semigroup generated by Δ_D.
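
A minimal numerical sketch of this semigroup flow, assuming Ω = (0, 1) and replacing Δ_D by a standard finite-difference Dirichlet Laplacian (an approximation chosen here purely for illustration):

```python
# Heat semigroup sketch: phi(u0, t) = exp(t * Delta_D) u0, with Delta_D
# approximated by the second-difference matrix with zero boundary values.
import numpy as np
from scipy.linalg import expm

n = 50                               # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Discrete Dirichlet Laplacian on (0, 1).
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

u0 = np.sin(np.pi * x)               # initial datum u^0

def flow(u, t):
    return expm(t * L) @ u

# Semigroup property: exp((s + t) Delta_D) = exp(s Delta_D) exp(t Delta_D).
s, t = 0.01, 0.02
print(np.max(np.abs(flow(flow(u0, t), s) - flow(u0, s + t))))
```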

Solutions of wave equation

Again, let Ω be a subdomain (bounded or not) of ℝ^n (with n an integer). We denote by Γ its boundary (assumed smooth). Consider the following wave equation on Ω × (0, T) (for T > 0):

u_tt − Δu = 0   in Ω × (0, T),
u = 0           on Γ × (0, T),

with the following initial conditions: u(0) = u^{1,0} in Ω and u_t(0) = u^{2,0} in Ω.

Using the same semigroup approach as in the case of the heat equation above, we write the wave equation as a first-order-in-time partial differential equation by introducing the unbounded operator

A = ( 0     Id )
    ( Δ_D   0  ),

with domain D(A) = (H^2(Ω) ∩ H_0^1(Ω)) × H_0^1(Ω) on H = H_0^1(Ω) × L^2(Ω) (the operator Δ_D is defined in the previous example).

We introduce the column vectors

U = (u^1, u^2)^T   (where u^1 = u and u^2 = u_t)   and   U^0 = (u^{1,0}, u^{2,0})^T.

With these notations, the wave equation becomes U′(t) = A U(t) with U(0) = U^0.

Thus, the flow corresponding to this equation is

φ(U^0, t) = e^{t A} U^0,

where e^{t A} is the (unitary) semigroup generated by A.
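
A similar numerical sketch for the wave flow, reusing the same discrete Dirichlet Laplacian as in the heat-equation sketch above (again only an illustrative approximation):

```python
# Wave flow sketch: phi(U0, t) = exp(t A) U0 for the block operator
# A = [[0, Id], [Delta_D, 0]] acting on U = (u, u_t).
import numpy as np
from scipy.linalg import expm

n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

A = np.block([[np.zeros((n, n)), np.eye(n)],
              [L, np.zeros((n, n))]])

u10 = np.sin(np.pi * x)              # initial displacement u^{1,0}
u20 = np.zeros(n)                    # initial velocity u^{2,0}
U0 = np.concatenate([u10, u20])

def flow(U, t):
    return expm(t * A) @ U

print(flow(U0, 0.1)[:n])             # displacement u(., 0.1) on the grid
```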

Bernoulli flow

Ergodic dynamical systems, that is, systems exhibiting randomness, exhibit flows as well. The most celebrated of these is perhaps the Bernoulli flow. The Ornstein isomorphism theorem states that, for any given entropy H, there exists a flow φ(x, t), called the Bernoulli flow, such that the flow at time t = 1, i.e. φ(x, 1), is a Bernoulli shift.

Furthermore, this flow is unique, up to a constant rescaling of time. That is, if ψ(x, t) is another flow with the same entropy, then ψ(x, t) = φ(x, ct) for some constant c. The notion of uniqueness and isomorphism here is that of the isomorphism of dynamical systems. Many dynamical systems, including Sinai's billiards and Anosov flows, are isomorphic to Bernoulli shifts.


