Flow (mathematics)

Figure: Flow in phase space specified by the differential equation of a pendulum. On the horizontal axis, the pendulum position; on the vertical one, its velocity.

In mathematics, a flow formalizes the idea of the motion of particles in a fluid. Flows are ubiquitous in science, including engineering and physics. The notion of flow is basic to the study of ordinary differential equations. Informally, a flow may be viewed as a continuous motion of points over time. More formally, a flow is a group action of the real numbers on a set.

The idea of a vector flow, that is, the flow determined by a vector field, occurs in the areas of differential topology, Riemannian geometry and Lie groups. Specific examples of vector flows include the geodesic flow, the Hamiltonian flow, the Ricci flow, the mean curvature flow, and Anosov flows. Flows may also be defined for systems of random variables and stochastic processes, and occur in the study of ergodic dynamical systems. The most celebrated of these is perhaps the Bernoulli flow.

Formal definition

A flow on a set X is a group action of the additive group of real numbers on X. More explicitly, a flow is a mapping

φ : X × ℝ → X

such that, for all x ∈ X and all real numbers s and t,

φ(x, 0) = x,
φ(φ(x, t), s) = φ(x, s + t).

It is customary to write φ^t(x) instead of φ(x, t), so that the equations above can be expressed as φ^0 = Id (the identity function) and φ^s ∘ φ^t = φ^{s+t} (group law). Then, for all t ∈ ℝ, the mapping φ^t : X → X is a bijection with inverse φ^{-t} : X → X. This follows from the above definition, and the real parameter t may be taken as a generalized functional power, as in function iteration.

Flows are usually required to be compatible with structures furnished on the set X. In particular, if X is equipped with a topology, then φ is usually required to be continuous. If X is equipped with a differentiable structure, then φ is usually required to be differentiable. In these cases the flow forms a one-parameter group of homeomorphisms and diffeomorphisms, respectively.
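
As a minimal illustration of the definition (not one of the article's own examples), the map φ(x, t) = x·exp(at) on X = ℝ, with a an arbitrary constant, is a flow. The short Python sketch below checks the identity, the group law, and the fact that φ^{-t} inverts φ^t numerically.

```python
import numpy as np

# Minimal sketch of the definition: on X = R, the map phi(x, t) = x * exp(a*t)
# is a flow; the constant a = 0.5 is an arbitrary illustrative choice.
a = 0.5

def phi(x, t):
    return x * np.exp(a * t)

x, s, t = 3.0, 1.2, -0.7
print(np.isclose(phi(x, 0.0), x))                    # phi(x, 0) = x  (identity)
print(np.isclose(phi(phi(x, t), s), phi(x, s + t)))  # group law
print(np.isclose(phi(phi(x, t), -t), x))             # phi^{-t} inverts phi^{t}
```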

In certain situations one might also consider local flows, which are defined only in some subset

dom(φ) ⊂ X × ℝ,

called the flow domain of φ. This is often the case with the flows of vector fields.

Alternative notations

It is very common in many fields, including engineering, physics and the study of differential equations, to use a notation that makes the flow implicit. Thus, x(t) is written for φ^t(x_0), and one might say that the variable x depends on the time t and the initial condition x = x_0. Examples are given below.

In the case of a flow of a vector field V on a smooth manifold X, the flow is often denoted in such a way that its generator is made explicit. For example,

Φ_V(x, t),   Φ_V^t(x),   exp(tV)(x).

Orbits

Given x in X, the set O(x) := {φ(x, t) : t ∈ ℝ} is called the orbit of x under φ. Informally, it may be regarded as the trajectory of a particle that was initially positioned at x. If the flow is generated by a vector field, then its orbits are the images of its integral curves.

Examples

Algebraic equation

Let f : ℝ → ℝ be a time-dependent trajectory which is a bijective function of time, i.e., a non-periodic function. Then a flow can be defined by

φ(x, t) = f(t + f^{-1}(x)).
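
A small Python sketch, using the hypothetical choice f(t) = t^3 (a bijection of ℝ), checks numerically that φ(x, t) = f(t + f^{-1}(x)) satisfies the identity and the group law.

```python
import numpy as np

# Hypothetical illustration: f(t) = t**3 is a bijection of the reals,
# with inverse f_inv(y) = sign(y) * |y|**(1/3).
def f(t):
    return t**3

def f_inv(y):
    return np.sign(y) * np.abs(y)**(1.0 / 3.0)

def phi(x, t):
    # Flow built from the bijective trajectory: phi(x, t) = f(t + f^{-1}(x))
    return f(t + f_inv(x))

x0, s, t = 2.0, 0.7, -1.3
print(np.isclose(phi(x0, 0.0), x0))                   # identity: phi(x, 0) = x
print(np.isclose(phi(phi(x0, t), s), phi(x0, s + t))) # group law
```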

Autonomous systems of ordinary differential equations

Let F : ℝ^n → ℝ^n be a (time-independent) vector field and x : ℝ → ℝ^n the solution of the initial value problem

x′(t) = F(x(t)),   x(0) = x_0.

Then φ(x_0, t) = x(t) is the flow of the vector field F. It is a well-defined local flow provided that the vector field F is Lipschitz-continuous. Then φ : ℝ^n × ℝ → ℝ^n is also Lipschitz-continuous wherever defined. In general it may be hard to show that the flow φ is globally defined, but one simple criterion is that the vector field F is compactly supported.
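
The following Python sketch approximates the flow of the pendulum vector field F(θ, ω) = (ω, −sin θ) from the figure above using scipy's solve_ivp, and checks the group law up to integration error; the particular field and tolerances are illustrative choices, not part of the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sketch: the pendulum vector field F(theta, omega) = (omega, -sin(theta)),
# matching the phase-space picture at the top of the article.
def F(t, y):
    theta, omega = y
    return [omega, -np.sin(theta)]

def flow(x0, t):
    """Approximate phi(x0, t) by numerically integrating dx/dt = F(x), x(0) = x0."""
    sol = solve_ivp(F, (0.0, t), x0, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

x0 = np.array([1.0, 0.0])   # initial angle 1 rad, initial angular velocity 0
s, t = 0.4, 1.1

# Group law phi(phi(x0, t), s) = phi(x0, s + t), up to integration error.
lhs = flow(flow(x0, t), s)
rhs = flow(x0, s + t)
print(np.allclose(lhs, rhs, atol=1e-6))
```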

Time-dependent ordinary differential equations

In the case of time-dependent vector fields F : ℝ^n × ℝ → ℝ^n, one denotes φ^{t,t_0}(x_0) = x(t + t_0), where x : ℝ → ℝ^n is the solution of

x′(t) = F(x(t), t),   x(t_0) = x_0.

Then φ^{t,t_0}(x_0) is the time-dependent flow of F. It is not a "flow" by the definition above, but it can easily be seen as one by rearranging its arguments. Namely, the mapping

φ : (ℝ^n × ℝ) × ℝ → ℝ^n × ℝ,   φ((x_0, t_0), t) := (φ^{t+t_0, t_0}(x_0), t + t_0)

indeed satisfies the group law for the last variable:

φ(φ((x_0, t_0), t_1), t_2) = φ((x_0, t_0), t_1 + t_2).

One can see time-dependent flows of vector fields as special cases of time-independent ones by the following trick. Define

G(x, t) := (F(x, t), 1),   y(t) := (x(t + t_0), t + t_0).

Then y(t) is the solution of the "time-independent" initial value problem

y′(t) = G(y(t)),   y(0) = (x_0, t_0)

if and only if x(t) is the solution of the original time-dependent initial value problem. Furthermore, then the mapping φ is exactly the flow of the "time-independent" vector field G.
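
The sketch below applies this trick to the hypothetical scalar field F(x, t) = −x + sin t: solving the augmented autonomous system ẏ = G(y) starting from (x_0, t_0) reproduces (numerically, with scipy) the solution of the original time-dependent problem.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sketch of the autonomization trick for the hypothetical scalar field
# F(x, t) = -x + sin(t): append time as an extra state variable.
def F(x, t):
    return -x + np.sin(t)

def G(y):
    # G(x, t) = (F(x, t), 1): the augmented, time-independent vector field.
    x, t = y
    return [F(x, t), 1.0]

x0, t0, T = 2.0, 0.5, 3.0

# Original time-dependent problem: dx/dt = F(x, t), x(t0) = x0.
sol_x = solve_ivp(lambda t, x: [F(x[0], t)], (t0, t0 + T), [x0], rtol=1e-10)

# Augmented time-independent problem: dy/dt = G(y), y(0) = (x0, t0).
sol_y = solve_ivp(lambda t, y: G(y), (0.0, T), [x0, t0], rtol=1e-10)

print(np.isclose(sol_x.y[0, -1], sol_y.y[0, -1], atol=1e-6))  # same x-component
print(np.isclose(sol_y.y[1, -1], t0 + T))                     # time variable advanced by T
```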

Flows of vector fields on manifolds

The flows of time-independent and time-dependent vector fields are defined on smooth manifolds exactly as they are defined on the Euclidean space and their local behavior is the same. However, the global topological structure of a smooth manifold is strongly manifest in what kind of global vector fields it can support, and flows of vector fields on smooth manifolds are indeed an important tool in differential topology. The bulk of studies in dynamical systems are conducted on smooth manifolds, which are thought of as "parameter spaces" in applications.

Formally: Let M be a differentiable manifold. Let T_pM denote the tangent space of a point p ∈ M. Let TM be the complete tangent manifold; that is, TM = ⋃_{p ∈ M} T_pM. Let

f : ℝ × M → TM

be a time-dependent vector field on M; that is, f is a smooth map such that for each t ∈ ℝ and p ∈ M, one has f(t, p) ∈ T_pM; that is, the map x ↦ f(t, x) maps each point to an element of its own tangent space. For a suitable interval I ⊆ ℝ containing 0, the flow of f is a function φ : I × M → M that satisfies

φ(0, x_0) = x_0   for all x_0 ∈ M,
d/dt φ(t, x_0)|_{t=t_1} = f(t_1, φ(t_1, x_0))   for all x_0 ∈ M and t_1 ∈ I.
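
As an illustrative sketch (assuming the rotation field V(x, y) = (−y, x) on the unit circle S^1 embedded in ℝ^2, which is tangent to the circle), the following Python code integrates the flow numerically and checks that orbits stay on the manifold and coincide with rotations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sketch: the rotation vector field V(x, y) = (-y, x) is tangent to the unit
# circle S^1 embedded in R^2, so its flow preserves the circle.
def V(t, p):
    x, y = p
    return [-y, x]

def flow(p0, t):
    sol = solve_ivp(V, (0.0, t), p0, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

p0 = np.array([1.0, 0.0])            # a point on S^1
t = 2.0
pt = flow(p0, t)

print(np.isclose(np.linalg.norm(pt), 1.0))                  # orbit stays on the circle
print(np.allclose(pt, [np.cos(t), np.sin(t)], atol=1e-6))   # the flow is rotation by angle t
```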

Solutions of heat equation

Let Ω be a subdomain (bounded or not) of ℝ^n (with n an integer). Denote by Γ its boundary (assumed smooth). Consider the following heat equation on Ω × (0, T), for T > 0,

u_t − Δu = 0   in Ω × (0, T),
u = 0   on Γ × (0, T),

with the following initial value condition u(0) = u_0 in Ω.

The equation u = 0 on Γ × (0, T) corresponds to the homogeneous Dirichlet boundary condition. The mathematical setting for this problem can be the semigroup approach. To use this tool, we introduce the unbounded operator Δ_D defined on L^2(Ω) by its domain

D(Δ_D) = H^2(Ω) ∩ H_0^1(Ω)

(see the classical Sobolev spaces H^k(Ω) = W^{k,2}(Ω), and H_0^1(Ω), the closure of the infinitely differentiable functions with compact support in Ω for the H^1(Ω) norm).

For any v ∈ D(Δ_D), we have

Δ_D v = Δv.

With this operator, the heat equation becomes u′(t) = Δ_D u(t) and u(0) = u_0. Thus, the flow corresponding to this equation is (see notations above)

φ(u_0, t) = exp(tΔ_D) u_0,

where exp(tΔ_D) is the (analytic) semigroup generated by Δ_D.
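
A minimal numerical sketch of this semigroup, assuming Ω = (0, 1) in dimension one and a standard finite-difference discretization of Δ_D with homogeneous Dirichlet conditions, approximates exp(tΔ_D) by a matrix exponential and checks the semigroup law and a known exact solution.

```python
import numpy as np
from scipy.linalg import expm

# Sketch: approximate the Dirichlet Laplacian on Omega = (0, 1) by the standard
# finite-difference matrix, so exp(t * Delta_D) becomes a matrix exponential.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)               # interior grid points
L = (np.diag(-2.0 * np.ones(n)) +
     np.diag(np.ones(n - 1), 1) +
     np.diag(np.ones(n - 1), -1)) / h**2     # discrete Delta_D (Dirichlet BCs)

u0 = np.sin(np.pi * x)                       # initial datum u0 in Omega

def heat_flow(u, t):
    # phi(u0, t) = exp(t * Delta_D) u0, discretized
    return expm(t * L) @ u

s, t = 0.01, 0.02
print(np.allclose(heat_flow(u0, s + t), heat_flow(heat_flow(u0, t), s)))  # semigroup law
# For this u0 the exact solution is exp(-pi**2 * t) * sin(pi * x):
print(np.allclose(heat_flow(u0, t), np.exp(-np.pi**2 * t) * np.sin(np.pi * x), atol=1e-3))
```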

Solutions of wave equation

Again, let Ω be a subdomain (bounded or not) of ℝ^n (with n an integer). We denote by Γ its boundary (assumed smooth). Consider the following wave equation on Ω × (0, T) (for T > 0),

u_tt − Δu = 0   in Ω × (0, T),
u = 0   on Γ × (0, T),

with the following initial conditions u(0) = u_{1,0} in Ω and u_t(0) = u_{2,0} in Ω.

Using the same semigroup approach as in the case of the heat equation above, we write the wave equation as a first-order-in-time partial differential equation by introducing the following unbounded operator

A = ( 0     Id )
    ( Δ_D   0  )

with domain D(A) = (H^2(Ω) ∩ H_0^1(Ω)) × H_0^1(Ω) on H = H_0^1(Ω) × L^2(Ω) (the operator Δ_D is defined in the previous example).

We introduce the column vectors

U = (u_1, u_2)^t   (where u_1 = u and u_2 = u_t) and U_0 = (u_{1,0}, u_{2,0})^t.

With these notions, the wave equation becomes U′(t) = A U(t) and U(0) = U_0.

Thus, the flow corresponding to this equation is

φ(U_0, t) = exp(tA) U_0,

where exp(tA) is the (unitary) semigroup generated by A.
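
Continuing the same one-dimensional discretization as in the heat-equation sketch, the block matrix A = [[0, I], [Δ_D, 0]] gives an approximate wave flow exp(tA)U_0; the code below compares it with the exact standing-wave solution for this particular initial datum.

```python
import numpy as np
from scipy.linalg import expm

# Sketch, reusing the discrete Dirichlet Laplacian from the heat-equation example:
# write the wave equation as U' = A U with A = [[0, I], [Delta_D, 0]].
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
Lap = (np.diag(-2.0 * np.ones(n)) +
       np.diag(np.ones(n - 1), 1) +
       np.diag(np.ones(n - 1), -1)) / h**2

A = np.block([[np.zeros((n, n)), np.eye(n)],
              [Lap,              np.zeros((n, n))]])

u10 = np.sin(np.pi * x)          # initial displacement u(0)
u20 = np.zeros(n)                # initial velocity u_t(0)
U0 = np.concatenate([u10, u20])

t = 0.3
Ut = expm(t * A) @ U0            # phi(U0, t) = exp(tA) U0, discretized

# For these data the exact displacement is cos(pi t) sin(pi x) (standing wave).
print(np.allclose(Ut[:n], np.cos(np.pi * t) * np.sin(np.pi * x), atol=1e-2))
```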

Bernoulli flow

Ergodic dynamical systems, that is, systems exhibiting randomness, exhibit flows as well. The most celebrated of these is perhaps the Bernoulli flow. The Ornstein isomorphism theorem states that, for any given entropy H, there exists a flow φ(x, t), called the Bernoulli flow, such that the flow at time t = 1, i.e. φ(x, 1), is a Bernoulli shift.

Furthermore, this flow is unique, up to a constant rescaling of time. That is, if ψ(x, t) is another flow with the same entropy, then ψ(x, t) = φ(x, ct) for some constant c. The notion of uniqueness and isomorphism here is that of the isomorphism of dynamical systems. Many dynamical systems, including Sinai's billiards and Anosov flows, are isomorphic to Bernoulli shifts.
