Marginal stability

In the theory of dynamical systems and control theory, a linear time-invariant system is marginally stable if it is neither asymptotically stable nor unstable. Roughly speaking, a system is stable if it always returns to and stays near a particular state (called the steady state), and is unstable if it goes farther and farther away from any state, without being bounded. A marginal system, sometimes referred to as having neutral stability,[1] is between these two types: when displaced, it does not return to near a common steady state, nor does it go away from where it started without limit.

Marginal stability, like instability, is a feature that control theory seeks to avoid; we wish that, when perturbed by some external force, a system will return to a desired state. This necessitates the use of appropriately designed control algorithms.

In econometrics, the presence of a unit root in observed time series, rendering them marginally stable, can lead to invalid regression results regarding effects of the independent variables upon a dependent variable, unless appropriate techniques are used to convert the system to a stable system.

Continuous time

A homogeneous continuous-time linear time-invariant system is marginally stable if and only if the real part of every pole (eigenvalue) of the system's transfer function is non-positive, one or more poles have zero real part, and all poles with zero real part are simple roots (i.e. the poles on the imaginary axis are all distinct from one another). In contrast, if all the poles have strictly negative real parts, the system is instead asymptotically stable. If the system is neither stable nor marginally stable, it is unstable.
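This classification can be sketched in Python; the function name, tolerance, and example pole lists below are illustrative assumptions, not from the article or any particular library:

```python
def classify_continuous(poles, tol=1e-9):
    """Classify a continuous-time LTI system from its transfer-function poles.

    Marginally stable: every pole has Re <= 0, at least one pole lies on
    the imaginary axis, and every imaginary-axis pole is simple (distinct).
    """
    if any(p.real > tol for p in poles):
        return "unstable"
    axis_poles = [p for p in poles if abs(p.real) <= tol]
    if not axis_poles:
        return "asymptotically stable"
    # A repeated imaginary-axis pole gives a response that grows over time.
    for p in axis_poles:
        if sum(1 for q in axis_poles if abs(q - p) <= tol) > 1:
            return "unstable"
    return "marginally stable"
```

For instance, poles at ±j together with a stable pole give a marginally stable system, while a repeated pole at +j is classified unstable, since the repeated root produces a response growing without bound.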

If the system is given in state-space representation, marginal stability can be analyzed by deriving the Jordan normal form:[2] the system is marginally stable if and only if the Jordan blocks corresponding to poles with zero real part are scalar (1×1).

Discrete time

A homogeneous discrete-time linear time-invariant system is marginally stable if and only if the greatest magnitude of any of the poles (eigenvalues) of the transfer function is 1, and the poles with magnitude equal to 1 are all distinct. That is, the transfer function's spectral radius is exactly 1. If the spectral radius is less than 1, the system is instead asymptotically stable.
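The discrete-time test is the same idea with the unit circle in place of the imaginary axis. A minimal Python sketch (the helper name and tolerance are illustrative assumptions):

```python
def classify_discrete(poles, tol=1e-9):
    """Classify a discrete-time LTI system from its transfer-function poles.

    Marginally stable: spectral radius exactly 1, with every pole of
    magnitude 1 simple (distinct).
    """
    if any(abs(p) > 1 + tol for p in poles):
        return "unstable"
    boundary = [p for p in poles if abs(abs(p) - 1) <= tol]
    if not boundary:
        return "asymptotically stable"
    # A repeated pole on the unit circle makes the response grow like n * 1^n.
    for p in boundary:
        if sum(1 for q in boundary if abs(q - p) <= tol) > 1:
            return "unstable"
    return "marginally stable"
```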

A simple example involves a single first-order linear difference equation: suppose a state variable x evolves according to

x_t = a x_{t−1}

with parameter a > 0. If the system is perturbed to the value x_0, its subsequent sequence of values is x_0, ax_0, a²x_0, a³x_0, .... If a < 1, these numbers get closer and closer to 0 regardless of the starting value x_0, while if a > 1 the numbers get larger and larger without bound. But if a = 1 the numbers do neither of these: instead, all future values of x equal x_0. Thus the case a = 1 exhibits marginal stability.
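The three regimes can be checked numerically with a short Python sketch (the function name and step count are illustrative assumptions):

```python
def simulate(a, x0, steps=50):
    """Iterate the first-order recursion x_t = a * x_{t-1} from x_0
    and return the value after the given number of steps."""
    x = x0
    for _ in range(steps):
        x = a * x
    return x

# a < 1: iterates decay toward 0; a = 1: every iterate stays at x0;
# a > 1: iterates grow without bound.
```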

System response

A marginally stable system is one that, if given an impulse of finite magnitude as input, will not "blow up" and give an unbounded output, but neither will the output return to zero. A bounded offset or oscillations in the output will persist indefinitely, and so there will in general be no final steady-state output. If a continuous system is given an input at a frequency equal to the frequency of a pole with zero real part, the system's output will increase indefinitely (this is known as pure resonance[3]). This explains why for a system to be BIBO stable, the real parts of the poles have to be strictly negative (and not just non-positive).
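Pure resonance can be illustrated by integrating a forced undamped oscillator numerically; the sketch below, with illustrative parameter choices and a simple semi-implicit Euler scheme, compares forcing at the natural frequency against forcing away from it:

```python
import math

def forced_peak(omega_n, omega_f, dt=1e-3, t_end=40.0):
    """Peak |x| of x'' + omega_n^2 * x = cos(omega_f * t), zero initial
    conditions, integrated with semi-implicit Euler."""
    x = v = 0.0
    t = peak = 0.0
    for _ in range(int(t_end / dt)):
        v += (math.cos(omega_f * t) - omega_n ** 2 * x) * dt
        x += v * dt
        t += dt
        peak = max(peak, abs(x))
    return peak
```

Forcing at the pole frequency (omega_f = omega_n) gives an output whose envelope grows linearly in time, whereas off-resonance forcing stays bounded.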

A continuous system having imaginary poles, i.e. one or more poles with zero real part, will produce sustained oscillations in the output. For example, an undamped second-order system such as the suspension system in an automobile (a mass–spring–damper system) from which the damper has been removed and whose spring is ideal, i.e. dissipates no energy, will in theory oscillate forever once disturbed. Another example is a frictionless pendulum. A system with a pole at the origin is also marginally stable, but in this case there will be no oscillation in the response, since the imaginary part is also zero (jω = 0 means ω = 0 rad/s). An example of such a system is a mass on a surface with friction. When a sideways impulse is applied, the mass will move but never return to its original position. The mass will, however, come to rest due to friction, and the sideways movement will remain bounded.
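The undamped oscillator case can be simulated to confirm that the response stays bounded but never settles; this Python sketch uses illustrative parameters and a semi-implicit (symplectic) Euler step, which approximately conserves the oscillator's energy:

```python
def undamped_oscillator(x0=1.0, v0=0.0, omega=2.0, dt=1e-3, t_end=20.0):
    """Integrate x'' = -omega^2 * x (no damping) with semi-implicit Euler.

    Poles at +/- j*omega: the output oscillates forever with bounded
    amplitude, never decaying to a steady state.
    """
    x, v = x0, v0
    peak = 0.0
    for _ in range(int(t_end / dt)):
        v -= omega * omega * x * dt  # update velocity first (symplectic)
        x += v * dt
        peak = max(peak, abs(x))
    return x, peak
```

After 20 seconds the peak amplitude is still essentially the initial displacement: the oscillation neither grows nor dies out.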

Since the locations of the marginal poles must be exactly on the imaginary axis or unit circle (for continuous time and discrete time systems respectively) for a system to be marginally stable, this situation is unlikely to occur in practice unless marginal stability is an inherent theoretical feature of the system.

Stochastic dynamics

Marginal stability is also an important concept in the context of stochastic dynamics. For example, some processes may follow a random walk, given in discrete time as

x_t = x_{t−1} + e_t,

where e_t is an i.i.d. error term. This equation has a unit root (a value of 1 for the eigenvalue of its characteristic equation), and hence exhibits marginal stability, so special time series techniques must be used in empirically modeling a system containing such an equation.
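The persistence of shocks under a unit root can be illustrated with the impulse response of a first-order autoregression x_t = a·x_{t−1} + e_t (the helper below is a hypothetical illustration, not from the article):

```python
def impulse_response(a, horizon):
    """Effect on x of a one-time unit shock e_0 = 1, measured `horizon`
    periods later, for the AR(1) recursion x_t = a * x_{t-1} + e_t."""
    return a ** horizon

# For |a| < 1 the shock decays geometrically and the process is stable;
# for a = 1 (the random walk / unit root) the shock persists forever,
# the hallmark of marginal stability.
```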

Marginally stable Markov processes are those that possess null recurrent classes.


References

  1. Gene F. Franklin; J. David Powell; Abbas Emami-Naeini (2006). Feedback Control of Dynamic Systems (5th ed.). Pearson Education. ISBN 0-13-149930-0.
  2. Karl J. Åström and Richard M. Murray. "Linear Systems". Feedback Systems Wiki. Caltech. Retrieved 11 August 2014.
  3. "Pure Resonance". MIT. Retrieved 2 September 2015.