In control theory, we may need to find out whether or not a system such as
$$\dot{x}(t) = A\, x(t) + B\, u(t), \qquad y(t) = C\, x(t)$$
is observable, where $A$, $B$, and $C$ are, respectively, $n \times n$, $n \times p$, and $q \times n$ matrices.
One of the many ways to achieve this is by using the Observability Gramian.
Linear Time Invariant (LTI) systems are those systems in which the matrices $A$, $B$, and $C$ are invariant with respect to time.
One can determine whether the LTI system is or is not observable simply by looking at the pair $(A, C)$. Then, we can say that the following statements are equivalent:
1. The pair $(A, C)$ is observable.
2. The $n \times n$ matrix
$$W_o(t) = \int_0^{t} e^{A^{T}\tau} C^{T} C\, e^{A\tau} \, d\tau$$
is nonsingular for any $t > 0$.
3. The $nq \times n$ observability matrix
$$\begin{bmatrix} C \\ CA \\ CA^{2} \\ \vdots \\ CA^{n-1} \end{bmatrix}$$
has rank $n$ (a numerical check of this rank condition is sketched after this list).
4. The $(n+q) \times n$ matrix
$$\begin{bmatrix} A - \lambda I \\ C \end{bmatrix}$$
has full column rank at every eigenvalue $\lambda$ of $A$.
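As an illustration of statement 3, the following is a minimal Python sketch of the rank test; the pair $(A, C)$ here is a small hypothetical example, not one taken from the text.

```python
import numpy as np

# Hypothetical example pair (A, C); any n x n matrix A and q x n matrix C will do.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
n = A.shape[0]

# Stack C, CA, CA^2, ..., CA^(n-1) to build the observability matrix.
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# Statement 3: the pair (A, C) is observable iff this matrix has rank n.
print("rank:", np.linalg.matrix_rank(O), "observable:", np.linalg.matrix_rank(O) == n)
```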
If, in addition, all eigenvalues of $A$ have negative real parts ($A$ is stable) and the unique solution $W_o$ of
$$A^{T} W_o + W_o A = -C^{T} C$$
is positive definite, then the system is observable. The solution is called the Observability Gramian and can be expressed as
$$W_o = \int_0^{\infty} e^{A^{T}\tau} C^{T} C\, e^{A\tau} \, d\tau .$$
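A minimal sketch of this test using SciPy's continuous Lyapunov solver follows; the stable pair $(A, C)$ is again a hypothetical example chosen only for illustration.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable example pair (A, C); the eigenvalues of A are -1 and -2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])

# solve_continuous_lyapunov(a, q) solves a X + X a^T = q, so taking
# a = A^T and q = -C^T C yields A^T W_o + W_o A = -C^T C.
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# W_o is symmetric, so positive definiteness reduces to all eigenvalues > 0.
print("W_o =\n", Wo)
print("positive definite (=> observable):", bool(np.all(np.linalg.eigvalsh(Wo) > 0)))
```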
In the following section we are going to take a closer look at the Observability Gramian.
The Observability Gramian can be found as the solution of the Lyapunov equation given by
$$A^{T} W_o + W_o A = -C^{T} C .$$
In fact, we can see that if we take
$$W_o = \int_0^{\infty} e^{A^{T}\tau} C^{T} C\, e^{A\tau} \, d\tau$$
as a solution, we find that
$$A^{T} W_o + W_o A = \int_0^{\infty} \left( A^{T} e^{A^{T}\tau} C^{T} C\, e^{A\tau} + e^{A^{T}\tau} C^{T} C\, e^{A\tau} A \right) d\tau = \int_0^{\infty} \frac{d}{d\tau}\left( e^{A^{T}\tau} C^{T} C\, e^{A\tau} \right) d\tau = \left[ e^{A^{T}\tau} C^{T} C\, e^{A\tau} \right]_{\tau=0}^{\tau=\infty} = -C^{T} C ,$$
where we used the fact that $e^{A\tau} \to 0$ as $\tau \to \infty$ for stable $A$ (all its eigenvalues have negative real part). This shows that $W_o$ is indeed the solution of the Lyapunov equation under analysis.
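As a numerical cross-check of this derivation, one can approximate the defining integral on a finite grid and verify that it satisfies the Lyapunov equation; the example pair below is hypothetical.

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import trapezoid

# Hypothetical stable example pair (A, C), as before.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])

# Approximate W_o = \int_0^inf e^{A^T t} C^T C e^{A t} dt on a finite grid;
# the integrand decays to zero because A is stable.
t = np.linspace(0.0, 40.0, 4001)
samples = np.array([expm(A.T * tk) @ C.T @ C @ expm(A * tk) for tk in t])
Wo = trapezoid(samples, x=t, axis=0)

# The residual of the Lyapunov equation A^T W_o + W_o A + C^T C should be ~0.
print("max residual:", np.abs(A.T @ Wo + Wo @ A + C.T @ C).max())
```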
We can see that $C^{T} C$ is a symmetric matrix; therefore, so is $W_o$.
We can use again the fact that $A$ is stable (all its eigenvalues have negative real part) to show that $W_o$ is unique. To prove this, suppose we have two different solutions of
$$A^{T} W_o + W_o A = -C^{T} C ,$$
given by $W_{o1}$ and $W_{o2}$. Then we have
$$A^{T}(W_{o1} - W_{o2}) + (W_{o1} - W_{o2})A = 0 .$$
Multiplying by $e^{A^{T}t}$ on the left and by $e^{At}$ on the right leads to
$$e^{A^{T}t}\left[ A^{T}(W_{o1} - W_{o2}) + (W_{o1} - W_{o2})A \right] e^{At} = \frac{d}{dt}\left[ e^{A^{T}t}(W_{o1} - W_{o2})e^{At} \right] = 0 .$$
Integrating from $0$ to $\infty$:
$$\left[ e^{A^{T}t}(W_{o1} - W_{o2})e^{At} \right]_{t=0}^{t=\infty} = 0 ,$$
and using the fact that $e^{At} \to 0$ as $t \to \infty$:
$$0 - (W_{o1} - W_{o2}) = 0 ,$$
that is, $W_{o1} = W_{o2}$. In other words, $W_o$ has to be unique.
Also, we can see that
$$x^{T} W_o\, x = \int_0^{\infty} x^{T} e^{A^{T}\tau} C^{T} C\, e^{A\tau} x \, d\tau = \int_0^{\infty} \left\| C\, e^{A\tau} x \right\|^{2} d\tau$$
is positive for any nonzero $x$ (assuming the non-degenerate case where $C\, e^{A\tau} x$ is not identically zero), and that makes $W_o$ a positive definite matrix.
More properties of observable systems can be found in [1], as well as the proof of the other equivalent statements of "the pair $(A, C)$ is observable" presented in the section Observability in LTI Systems.
For discrete time systems of the form
$$x[k+1] = A\, x[k] + B\, u[k], \qquad y[k] = C\, x[k] ,$$
one can check that there are equivalences for the statement "the pair $(A, C)$ is observable" (the equivalences are much like those for the continuous time case).
We are interested in the equivalence that claims that, if the pair $(A, C)$ is observable and all eigenvalues of $A$ have magnitude less than $1$ ($A$ is stable), then the unique solution of
$$A^{T} W_{do} A - W_{do} = -C^{T} C$$
is positive definite and given by
$$W_{do} = \sum_{k=0}^{\infty} (A^{T})^{k} C^{T} C\, A^{k} .$$
$W_{do}$ is called the discrete Observability Gramian. We can easily see the correspondence between the discrete time and the continuous time case: if $W_{do}$ is positive definite and all eigenvalues of $A$ have magnitude less than $1$, then the system $(A, C)$ is observable. More properties and proofs can be found in [2].
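A minimal sketch of the discrete time test using SciPy's discrete Lyapunov solver, with a hypothetical pair $(A, C)$ in which the eigenvalues of $A$ lie inside the unit circle:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical discrete-time pair (A, C); the eigenvalues of A lie inside
# the unit circle, so the series defining the Gramian converges.
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
C = np.array([[1.0, 0.0]])

# solve_discrete_lyapunov(a, q) solves a X a^T - X + q = 0, so taking
# a = A^T and q = C^T C yields A^T W_do A - W_do = -C^T C.
Wdo = solve_discrete_lyapunov(A.T, C.T @ C)

print("W_do =\n", Wdo)
print("positive definite (=> observable):", bool(np.all(np.linalg.eigvalsh(Wdo) > 0)))
```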
Linear time variant (LTV) systems are those of the form
$$\dot{x}(t) = A(t)\, x(t) + B(t)\, u(t), \qquad y(t) = C(t)\, x(t) .$$
That is, the matrices $A(t)$, $B(t)$, and $C(t)$ have entries that vary with time. Again, just as in the continuous time case and in the discrete time case, one may be interested in determining whether the system given by the pair $(A(t), C(t))$ is observable or not. This can be done in a way very similar to the preceding cases.
The system is observable at time $t_0$ if and only if there exists a finite $t_1 > t_0$ such that the $n \times n$ matrix, also called the Observability Gramian,
$$W_o(t_0, t_1) = \int_{t_0}^{t_1} \Phi^{T}(\tau, t_0)\, C^{T}(\tau)\, C(\tau)\, \Phi(\tau, t_0) \, d\tau ,$$
where $\Phi(t, \tau)$ is the state transition matrix of $\dot{x}(t) = A(t)\, x(t)$, is nonsingular.
Again, we have a similar method for determining whether or not a system is observable.
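A sketch of how $W_o(t_0, t_1)$ might be computed numerically is given below, assuming hypothetical time-varying matrices $A(t)$ and $C(t)$: the state transition matrix is obtained by integrating $\dot{\Phi} = A(t)\,\Phi$, and the Gramian integral is then approximated on a grid.

```python
import numpy as np
from scipy.integrate import solve_ivp, trapezoid

# Hypothetical time-varying matrices A(t) and C(t), used only for illustration.
def A(t):
    return np.array([[0.0, 1.0],
                     [-2.0 - np.sin(t), -3.0]])

def C(t):
    return np.array([[1.0 + 0.5 * np.cos(t), 0.0]])

t0, t1, n = 0.0, 5.0, 2

# State transition matrix Phi(tau, t0): solve dPhi/dtau = A(tau) Phi, Phi(t0) = I.
def rhs(tau, phi_flat):
    return (A(tau) @ phi_flat.reshape(n, n)).ravel()

taus = np.linspace(t0, t1, 501)
sol = solve_ivp(rhs, (t0, t1), np.eye(n).ravel(), t_eval=taus, rtol=1e-8)

# W_o(t0, t1) = integral of Phi^T(tau, t0) C^T(tau) C(tau) Phi(tau, t0) dtau.
integrand = []
for k, tau in enumerate(taus):
    Phi = sol.y[:, k].reshape(n, n)
    integrand.append(Phi.T @ C(tau).T @ C(tau) @ Phi)
Wo = trapezoid(np.array(integrand), x=taus, axis=0)

# The system is observable at t0 iff W_o(t0, t1) is nonsingular for some finite t1.
print("det W_o(t0, t1) =", np.linalg.det(Wo))
```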
The Observability Gramian $W_o(t_0, t_1)$ has the following property:
$$W_o(t_0, t_1) = W_o(t_0, t) + \Phi^{T}(t, t_0)\, W_o(t, t_1)\, \Phi(t, t_0) ,$$
which can easily be seen from the definition of $W_o(t_0, t_1)$ and from the property of the state transition matrix that
$$\Phi(t_1, \tau) = \Phi(t_1, t)\, \Phi(t, \tau) .$$
More about the Observability Gramian can be found in [3].