The stretched exponential function

f_β(t) = exp(−t^β)

is obtained by inserting a fractional power law into the exponential function. In most applications, it is meaningful only for arguments t between 0 and +∞. With β = 1, the usual exponential function is recovered. With a stretching exponent β between 0 and 1, the graph of log f versus t is characteristically stretched, hence the name of the function. The compressed exponential function (with β > 1) has less practical importance, with the notable exception of β = 2, which gives the normal distribution.
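As a concrete sketch (plain Python; the function name and parameters are illustrative), the definition and the β = 1 limit can be checked directly:

```python
import math

def stretched_exp(t, beta, tau=1.0):
    """Stretched exponential f(t) = exp(-(t/tau)**beta), defined for t >= 0."""
    return math.exp(-((t / tau) ** beta))

# beta = 1 recovers the ordinary exponential decay
print(stretched_exp(2.0, beta=1.0), math.exp(-2.0))   # identical values

# beta < 1 "stretches" the decay: slower than a simple exponential at long times
print(stretched_exp(5.0, beta=0.5) > math.exp(-5.0))  # True
```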
In mathematics, the stretched exponential is also known as the complementary cumulative Weibull distribution. The stretched exponential is also the characteristic function (that is, the Fourier transform) of the Lévy symmetric alpha-stable distribution.
In physics, the stretched exponential function is often used as a phenomenological description of relaxation in disordered systems. It was first introduced by Rudolf Kohlrausch in 1854 to describe the discharge of a capacitor; therefore it is also called the Kohlrausch function. In 1970, G. Williams and D.C. Watts used the Fourier transform of the stretched exponential to describe dielectric spectra of polymers; in this context, the stretched exponential or its Fourier transform are also called the Kohlrausch-Williams-Watts (KWW) function.
In phenomenological applications, it is often not clear whether the stretched exponential function should apply to the differential or to the integral distribution function—or to neither. In each case one gets the same asymptotic decay, but a different power law prefactor, which makes fits more ambiguous than for simple exponentials. In a few cases it can be shown that the asymptotic decay is a stretched exponential, but the prefactor is usually an unrelated power.
Following the usual physical interpretation, we interpret the function argument t as a time, and f_β(t) = exp(−(t/τ_K)^β) is the differential distribution. The area under the curve is therefore interpreted as a mean relaxation time. One finds

⟨τ⟩ ≡ ∫₀^∞ exp(−(t/τ_K)^β) dt = (τ_K/β) Γ(1/β),

where Γ is the gamma function. For exponential decay (β = 1), ⟨τ⟩ = τ_K is recovered.
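The mean relaxation time can be verified numerically; a minimal sketch (plain Python, trapezoidal quadrature; names and cutoffs are ad hoc), assuming f_β(t) = exp(−(t/τ_K)^β):

```python
import math

def mean_relaxation_time(beta, tau_k=1.0, t_max=200.0, n=200000):
    """Trapezoidal estimate of integral_0^inf exp(-(t/tau_k)**beta) dt."""
    dt = t_max / n
    total = 0.0
    for i in range(n + 1):
        t = i * dt
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(-((t / tau_k) ** beta))
    return total * dt

beta, tau_k = 0.5, 1.0
closed_form = (tau_k / beta) * math.gamma(1.0 / beta)  # (tau_K/beta)*Gamma(1/beta) = 2 here
print(mean_relaxation_time(beta, tau_k), closed_form)  # both ≈ 2.0
```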
The higher moments of the stretched exponential function are

⟨τ^n⟩ ≡ ∫₀^∞ t^(n−1) exp(−(t/τ_K)^β) dt = (τ_K^n/β) Γ(n/β).
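A numerical spot check of the moment formula ⟨τ^n⟩ = (τ_K^n/β) Γ(n/β) (a sketch in plain Python; names and cutoffs are illustrative):

```python
import math

def kww_moment(n_mom, beta, tau_k=1.0, t_max=400.0, n=400000):
    """Riemann-sum estimate of integral_0^inf t**(n_mom-1) * exp(-(t/tau_k)**beta) dt."""
    dt = t_max / n
    total = 0.0
    for i in range(1, n + 1):  # right-endpoint rectangles covering [0, t_max]
        t = i * dt
        total += t ** (n_mom - 1) * math.exp(-((t / tau_k) ** beta))
    return total * dt

beta = 0.5
closed_form = (1.0 / beta) * math.gamma(2.0 / beta)  # n = 2, tau_K = 1: 2 * Gamma(4) = 12
print(kww_moment(2, beta), closed_form)
```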
In physics, attempts have been made to explain stretched exponential behaviour as a linear superposition of simple exponential decays. This requires a nontrivial distribution of relaxation times, ρ(u), which is implicitly defined by

exp(−(t/τ_K)^β) = ∫₀^∞ ρ(u) exp(−t/(u τ_K)) du.
Alternatively, the distribution ρ(u) can be computed from the series expansion

ρ(u) = −(1/(π u)) Σ_{k=1}^∞ [(−1)^k / k!] sin(π β k) Γ(β k + 1) u^(β k).

For rational values of β, ρ(u) can be calculated in terms of elementary functions. But the expression is in general too complex to be useful, except for the case β = 1/2, where

ρ(u) = (1/(2√(π u))) exp(−u/4).
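For β = 1/2 the superposition can be checked numerically: with ρ(u) = exp(−u/4)/(2√(πu)) in the convention exp(−√t) = ∫₀^∞ ρ(u) exp(−t/u) du (a plain-Python sketch; the quadrature parameters are ad hoc):

```python
import math

def rho_half(u):
    """beta = 1/2 relaxation-time distribution: exp(-u/4) / (2*sqrt(pi*u))."""
    return math.exp(-u / 4.0) / (2.0 * math.sqrt(math.pi * u))

def superposition(t, u_max=400.0, n=400000):
    """Riemann-sum estimate of integral_0^inf rho_half(u) * exp(-t/u) du."""
    du = u_max / n
    return sum(rho_half(i * du) * math.exp(-t / (i * du))
               for i in range(1, n + 1)) * du

t = 1.0
print(superposition(t), math.exp(-math.sqrt(t)))  # both ≈ 0.3679
```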
Figure 2 shows the same results plotted in both a linear and a log representation. The curves converge to a Dirac delta function peaked at u = 1 as β approaches 1, corresponding to the simple exponential function.
Figure 2. Linear and log-log plots of the stretched exponential distribution function ρ(u) vs u, for values of the stretching parameter β between 0.1 and 0.9.
The moments of the original function can be expressed as

⟨τ^n⟩ = Γ(n) τ_K^n ∫₀^∞ u^n ρ(u) du.
The first logarithmic moment of the distribution of simple-exponential relaxation times is

⟨ln τ⟩ = (1 − 1/β) Eu + ln τ_K,

where Eu is the Euler constant.
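The β = 1/2 case gives a quick numerical check of this formula: with τ_K = 1 the prediction is (1 − 1/β)·Eu = −Eu ≈ −0.577, and the left side can be computed from the β = 1/2 distribution ρ(u) = exp(−u/4)/(2√(πu)) (plain-Python sketch; the substitution u = v² removes the integrable singularity at u = 0):

```python
import math

# <ln u> over rho(u) = exp(-u/4)/(2*sqrt(pi*u)); substituting u = v**2 gives
# <ln u> = (1/sqrt(pi)) * integral_0^inf 2*ln(v) * exp(-v**2/4) dv
def log_moment(v_max=20.0, n=200000):
    dv = v_max / n
    return sum(2.0 * math.log(i * dv) * math.exp(-((i * dv) ** 2) / 4.0)
               for i in range(1, n + 1)) * dv / math.sqrt(math.pi)

euler_gamma = 0.5772156649015329
print(log_moment(), (1.0 - 1.0 / 0.5) * euler_gamma)  # both ≈ -0.577
```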
To describe results from spectroscopy or inelastic scattering, the sine or cosine Fourier transform of the stretched exponential is needed. It must be calculated either by numeric integration or from a series expansion. The series here, as well as the one for the distribution function, are special cases of the Fox-Wright function. For practical purposes, the Fourier transform may be approximated by the Havriliak-Negami function, though nowadays the numeric computation can be done so efficiently that there is no longer any reason not to use the Kohlrausch-Williams-Watts function in the frequency domain.
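As a numeric-integration sketch (plain Python; the cutoff and step count are ad hoc), the cosine transform of the KWW function can be computed directly, with the β = 1 case checked against the exact Debye result τ/(1 + ω²τ²):

```python
import math

def kww_cosine_transform(omega, beta, tau_k=1.0, t_max=40.0, n=400000):
    """Trapezoidal estimate of integral_0^inf exp(-(t/tau_k)**beta) * cos(omega*t) dt."""
    dt = t_max / n
    total = 0.0
    for i in range(n + 1):
        t = i * dt
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(-((t / tau_k) ** beta)) * math.cos(omega * t)
    return total * dt

omega, tau = 2.0, 1.0
print(kww_cosine_transform(omega, beta=1.0, tau_k=tau))  # Debye: tau/(1+(omega*tau)**2) = 0.2
```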
As noted in the introduction, the stretched exponential was introduced by the German physicist Rudolf Kohlrausch in 1854 to describe the discharge of a capacitor (Leyden jar) that used glass as the dielectric medium. The next documented usage is by Friedrich Kohlrausch, son of Rudolf, to describe torsional relaxation. A. Werner used it in 1907 to describe complex luminescence decays, and Theodor Förster in 1949 as the fluorescence decay law of electronic energy donors.
Outside condensed matter physics, the stretched exponential has been used to describe the removal rates of small, stray bodies in the solar system, the diffusion-weighted MRI signal in the brain, and the production from unconventional gas wells.
If the integrated distribution is a stretched exponential, the normalized probability density function is given by

p(t) dt = (β/τ₀) (t/τ₀)^(β−1) exp(−(t/τ₀)^β) dt.
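A numerical sanity check (plain Python; names illustrative): the density p(t) = (β/τ₀)(t/τ₀)^(β−1) exp(−(t/τ₀)^β) obtained by differentiating 1 − exp(−(t/τ₀)^β) should, when integrated from t upward, recover the stretched-exponential survival function:

```python
import math

def weibull_pdf(t, beta, tau0=1.0):
    """Density from differentiating 1 - exp(-(t/tau0)**beta), valid for t > 0."""
    x = t / tau0
    return (beta / tau0) * x ** (beta - 1.0) * math.exp(-(x ** beta))

def survival_numeric(t, beta, tau0=1.0, t_max=200.0, n=200000):
    """Trapezoidal integral of the density from t to t_max (tail beyond t_max is negligible)."""
    dt = (t_max - t) / n
    total = 0.0
    for i in range(n + 1):
        w = 0.5 if i in (0, n) else 1.0
        total += w * weibull_pdf(t + i * dt, beta, tau0)
    return total * dt

beta = 0.5
print(survival_numeric(1.0, beta), math.exp(-1.0))  # both ≈ 0.3679
```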
Note that, confusingly, some authors have been known to use the name "stretched exponential" to refer to the Weibull distribution.
A modified stretched exponential function

f(t) = exp(−t^(β(t)))

with a slowly t-dependent exponent β has been used for biological survival curves.