Residual time

In the theory of renewal processes, a part of the mathematical theory of probability, the residual time or the forward recurrence time is the time between any given time and the next epoch of the renewal process under consideration. In the context of random walks, it is also known as overshoot. Put informally, the residual time answers the question "how much more time is there to wait?".

The residual time is very important in most of the practical applications of renewal processes:

In queueing theory, the mathematical study of waiting lines or queues, the residual service time determines how long a newly arriving customer must wait for the customer currently in service to finish.

In dependability and reliability analysis in systems engineering (where dependability measures a system's availability, reliability, maintainability, and maintenance support performance), the residual life of a component describes how much longer it can be expected to operate before it fails or is renewed.

Formal definition

Figure: Sample evolution of a renewal process with holding times $S_i$ and jump times $J_n$.

Consider a renewal process $\{N(t), t \geq 0\}$, with holding times $S_i$ and jump times (or renewal epochs) $J_n = S_1 + S_2 + \cdots + S_n$. The holding times $S_i$ are non-negative, independent, identically distributed random variables and the renewal process is defined as $N(t) = \sup\{n : J_n \leq t\}$. Then, to a given time $t$, there corresponds uniquely an $N(t)$, such that:

$J_{N(t)} \leq t < J_{N(t)+1}.$

The residual time (or excess time) $Y(t)$ is given by the time from $t$ to the next renewal epoch:

$Y(t) = J_{N(t)+1} - t.$
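
As a concrete illustration (an addition, not part of the original text), the following minimal Python sketch simulates one sample path by accumulating holding times until the first jump time past $t$ and returns the residual time $Y(t)$; the exponential holding-time distribution in the example call is an arbitrary choice.

import random

def residual_time(t, sample_holding_time):
    """Accumulate jump times J_n = S_1 + ... + S_n until the first epoch past t,
    then return the residual (forward recurrence) time Y(t) = J_{N(t)+1} - t."""
    J = 0.0
    while J <= t:
        J += sample_holding_time()  # add the next holding time S_i
    return J - t                    # first renewal epoch after t, minus t

# Example: exponential holding times with mean 1 (an arbitrary illustrative choice).
random.seed(0)
print(residual_time(10.0, lambda: random.expovariate(1.0)))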

Probability distribution of the residual time

Let the cumulative distribution function of the holding times $S_i$ be $F(t) = \Pr[S_i \leq t]$ and recall that the renewal function of a process is $m(t) = \operatorname{E}[N(t)]$. Then, for a given time $t$, the cumulative distribution function of $Y(t)$ is calculated as: [2]

$\Phi(x, t) = \Pr[Y(t) \leq x] = F(t+x) - \int_0^t \left[1 - F(t+x-y)\right] \, dm(y)$

Differentiating with respect to $x$, the probability density function can be written as

$\varphi(x, t) = f(t+x) + \int_0^t f(t+x-y) \, dm(y) = f(t+x) + \int_0^t f(u+x) \, m'(t-u) \, du,$

where we have substituted $u = t - y$. From elementary renewal theory, $m'(t) \to 1/\mu$ as $t \to \infty$, where $\mu$ is the mean of the distribution $F$. If we consider the limiting distribution as $t \to \infty$, assuming that $f(t) \to 0$ as $t \to \infty$, we have the limiting pdf as

$\varphi(x) = \frac{1}{\mu} \int_0^{\infty} f(u+x) \, du = \frac{1}{\mu} \int_x^{\infty} f(u) \, du = \frac{1 - F(x)}{\mu}.$

Likewise, the cumulative distribution of the residual time is

$\Phi(x) = \Pr[Y \leq x] = \frac{1}{\mu} \int_0^x \left[1 - F(u)\right] \, du.$

For large $t$, the distribution is independent of $t$, making it a stationary distribution. An interesting fact is that the limiting distribution of the forward recurrence time (or residual time) has the same form as the limiting distribution of the backward recurrence time (or age). This distribution is always J-shaped, with mode at zero.
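
This limiting result can be checked by simulation. The sketch below is an illustrative addition (not from the article): for holding times uniform on $(0,1)$ we have $\mu = 1/2$ and the limiting cumulative distribution works out to $\Phi(x) = 2x - x^2$ on $[0,1]$, which is compared against the empirical distribution of $Y(t)$ at a moderately large $t$.

import random

def residual_time(t, sample_holding_time):
    """Return Y(t) = J_{N(t)+1} - t for one simulated sample path."""
    J = 0.0
    while J <= t:
        J += sample_holding_time()
    return J - t

# Holding times uniform on (0, 1): F(u) = u and mu = 1/2, so the limiting CDF is
# Phi(x) = (1/mu) * integral_0^x (1 - u) du = 2x - x^2 on [0, 1].
random.seed(1)
t = 50.0  # moderately large t, so the limiting distribution should be a good approximation
samples = [residual_time(t, random.random) for _ in range(20000)]

for x in (0.1, 0.25, 0.5, 0.75, 0.9):
    empirical = sum(y <= x for y in samples) / len(samples)
    print(f"x = {x:.2f}   empirical: {empirical:.3f}   limit: {2*x - x*x:.3f}")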

The first two moments of this limiting distribution are:

$E[Y] = \frac{\mu_2}{2\mu_1} = \frac{\mu^2 + \sigma^2}{2\mu},$

$E[Y^2] = \frac{\mu_3}{3\mu_1},$

where $\sigma^2$ is the variance of $F$ and $\mu_2$ and $\mu_3$ are its second and third moments (with $\mu_1 = \mu$ its mean).
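
As a worked check (added here for concreteness, not in the source text), take holding times uniform on $(0, a)$, so that

$\mu_1 = \frac{a}{2}, \qquad \mu_2 = \frac{a^2}{3}, \qquad \mu_3 = \frac{a^3}{4},$

and therefore

$E[Y] = \frac{\mu_2}{2\mu_1} = \frac{a}{3}, \qquad E[Y^2] = \frac{\mu_3}{3\mu_1} = \frac{a^2}{6}.$

In particular $E[Y] = a/3$ exceeds $\mu/2 = a/4$, in line with the waiting time paradox discussed next.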

Waiting time paradox

The fact that $E[Y] = \frac{\mu^2 + \sigma^2}{2\mu} > \frac{\mu}{2}$ (for $\sigma > 0$) is also known variously as the waiting time paradox, inspection paradox, or the paradox of renewal theory. The paradox arises from the fact that the average waiting time until the next renewal, assuming that the reference time point $t$ is selected uniformly at random within the inter-renewal interval, is larger than half the average inter-renewal interval, $\mu/2$. The average waiting time equals $\mu/2$ only when $\sigma = 0$, that is, when the renewals are always punctual or deterministic.
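
A standard concrete example (added for clarity): if the renewals form a Poisson process, the holding times are exponential and $\sigma = \mu$, so

$E[Y] = \frac{\mu^2 + \mu^2}{2\mu} = \mu.$

A passenger arriving at a uniformly random time at a stop served by such "Poisson buses" therefore waits, on average, a full mean headway $\mu$ rather than the naively expected $\mu/2$.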

Special case: Markovian holding times

When the holding times $S_i$ are exponentially distributed with $F(t) = 1 - e^{-\lambda t}$, the residual times are also exponentially distributed. That is because $m(t) = \lambda t$ and:

$\Pr[Y(t) \leq x] = \left[1 - e^{-\lambda (t+x)}\right] - \int_0^t e^{-\lambda (t+x-y)} \, d(\lambda y) = 1 - e^{-\lambda x}.$

This is a known characteristic of the exponential distribution, i.e., its memoryless property. Intuitively, this means that it does not matter how long it has been since the last renewal epoch: the remaining time has the same distribution as at the beginning of the holding time interval.
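
This memoryless behaviour is easy to verify numerically. The sketch below is an illustrative addition (the rate $\lambda = 2$ is an arbitrary choice): it estimates the mean of $Y(t)$ for several values of $t$ and compares it with the holding-time mean $1/\lambda$, which by the argument above should match for every $t$.

import random
import statistics

def residual_time(t, sample_holding_time):
    """Return Y(t) = J_{N(t)+1} - t for one simulated sample path."""
    J = 0.0
    while J <= t:
        J += sample_holding_time()
    return J - t

# Exponential holding times with rate lambda = 2 (mean 0.5); by memorylessness,
# Y(t) should again be exponential with mean 0.5 regardless of t.
random.seed(2)
lam = 2.0
for t in (0.5, 5.0, 50.0):
    ys = [residual_time(t, lambda: random.expovariate(lam)) for _ in range(20000)]
    print(f"t = {t:5.1f}   mean Y(t) ~ {statistics.mean(ys):.3f}   (theory: {1/lam:.3f})")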

Renewal theory texts usually also define the spent time or the backward recurrence time (or the current lifetime) as $Z(t) = t - J_{N(t)}$. Its distribution can be calculated in a similar way to that of the residual time. Likewise, the total life time is the sum of the backward recurrence time and the forward recurrence time: $S_{N(t)+1} = Z(t) + Y(t)$.
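
The same simulation idea covers the backward recurrence time. The sketch below (an illustrative addition, with uniform $(0,1)$ holding times chosen arbitrarily) computes both recurrence times and their sum; the mean of the total life $S_{N(t)+1}$ tends to $\mu_2/\mu_1$ (here $2/3$) rather than $\mu = 1/2$, another face of the inspection paradox.

import random
import statistics

def spent_and_residual(t, sample_holding_time):
    """Return (Z(t), Y(t)): time since the last renewal and time until the next one."""
    prev, J = 0.0, 0.0
    while J <= t:
        prev, J = J, J + sample_holding_time()  # at exit: prev = J_{N(t)}, J = J_{N(t)+1}
    return t - prev, J - t

# Uniform(0, 1) holding times (mean 0.5); the interval containing a large t,
# Z(t) + Y(t) = S_{N(t)+1}, is longer on average than 0.5 (length-biased sampling).
random.seed(3)
totals = [sum(spent_and_residual(50.0, random.random)) for _ in range(20000)]
print(statistics.mean(totals))  # roughly 2/3 for uniform(0, 1), not 1/2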

References

  1. William J. Stewart, "Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling", Princeton University Press, 2011, ISBN 1-4008-3281-0.
  2. Jyotiprasad Medhi, "Stochastic Processes", New Age International, 1994, ISBN 81-224-0549-5.