Markovian arrival process

In queueing theory, a discipline within the mathematical theory of probability, a Markovian arrival process (MAP or MArP [1]) is a mathematical model for the time between job arrivals to a system. The simplest such process is a Poisson process where the time between each arrival is exponentially distributed. [2] [3]

The processes were first suggested by Marcel F. Neuts in 1979. [2] [4]

Definition

A Markov arrival process is defined by two matrices, D0 and D1, where elements of D0 represent hidden transitions and elements of D1 observable transitions (arrivals). The block matrix Q below is a transition rate matrix for a continuous-time Markov chain. [5]

    Q = [ D0  D1   0   0  ⋯
           0  D0  D1   0  ⋯
           0   0  D0  D1  ⋯
           ⋮   ⋮   ⋱   ⋱  ⋱ ]

The simplest example is a Poisson process, where D0 = −λ and D1 = λ: there is only one possible transition, it is observable, and it occurs at rate λ. For Q to be a valid transition rate matrix, the following restrictions apply to the Di:

    0 ≤ (D1)i,j < ∞
    0 ≤ (D0)i,j < ∞ for i ≠ j
    (D0)i,i < 0
    (D0 + D1)1 = 0, where 1 denotes a column vector of ones
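
Read as a recipe, the definition says: sit in a phase, wait an exponentially distributed time governed by the diagonal of D0, then jump according to either an off-diagonal entry of D0 (hidden, no arrival) or an entry of D1 (observable, an arrival). The following sketch simulates a MAP this way; it is not from the article, and the function name simulate_map, the initial phase distribution alpha, and the reliance on NumPy are assumptions made for illustration.

```python
# Minimal sketch (illustrative, not from the article): simulate arrival times of
# a MAP given D0 (hidden transitions) and D1 (observable transitions).
import numpy as np

def simulate_map(D0, D1, alpha, n_arrivals, seed=None):
    """Return the first n_arrivals arrival times of the MAP (D0, D1).
    alpha is an assumed initial phase distribution, not part of (D0, D1)."""
    rng = np.random.default_rng(seed)
    D0, D1 = np.asarray(D0, float), np.asarray(D1, float)
    m = D0.shape[0]
    phase = rng.choice(m, p=alpha)            # initial phase of the underlying CTMC
    t, arrivals = 0.0, []
    while len(arrivals) < n_arrivals:
        rate = -D0[phase, phase]              # total rate of leaving the current phase
        t += rng.exponential(1.0 / rate)      # exponential holding time
        hidden = D0[phase].copy()
        hidden[phase] = 0.0                   # off-diagonal entries of D0: hidden moves
        probs = np.concatenate([hidden, D1[phase]]) / rate
        event = rng.choice(2 * m, p=probs)    # pick the next transition
        if event >= m:                        # a D1 transition is observable: record an arrival
            arrivals.append(t)
            phase = event - m
        else:                                 # a D0 transition is hidden: no arrival
            phase = event
    return np.array(arrivals)

# Example: the Poisson process of rate 1.5 is the MAP with D0 = [[-1.5]], D1 = [[1.5]].
print(simulate_map([[-1.5]], [[1.5]], alpha=[1.0], n_arrivals=5, seed=0))
```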

Special cases

Phase-type renewal process

The phase-type renewal process is a Markov arrival process with phase-type distributed sojourn times between arrivals. For example, if an arrival process has an interarrival time distribution PH(α, S) with an exit vector denoted S0 = −S1, the arrival process has generator matrix

    Q = [ S  S0α    0    0  ⋯
          0    S  S0α    0  ⋯
          0    0    S  S0α  ⋯
          ⋮    ⋮    ⋱    ⋱  ⋱ ]

so that D0 = S and D1 = S0α.
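
As an illustration of this construction (a sketch rather than anything from the article), the snippet below builds D0 and D1 from a PH(α, S) representation; the helper name ph_renewal_map and the Erlang-2 example values are assumptions.

```python
# Minimal sketch: MAP representation (D0, D1) of a phase-type renewal process.
import numpy as np

def ph_renewal_map(alpha, S):
    """MAP of a renewal process with PH(alpha, S) interarrival times."""
    alpha, S = np.asarray(alpha, float), np.asarray(S, float)
    S0 = -S.sum(axis=1)          # exit vector S0 = -S·1
    D0 = S                       # hidden transitions: movement between PH phases
    D1 = np.outer(S0, alpha)     # an arrival restarts the PH distribution in alpha
    return D0, D1

# Example (assumed values): Erlang-2 interarrival times with rate mu per stage.
mu = 2.0
alpha = [1.0, 0.0]
S = [[-mu, mu], [0.0, -mu]]
D0, D1 = ph_renewal_map(alpha, S)
assert np.allclose((D0 + D1).sum(axis=1), 0.0)   # valid MAP: rows of D0 + D1 sum to 0
```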

Generalizations

Batch Markov arrival process

The batch Markovian arrival process (BMAP) is a generalisation of the Markovian arrival process that allows more than one arrival at a time. [6] [7] The homogeneous case has rate matrix

    Q = [ D0  D1  D2  D3  ⋯
           0  D0  D1  D2  ⋯
           0   0  D0  D1  ⋯
           ⋮   ⋮   ⋱   ⋱  ⋱ ]

An arrival of size k occurs every time a transition occurs in the sub-matrix Dk (k ≥ 1). Sub-matrices Dk have elements λi,j, the rate of a Poisson process, such that

    0 ≤ (Dk)i,j < ∞ for k ≥ 1
    0 ≤ (D0)i,j < ∞ for i ≠ j
    (D0)i,i < 0

and

    (D0 + D1 + D2 + ⋯)1 = 0
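
To make these conditions concrete, the sketch below checks a finite family D0, D1, …, DK against them; it is an illustration rather than anything from the cited tutorials, and the helper name is_valid_bmap and the one-phase example values are assumptions.

```python
# Minimal sketch: validity check for a (truncated) BMAP given as [D0, D1, ..., DK].
import numpy as np

def is_valid_bmap(D, tol=1e-9):
    """D is a list [D0, D1, ..., DK] of equally sized square matrices."""
    D = [np.asarray(Dk, float) for Dk in D]
    D0, batches = D[0], D[1:]
    off_diag = D0 - np.diag(np.diag(D0))
    if (off_diag < 0).any() or (np.diag(D0) >= 0).any():
        return False                      # D0: non-negative off-diagonal, negative diagonal
    if any((Dk < 0).any() for Dk in batches):
        return False                      # every batch matrix Dk, k >= 1, must be non-negative
    return bool(np.allclose(sum(D).sum(axis=1), 0.0, atol=tol))  # rows of the sum are zero

# Example with one phase: single arrivals at rate 1.0 and batches of two at rate 0.5.
print(is_valid_bmap([[[-1.5]], [[1.0]], [[0.5]]]))   # True
```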

Markov-modulated Poisson process

The Markov-modulated Poisson process (MMPP) is a MAP in which m Poisson processes are switched between by an underlying continuous-time Markov chain. [8] If each of the m Poisson processes has rate λi and the modulating continuous-time Markov chain has m × m transition rate matrix R, then the MAP representation is

    D1 = diag(λ1, …, λm)
    D0 = R − D1.
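
The construction is mechanical enough to state in a few lines of code; the snippet below is a sketch of mine, with the function name mmpp_to_map and the two-state example values chosen for illustration.

```python
# Minimal sketch: MAP representation (D0, D1) of an MMPP.
import numpy as np

def mmpp_to_map(R, rates):
    """Return (D0, D1) for the MMPP with modulating generator R and rates λ1..λm."""
    R = np.asarray(R, float)
    D1 = np.diag(rates)          # arrivals occur at the Poisson rate of the current state
    D0 = R - D1                  # hidden transitions of the modulating chain, adjusted diagonal
    return D0, D1

# Example (assumed values): two states, rates 1 and 10, switching at rates 0.2 and 0.5.
R = [[-0.2, 0.2], [0.5, -0.5]]
D0, D1 = mmpp_to_map(R, [1.0, 10.0])
assert np.allclose((D0 + D1).sum(axis=1), 0.0)   # rows of D0 + D1 sum to zero
```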

Fitting

A MAP can be fitted using an expectation–maximization algorithm. [9]
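
A building block of such fitting procedures is the likelihood of observed inter-arrival times, which for a MAP has the product form π exp(D0·x1) D1 exp(D0·x2) D1 ⋯ 1. The sketch below evaluates this log-likelihood; it is not taken from the cited paper, and the function name map_loglikelihood, the stationary-start assumption, and the use of NumPy/SciPy are assumptions made for illustration.

```python
# Minimal sketch: log-likelihood of inter-arrival times under a MAP (D0, D1),
# assuming the chain starts in the stationary phase distribution at an arrival.
import numpy as np
from scipy.linalg import expm

def map_loglikelihood(D0, D1, interarrival_times):
    D0, D1 = np.asarray(D0, float), np.asarray(D1, float)
    P = np.linalg.solve(-D0, D1)              # embedded chain at arrival epochs: (-D0)^-1 D1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi = pi / pi.sum()                        # stationary distribution: pi P = pi
    loglik, vec = 0.0, pi
    for x in interarrival_times:
        vec = vec @ expm(D0 * x) @ D1         # density term for this inter-arrival gap
        s = vec.sum()
        loglik += np.log(s)
        vec = vec / s                         # renormalise to avoid numerical underflow
    return loglik

# Example: three gaps under the Poisson-process MAP with rate 2.
print(map_loglikelihood([[-2.0]], [[2.0]], [0.3, 0.7, 0.1]))
```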

Software

KPC-Toolbox: a library of MATLAB scripts for fitting a MAP to trace data. [10]

References

  1. Asmussen, S. R. (2003). "Markov Additive Models". Applied Probability and Queues. Stochastic Modelling and Applied Probability. Vol. 51. pp. 302–339. doi:10.1007/0-387-21525-5_11. ISBN   978-0-387-00211-8.
  2. Asmussen, S. (2000). "Matrix-analytic Models and their Analysis". Scandinavian Journal of Statistics. 27 (2): 193–226. doi:10.1111/1467-9469.00186. JSTOR   4616600. S2CID   122810934.
  3. Chakravarthy, S. R. (2011). "Markovian Arrival Processes". Wiley Encyclopedia of Operations Research and Management Science. doi:10.1002/9780470400531.eorms0499. ISBN   9780470400531.
  4. Neuts, Marcel F. (1979). "A Versatile Markovian Point Process". Journal of Applied Probability. Applied Probability Trust. 16 (4): 764–779. doi:10.2307/3213143. JSTOR   3213143. S2CID   123525892.
  5. Casale, G. (2011). "Building accurate workload models using Markovian arrival processes". ACM SIGMETRICS Performance Evaluation Review. 39: 357. doi:10.1145/2007116.2007176.
  6. Lucantoni, D. M. (1993). "The BMAP/G/1 queue: A tutorial". Performance Evaluation of Computer and Communication Systems. Lecture Notes in Computer Science. Vol. 729. pp. 330–358. doi:10.1007/BFb0013859. ISBN   3-540-57297-X. S2CID   35110866.
  7. Singh, Gagandeep; Gupta, U. C.; Chaudhry, M. L. (2016). "Detailed computational analysis of queueing-time distributions of the BMAP/G/1 queue using roots". Journal of Applied Probability. 53 (4): 1078–1097. doi:10.1017/jpr.2016.66. S2CID   27505255.
  8. Fischer, W.; Meier-Hellstern, K. (1993). "The Markov-modulated Poisson process (MMPP) cookbook". Performance Evaluation. 18 (2): 149. doi:10.1016/0166-5316(93)90035-S.
  9. Buchholz, P. (2003). "An EM-Algorithm for MAP Fitting from Real Traffic Data". Computer Performance Evaluation. Modelling Techniques and Tools. Lecture Notes in Computer Science. Vol. 2794. pp. 218–236. doi:10.1007/978-3-540-45232-4_14. ISBN   978-3-540-40814-7.
  10. Casale, G.; Zhang, E. Z.; Smirni, E. (2008). "KPC-Toolbox: Simple Yet Effective Trace Fitting Using Markovian Arrival Processes" (PDF). 2008 Fifth International Conference on Quantitative Evaluation of Systems. p. 83. doi:10.1109/QEST.2008.33. ISBN   978-0-7695-3360-5. S2CID   252444.