Matrix analytic method

In probability theory, the matrix analytic method is a technique to compute the stationary probability distribution of a Markov chain which has a repeating structure (after some point) and a state space which grows unboundedly in no more than one dimension. [1] [2] Such models are often described as M/G/1 type Markov chains because they can describe transitions in an M/G/1 queue. [3] [4] The method is a more complicated version of the matrix geometric method and is the classical solution method for M/G/1 chains. [5]

Method description

An M/G/1-type stochastic matrix is one of the form [3]

P = \begin{pmatrix}
      B_0 & B_1 & B_2 & B_3 & \cdots \\
      A_0 & A_1 & A_2 & A_3 & \cdots \\
          & A_0 & A_1 & A_2 & \cdots \\
          &     & A_0 & A_1 & \cdots \\
          &     &     & \ddots & \ddots
    \end{pmatrix}

where Bi and Ai are k × k matrices. (Note that unmarked matrix entries represent zeroes.) Such a matrix describes the embedded Markov chain in an M/G/1 queue. [6] [7] If P is irreducible and positive recurrent then the stationary distribution is given by the solution to the equations [3]

\pi P = \pi \quad \text{and} \quad \pi \mathbf{e} = 1

where e represents a vector of suitable dimension with all values equal to 1. Matching the structure of P, π is partitioned as π0, π1, π2, …; written blockwise, πP = π reads

\pi_i = \pi_0 B_i + \sum_{k=1}^{i+1} \pi_k A_{i-k+1}, \quad i \geq 0.

To compute these probabilities the stochastic matrix G is computed such that [3]

G = \sum_{i=0}^\infty A_i G^i.

G is called the auxiliary matrix; [8] its (r, s) entry is the probability that a chain started in some level n + 1 in phase r first enters level n in phase s. The matrices \overline{A}_i and \overline{B}_i are defined as [3]

\overline{A}_i = \sum_{j=i}^\infty A_j G^{j-i}, \qquad \overline{B}_i = \sum_{j=i}^\infty B_j G^{j-i}

then π0 is found by solving [3]

\pi_0 \overline{B}_0 = \pi_0

with its scale fixed by the overall normalization \pi \mathbf{e} = 1,
and the πi are given by Ramaswami's formula, [3] a numerically stable relationship first published by Vaidyanathan Ramaswami in 1988: [9]

\pi_i = \left( \pi_0 \overline{B}_i + \sum_{j=1}^{i-1} \pi_j \overline{A}_{i-j+1} \right) \left( I - \overline{A}_1 \right)^{-1}, \quad i \geq 1.
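
As a concrete illustration, the following is a minimal numerical sketch of this recursion, assuming the block sequences have been truncated to finitely many nonzero Ai and Bi and that G is already available (for example from one of the iterations in the next section). The function names, truncation level and solver choices are illustrative assumptions, not the implementation of the cited papers.

```python
import numpy as np

def bar(M, G, i):
    """bar(M)_i = sum_{j >= i} M[j] @ G^(j - i) for a finite block list M."""
    k = G.shape[0]
    S, Gp = np.zeros((k, k)), np.eye(k)
    for j in range(i, len(M)):
        S += M[j] @ Gp
        Gp = Gp @ G
    return S  # the zero matrix when i >= len(M)

def ramaswami(B, A, G, n_levels):
    """Approximate pi_0, ..., pi_{n_levels-1} by Ramaswami's recursion.

    The normalization is taken over the retained levels only, so the
    result is an approximation that improves as n_levels grows (the
    pi_i decay for a positive recurrent chain).
    """
    k = G.shape[0]
    # pi_0 solves pi_0 = pi_0 bar(B)_0: the Perron left eigenvector of
    # the chain censored on level 0.
    w, v = np.linalg.eig(bar(B, G, 0).T)
    pi0 = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi0 = pi0 / pi0.sum()                       # provisional scaling
    R = np.linalg.inv(np.eye(k) - bar(A, G, 1))
    pis = [pi0]
    for i in range(1, n_levels):
        acc = pis[0] @ bar(B, G, i)
        for j in range(1, i):
            acc = acc + pis[j] @ bar(A, G, i - j + 1)
        pis.append(acc @ R)
    total = sum(p.sum() for p in pis)           # enforce pi e = 1 (truncated)
    return [p / total for p in pis]
```

Recomputing the bar matrices inside the loop keeps the sketch short but is wasteful; practical solvers precompute them, or evaluate the underlying convolutions by FFT as in Meini's variant of the formula. [3]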

Computation of G

There are two popular iterative methods for computing G: [10] [11] functional iterations and cyclic reduction. A sketch of the simplest functional iteration follows.
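
Below is a minimal sketch of the natural functional iteration started from the zero matrix, again under the assumption of finitely many nonzero blocks; the tolerance and iteration cap are illustrative choices. Started from zero, this iteration is known to converge monotonically to the minimal nonnegative solution of G = Σ Ai G^i, though only linearly; cyclic reduction attains quadratic convergence at the cost of a more involved algorithm. [10] [11]

```python
import numpy as np

def compute_G(A, tol=1e-12, max_iter=10_000):
    """Natural fixed-point iteration G <- sum_i A[i] @ G^i, from G = 0.

    A is a finite list [A0, A1, ...] of k x k nonnegative blocks whose
    sum is row stochastic.
    """
    k = A[0].shape[0]
    G = np.zeros((k, k))
    for _ in range(max_iter):
        Gp, new = np.eye(k), np.zeros((k, k))
        for Ai in A:              # evaluate sum_i A_i G^i term by term
            new += Ai @ Gp
            Gp = Gp @ G
        if np.abs(new - G).max() < tol:
            return new
        G = new
    raise RuntimeError("functional iteration did not converge")
```

For a positive recurrent chain the rows of the computed G sum to one (up to the tolerance), and the result can be passed directly to the ramaswami sketch above.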

Tools

MAMSolver, a matrix analytic methods tool. [12]

References

  1. Harchol-Balter, M. (2012). "Phase-Type Distributions and Matrix-Analytic Methods". Performance Modeling and Design of Computer Systems. pp. 359–379. doi:10.1017/CBO9781139226424.028. ISBN 9781139226424.
  2. Neuts, M. F. (1984). "Matrix-analytic methods in queuing theory". European Journal of Operational Research. 15: 2–12. doi:10.1016/0377-2217(84)90034-1.
  3. Meini, B. (1997). "An improved FFT-based version of Ramaswami's formula". Communications in Statistics. Stochastic Models. 13 (2): 223–238. doi:10.1080/15326349708807423.
  4. Stathopoulos, A.; Riska, A.; Hua, Z.; Smirni, E. (2005). "Bridging ETAQA and Ramaswami's formula for the solution of M/G/1-type processes". Performance Evaluation. 62 (1–4): 331–348. CiteSeerX 10.1.1.80.9473. doi:10.1016/j.peva.2005.07.003.
  5. Riska, A.; Smirni, E. (2002). "M/G/1-Type Markov Processes: A Tutorial" (PDF). Performance Evaluation of Complex Systems: Techniques and Tools. Lecture Notes in Computer Science. Vol. 2459. p. 36. doi:10.1007/3-540-45798-4_3. ISBN 978-3-540-44252-3.
  6. Bolch, Gunter; Greiner, Stefan; de Meer, Hermann; Shridharbhai Trivedi, Kishor (2006). Queueing Networks and Markov Chains: Modeling and Performance Evaluation with Computer Science Applications (2nd ed.). John Wiley & Sons, Inc. p. 250. ISBN 978-0471565253.
  7. Artalejo, Jesús R.; Gómez-Corral, Antonio (2008). "The Matrix-Analytic Formalism". Retrial Queueing Systems. pp. 187–205. doi:10.1007/978-3-540-78725-9_7. ISBN 978-3-540-78724-2.
  8. Riska, A.; Smirni, E. (2002). "Exact aggregate solutions for M/G/1-type Markov processes". ACM SIGMETRICS Performance Evaluation Review. 30: 86. CiteSeerX 10.1.1.109.2225. doi:10.1145/511399.511346.
  9. Ramaswami, V. (1988). "A stable recursion for the steady state vector in Markov chains of M/G/1 type". Communications in Statistics. Stochastic Models. 4: 183–188. doi:10.1080/15326348808807077.
  10. Bini, D. A.; Latouche, G.; Meini, B. (2005). Numerical Methods for Structured Markov Chains. doi:10.1093/acprof:oso/9780198527688.001.0001. ISBN 9780198527688.
  11. Meini, B. (1998). "Solving M/G/1 type Markov chains: Recent advances and applications". Communications in Statistics. Stochastic Models. 14 (1–2): 479–496. doi:10.1080/15326349808807483.
  12. Riska, A.; Smirni, E. (2002). "MAMSolver: A Matrix Analytic Methods Tool". Computer Performance Evaluation: Modelling Techniques and Tools. Lecture Notes in Computer Science. Vol. 2324. p. 205. CiteSeerX 10.1.1.146.2080. doi:10.1007/3-540-46029-2_14. ISBN 978-3-540-43539-6.