Piecewise-deterministic Markov process

In probability theory, a piecewise-deterministic Markov process (PDMP) is a process whose state changes by random jumps at random points in time, but which evolves deterministically, following an ordinary differential equation, between those jumps. The class of models is "wide enough to include as special cases virtually all the non-diffusion models of applied probability." [1] The process is defined by three quantities: the flow, the jump rate, and the transition measure. [2]
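
As an illustration of these three ingredients, the following is a minimal simulator for a toy PDMP (not taken from the cited sources; the flow, jump rate, and transition kernel chosen here are hypothetical): the state decays according to dx/dt = −x between jumps (the flow), jumps arrive at a constant rate, and each jump adds one to the state (the transition measure).

```python
import math
import random

def simulate_pdmp(x0, rate, horizon, seed=0):
    """Simulate a toy PDMP: flow dx/dt = -x between jumps,
    jumps at a constant rate, each jump adding +1 to the state.
    Returns the final state and the list of jump times."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    jump_times = []
    while True:
        # With a constant jump rate, inter-jump times are exponential.
        dt = rng.expovariate(rate)
        if t + dt > horizon:
            # Flow deterministically to the end of the horizon.
            x *= math.exp(-(horizon - t))
            return x, jump_times
        # Flow deterministically up to the jump, then apply the
        # transition measure (here a deterministic +1 kernel).
        x = x * math.exp(-dt) + 1.0
        t += dt
        jump_times.append(t)
```

For a state-dependent jump rate, the exponential draw would be replaced by thinning or by inverting the integrated rate along the flow; the constant-rate case keeps the sketch short.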

The model was first introduced in a paper by Mark H. A. Davis in 1984. [1]

Examples

Piecewise linear models such as Markov chains, continuous-time Markov chains, the M/G/1 queue, the GI/G/1 queue and the fluid queue can be encapsulated as PDMPs with simple differential equations. [1]
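
For example, the virtual waiting time (workload) of an M/G/1 queue is a PDMP: between Poisson arrivals the workload drains at unit rate (the flow), arrivals occur at rate λ (the jump rate), and each arrival adds an independent service requirement (the transition measure). A hedged sketch, with illustrative names:

```python
import random

def mg1_workload(lam, service_sampler, horizon, seed=0):
    """Virtual waiting time of an M/G/1 queue viewed as a PDMP:
    flow dx/dt = -1 while x > 0, jump rate lam (Poisson arrivals),
    transition measure x -> x + S with S drawn from the service
    distribution via service_sampler."""
    rng = random.Random(seed)
    t, x = 0.0, 0.0
    while True:
        dt = rng.expovariate(lam)              # time to next arrival
        if t + dt > horizon:
            return max(0.0, x - (horizon - t)) # drain until the horizon
        x = max(0.0, x - dt)                   # deterministic unit-rate drain
        x += service_sampler(rng)              # jump: add the new job's work
        t += dt
```

With, say, lam=1.0 and a deterministic service time of 0.5 the load is ρ = 0.5 and the workload process is stable.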

Applications

PDMPs have proved useful in ruin theory, [3] queueing theory, [4] [5] in modelling biochemical processes such as DNA replication in eukaryotes and subtilin production by the organism B. subtilis, [6] and in modelling earthquakes. [7] Moreover, this class of processes has been shown to be appropriate for biophysical neuron models with stochastic ion channels. [8]

Properties

Löpker and Palmowski have shown conditions under which a time-reversed PDMP is again a PDMP. [9] General conditions are also known under which PDMPs are stable. [10]

Galtier et al. [11] studied the law of the trajectories of a PDMP and provided a reference measure with which to express the density of a PDMP trajectory. Their work opens the way to any application that uses trajectory densities: for instance, they used such densities to perform importance sampling, and this work was further developed by Chennetier et al. [12] to estimate the reliability of industrial systems.

Related Research Articles

Stochastic process

In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a family of random variables. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. Examples include the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule. Stochastic processes have applications in many disciplines such as biology, chemistry, ecology, neuroscience, physics, image processing, signal processing, control theory, information theory, computer science, cryptography and telecommunications. Furthermore, seemingly random changes in financial markets have motivated the extensive use of stochastic processes in finance.

Queueing theory

Queueing theory is the mathematical study of waiting lines, or queues. A queueing model is constructed so that queue lengths and waiting time can be predicted. Queueing theory is generally considered a branch of operations research because the results are often used when making business decisions about the resources needed to provide a service.

Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
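
The idea of repeated random sampling can be shown with the textbook example of estimating π (a generic illustration, not tied to the PDMP literature): the fraction of uniform points in the unit square that land inside the unit quarter-circle approaches π/4.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: count uniform points in the unit
    square that fall inside the unit quarter-circle."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples
```

The error of such an estimator shrinks like 1/sqrt(n), which is why Monte Carlo methods shine in problems where deterministic quadrature is impractical.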

Markov chain

A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov.
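
The memoryless update can be made concrete with a tiny transition matrix (the two-state "weather" chain below is a standard illustration, not from this article): the distribution over states evolves by repeated multiplication with the transition matrix, converging to the chain's stationary distribution.

```python
def step_distribution(dist, P):
    """One step of a discrete-time Markov chain: multiply the current
    distribution (a row vector) by the transition matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical two-state chain: state 0 = sunny, state 1 = rainy.
P = [[0.9, 0.1],
     [0.5, 0.5]]
dist = [1.0, 0.0]        # start sunny with certainty
for _ in range(50):      # iterate toward the stationary distribution
    dist = step_distribution(dist, P)
```

For this matrix the stationary distribution is (5/6, 1/6), which the iteration reaches to high accuracy within a few dozen steps.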

A hybrid system is a dynamical system that exhibits both continuous and discrete dynamic behavior – a system that can both flow and jump. Often, the term "hybrid dynamical system" is used, to distinguish over hybrid systems such as those that combine neural nets and fuzzy logic, or electrical and mechanical drivelines. A hybrid system has the benefit of encompassing a larger class of systems within its structure, allowing for more flexibility in modeling dynamic phenomena.

A mathematical or physical process is time-reversible if the dynamics of the process remain well-defined when the sequence of time-states is reversed.

In queueing models, a discipline within the mathematical theory of probability, the quasi-birth–death process describes a generalisation of the birth–death process. As with the birth-death process it moves up and down between levels one at a time, but the time between these transitions can have a more complicated distribution encoded in the blocks.

Jump diffusion is a stochastic process that involves jumps and diffusion. It has important applications in magnetic reconnection, coronal mass ejections, condensed matter physics, option pricing, and pattern theory and computational vision.
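
A minimal Euler-scheme sketch of a jump-diffusion path (the Gaussian jump sizes and all parameter names below are illustrative choices, not a specific model from the literature): the diffusion part advances by a drift and a Brownian increment, and a compound-Poisson jump is superimposed with probability ≈ λ·dt per step.

```python
import math
import random

def jump_diffusion_path(x0, mu, sigma, lam, jump_scale, T, n_steps, seed=0):
    """Euler scheme for a jump-diffusion dX = mu dt + sigma dW + dJ,
    where J is a compound Poisson process with Gaussian jump sizes."""
    rng = random.Random(seed)
    dt = T / n_steps
    x = x0
    path = [x]
    for _ in range(n_steps):
        # Diffusion part: drift plus Brownian increment.
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # Jump part: with probability ~ lam*dt a jump occurs.
        if rng.random() < lam * dt:
            x += rng.gauss(0.0, jump_scale)
        path.append(x)
    return path
```

Setting lam = 0 recovers a plain diffusion; setting sigma = 0 leaves a pure jump process with deterministic drift, which is the PDMP-like special case.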

In probability theory, a product-form solution is a particularly efficient form of solution for determining some metric of a system with distinct sub-components, where the metric for the collection of components can be written as a product of the metric across the different components. Using capital Pi notation, a product-form solution has the algebraic form P(x1, x2, …, xn) = B ∏i P(xi), where B is a normalizing constant.

Fork–join queue

In queueing theory, a discipline within the mathematical theory of probability, a fork–join queue is a queue where incoming jobs are split on arrival for service by numerous servers and joined before departure. The model is often used for parallel computations or systems where products need to be obtained simultaneously from different suppliers. The key quantity of interest in this model is usually the time taken to service a complete job. The model has been described as a "key model for the performance analysis of parallel and distributed systems." Few analytical results exist for fork–join queues, but various approximations are known.

In queueing theory, a discipline within the mathematical theory of probability, a fluid queue is a mathematical model used to describe the fluid level in a reservoir subject to randomly determined periods of filling and emptying. The term dam theory was used in earlier literature for these models. The model has been used to approximate discrete models, model the spread of wildfires, in ruin theory and to model high speed data networks. The model applies the leaky bucket algorithm to a stochastic source.

Mark Herbert Ainsworth Davis was Professor of Mathematics at Imperial College London. He made fundamental contributions to the theory of stochastic processes, stochastic control and mathematical finance.

In probability theory, reflected Brownian motion is a Wiener process in a space with reflecting boundaries. In the physical literature, this process describes diffusion in a confined space and it is often called confined Brownian motion. For example it can describe the motion of hard spheres in water confined between two walls.

In queueing theory, a discipline within the mathematical theory of probability, an M/D/1 queue represents the queue length in a system having a single server, where arrivals are determined by a Poisson process and job service times are fixed (deterministic). The model name is written in Kendall's notation. Agner Krarup Erlang first published on this model in 1909, starting the subject of queueing theory. An extension of this model with more than one server is the M/D/c queue.
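
The waiting times of such a queue can be simulated with the Lindley recursion W_{k+1} = max(0, W_k + d − A_k), where d is the fixed service time and A_k an exponential interarrival time, and checked against the Pollaczek–Khinchine mean waiting time ρd/(2(1−ρ)) with ρ = λd. A sketch with illustrative names:

```python
import random

def md1_mean_wait(lam, d, n_customers, seed=0):
    """Average waiting time in an M/D/1 queue via the Lindley
    recursion W_{k+1} = max(0, W_k + d - A_k), with exponential
    interarrival times A_k of rate lam and fixed service time d."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        a = rng.expovariate(lam)       # next interarrival time
        w = max(0.0, w + d - a)        # Lindley recursion
    return total / n_customers
```

With lam = 0.5 and d = 1 the load is ρ = 0.5 and the theoretical mean wait is 0.5, which a long simulation run approximates.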

In probability theory, the matrix geometric method is a method for the analysis of quasi-birth–death processes, continuous-time Markov chains whose transition rate matrices have a repetitive block structure. The method was developed "largely by Marcel F. Neuts and his students starting around 1975."
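
The heart of the method is computing the minimal solution R of the quadratic matrix equation A0 + R·A1 + R²·A2 = 0 by fixed-point iteration. The sketch below uses 1×1 blocks (the M/M/1 queue as a degenerate QBD), where the blocks are scalars a0 = λ, a1 = −(λ+μ), a2 = μ and the minimal solution is R = λ/μ; all function names are illustrative.

```python
def solve_r(a0, a1, a2, tol=1e-12):
    """Fixed-point iteration for the rate 'matrix' R in the scalar
    (1x1-block) QBD equation a0 + R*a1 + R^2*a2 = 0, started from 0
    so that the iteration converges to the minimal solution."""
    r = 0.0
    while True:
        r_next = (a0 + r * r * a2) / (-a1)
        if abs(r_next - r) < tol:
            return r_next
        r = r_next

lam, mu = 1.0, 2.0
r = solve_r(lam, -(lam + mu), mu)   # converges to lam/mu = 0.5
```

In the genuine matrix case the division by −a1 becomes multiplication by the inverse of −A1, and the stationary probabilities of level n are then π_n = π_0 Rⁿ.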

In queueing theory, a discipline within the mathematical theory of probability, a fluid limit, fluid approximation or fluid analysis of a stochastic model is a deterministic real-valued process which approximates the evolution of a given stochastic process, usually subject to some scaling or limiting criteria.

Mean-field particle methods are a broad class of interacting-type Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states. A natural way to simulate these sophisticated nonlinear Markov processes is to sample a large number of copies of the process, replacing the unknown distributions of the random states in the evolution equation by the sampled empirical measures. In contrast with traditional Monte Carlo and Markov chain Monte Carlo methods, these mean-field particle techniques rely on sequential interacting samples. The terminology mean-field reflects the fact that each of the samples interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes. In other words, starting with a chaotic configuration based on independent copies of the initial state of the nonlinear Markov chain model, the chaos propagates at any time horizon as the size of the system tends to infinity; that is, finite blocks of particles reduce to independent copies of the nonlinear Markov process. This result is called the propagation of chaos property. The terminology "propagation of chaos" originated with the work of Mark Kac in 1976 on a colliding mean-field kinetic gas model.
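
A minimal sketch of the particle approximation (the linear "pull towards the mean" dynamics below are an illustrative choice of nonlinear Markov process, not from this article): each particle's drift depends on the empirical measure of the whole sample, here only through its mean.

```python
import math
import random

def mean_field_step(xs, dt, sigma, rng):
    """One Euler step of a simple McKean-Vlasov system: each particle
    is pulled toward the empirical mean of the sample and perturbed by
    independent Gaussian noise."""
    m = sum(xs) / len(xs)   # the empirical measure enters via its mean
    noise = math.sqrt(dt) * sigma
    return [x - (x - m) * dt + rng.gauss(0.0, noise) for x in xs]

rng = random.Random(0)
xs = [rng.gauss(0.0, 1.0) for _ in range(2000)]   # chaotic initial configuration
for _ in range(200):
    xs = mean_field_step(xs, 0.05, 0.5, rng)
```

As the number of particles grows, the empirical mean stabilises and the particle cloud settles around the deterministic law of the limiting nonlinear process.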

Richard H. Stockbridge

Richard H. Stockbridge is a Distinguished Professor of Mathematics at the University of Wisconsin-Milwaukee. His contributions to research primarily involve stochastic control theory, optimal stopping and mathematical finance. Most notably, alongside Professors Thomas G. Kurtz, Kurt Helmes, and Chao Zhu, he developed the methodology of using linear programming to solve stochastic control problems.

Damir Filipović

Damir Filipović is a Swiss mathematician specializing in quantitative finance. He holds the Swissquote Chair in Quantitative Finance and is the director of the Swiss Finance Institute at EPFL.

Eugene A. Feinberg

Eugene A. Feinberg is an American mathematician and distinguished professor of applied mathematics and statistics at Stony Brook University. He is noted for his work in probability theory, real analysis, and Markov decision processes.

References

  1. Davis, M. H. A. (1984). "Piecewise-Deterministic Markov Processes: A General Class of Non-Diffusion Stochastic Models". Journal of the Royal Statistical Society, Series B (Methodological). 46 (3): 353–388. doi:10.1111/j.2517-6161.1984.tb01308.x. JSTOR 2345677.
  2. Costa, O. L. V.; Dufour, F. (2010). "Average Continuous Control of Piecewise Deterministic Markov Processes". SIAM Journal on Control and Optimization. 48 (7): 4262. arXiv:0809.0477. doi:10.1137/080718541.
  3. Embrechts, P.; Schmidli, H. (1994). "Ruin Estimation for a General Insurance Risk Model". Advances in Applied Probability. 26 (2): 404–422. doi:10.2307/1427443. JSTOR 1427443.
  4. Browne, Sid; Sigman, Karl (1992). "Work-Modulated Queues with Applications to Storage Processes". Journal of Applied Probability. 29 (3): 699–712. doi:10.2307/3214906. JSTOR 3214906.
  5. Boxma, O.; Kaspi, H.; Kella, O.; Perry, D. (2005). "On/off Storage Systems with State-Dependent Input, Output, and Switching Rates". Probability in the Engineering and Informational Sciences. 19. CiteSeerX 10.1.1.556.6718. doi:10.1017/S0269964805050011.
  6. Cassandras, Christos G.; Lygeros, John (2007). "Chapter 9. Stochastic Hybrid Modeling of Biochemical Processes" (PDF). Stochastic Hybrid Systems. CRC Press. ISBN 9780849390838.
  7. Ogata, Y.; Vere-Jones, D. (1984). "Inference for earthquake models: A self-correcting model". Stochastic Processes and Their Applications. 17 (2): 337. doi:10.1016/0304-4149(84)90009-7.
  8. Pakdaman, K.; Thieullen, M.; Wainrib, G. (September 2010). "Fluid limit theorems for stochastic hybrid systems with application to neuron models". Advances in Applied Probability. 42 (3): 761–794. arXiv:1001.2474. doi:10.1239/aap/1282924062.
  9. Löpker, A.; Palmowski, Z. (2013). "On time reversal of piecewise deterministic Markov processes". Electronic Journal of Probability. 18. arXiv:1110.3813. doi:10.1214/EJP.v18-1958.
  10. Costa, O. L. V.; Dufour, F. (2008). "Stability and Ergodicity of Piecewise Deterministic Markov Processes" (PDF). SIAM Journal on Control and Optimization. 47 (2): 1053. doi:10.1137/060670109.
  11. Galtier, T. (2019). "On the optimal importance process for piecewise deterministic Markov process". ESAIM: PS. 23. doi:10.1051/ps/2019015.
  12. Chennetier, G. (2022). "Adaptive importance sampling based on fault tree analysis for piecewise deterministic Markov process" (PDF). Preprint.