In probability theory, a piecewise-deterministic Markov process (PDMP) is a process whose behaviour is governed by random jumps at points in time, but whose evolution is deterministically governed by an ordinary differential equation between those times. The class of models is "wide enough to include as special cases virtually all the non-diffusion models of applied probability." [1] The process is defined by three quantities: the flow, the jump rate, and the transition measure. [2]
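As a rough illustration (not taken from the cited references), the Python sketch below simulates a toy PDMP using these three quantities: an assumed decay flow dx/dt = −x between jumps, a constant jump rate, and a transition measure that adds a uniform random increment at each jump.

```python
import random
import math

# A minimal PDMP sketch (illustrative assumptions, not from the cited papers):
#   flow:               dx/dt = -x        (deterministic decay between jumps)
#   jump rate:          lam(x) = 1.0      (constant, so inter-jump times are Exp(1))
#   transition measure: at a jump, x -> x + U with U ~ Uniform(0, 1)

def simulate_pdmp(x0=1.0, t_end=10.0, seed=0):
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while t < t_end:
        # With a constant jump rate, the waiting time to the next jump is exponential.
        wait = rng.expovariate(1.0)
        t_next = min(t + wait, t_end)
        # Follow the deterministic flow x(t) = x * exp(-(t_next - t)) up to the jump.
        x = x * math.exp(-(t_next - t))
        t = t_next
        path.append((t, x))
        if t < t_end:
            # Apply the transition measure: jump to a new state.
            x = x + rng.uniform(0.0, 1.0)
            path.append((t, x))
    return path

if __name__ == "__main__":
    for t, x in simulate_pdmp()[:6]:
        print(f"t = {t:.3f}, x = {x:.3f}")
```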
The model was first introduced in a paper by Mark H. A. Davis in 1984. [1]
Piecewise linear models such as Markov chains, continuous-time Markov chains, the M/G/1 queue, the GI/G/1 queue and the fluid queue can be encapsulated as PDMPs with simple differential equations. [1]
PDMPs have been shown useful in ruin theory, [3] queueing theory, [4] [5] for modelling biochemical processes such as DNA replication in eukaryotes and subtilin production by the organism B. subtilis, [6] and for modelling earthquakes. [7] Moreover, this class of processes has been shown to be appropriate for biophysical neuron models with stochastic ion channels. [8]
Löpker and Palmowski have shown conditions under which a time reversed PDMP is a PDMP. [9] General conditions are known for PDMPs to be stable. [10]
Galtier et al. [11] studied the law of the trajectories of a PDMP and provided a reference measure with which a density can be defined for a trajectory of the PDMP. Their work opens the way to applications that use trajectory densities. (For instance, they used trajectory densities to perform importance sampling; this work was further developed by Chennetier et al. [12] to estimate the reliability of industrial systems.)
Queueing theory is the mathematical study of waiting lines, or queues. A queueing model is constructed so that queue lengths and waiting time can be predicted. Queueing theory is generally considered a branch of operations research because the results are often used when making business decisions about the resources needed to provide a service.
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution.
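As an illustration of this idea, the sketch below is a minimal random-walk Metropolis–Hastings sampler in Python; the standard normal target and the proposal step size are assumptions chosen only for the example.

```python
import random
import math

# A minimal random-walk Metropolis-Hastings sketch targeting a standard normal
# density (the target and step size are illustrative choices, not from the text).

def log_target(x):
    return -0.5 * x * x          # log-density of N(0, 1), up to a constant

def metropolis_hastings(n_steps=10000, step=1.0, seed=0):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)           # symmetric proposal
        log_accept = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_accept:       # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)                             # chain state after this step
    return samples

if __name__ == "__main__":
    draws = metropolis_hastings()
    mean = sum(draws) / len(draws)
    print(f"sample mean ≈ {mean:.3f} (target mean is 0)")
```

Running longer chains brings the empirical distribution of the samples closer to the target, as the paragraph above describes.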
A hybrid system is a dynamical system that exhibits both continuous and discrete dynamic behavior – a system that can both flow and jump. Often, the term "hybrid dynamical system" is used to distinguish such systems from hybrid systems in other senses, such as those that combine neural nets and fuzzy logic, or electrical and mechanical drivelines. A hybrid system has the benefit of encompassing a larger class of systems within its structure, allowing for more flexibility in modeling dynamic phenomena.
A mathematical or physical process is time-reversible if the dynamics of the process remain well-defined when the sequence of time-states is reversed.
In queueing models, a discipline within the mathematical theory of probability, the quasi-birth–death process describes a generalisation of the birth–death process. As with the birth-death process it moves up and down between levels one at a time, but the time between these transitions can have a more complicated distribution encoded in the blocks.
Jump diffusion is a stochastic process that involves jumps and diffusion. It has important applications in magnetic reconnection, coronal mass ejections, condensed matter physics, option pricing, and pattern theory and computational vision.
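As a rough illustration, the following Python sketch simulates one jump-diffusion path with an Euler step for the drift and diffusion parts and a Bernoulli approximation of the Poisson jump count over each small step; all parameter values are assumptions made for the example.

```python
import random
import math

# A minimal Euler step sketch of a jump-diffusion path: drift + Brownian
# diffusion plus Poisson-driven jumps. Parameter values are illustrative only.

def simulate_jump_diffusion(x0=1.0, mu=0.05, sigma=0.2, jump_rate=0.5,
                            jump_mean=0.0, jump_std=0.1, t_end=1.0, n=1000, seed=0):
    rng = random.Random(seed)
    dt = t_end / n
    x = x0
    path = [x]
    for _ in range(n):
        diffusion = mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # Bernoulli approximation of the Poisson(jump_rate * dt) jump count
        # over a small step (almost always 0 or 1 jumps).
        jump = rng.gauss(jump_mean, jump_std) if rng.random() < jump_rate * dt else 0.0
        x = x + diffusion + jump
        path.append(x)
    return path

if __name__ == "__main__":
    p = simulate_jump_diffusion()
    print(f"terminal value ≈ {p[-1]:.4f}")
```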
In queueing theory, a discipline within the mathematical theory of probability, Burke's theorem is a theorem asserting that, for the M/M/1 queue, M/M/c queue or M/M/∞ queue in the steady state with arrivals forming a Poisson process with rate parameter λ, the departure process is also a Poisson process with rate parameter λ.
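This assertion can be checked numerically. The Python sketch below simulates an M/M/1 queue via the single-server FIFO recursion, with assumed rates λ = 0.5 and μ = 1.0, and estimates the mean inter-departure time, which should be close to 1/λ.

```python
import random

# A small M/M/1 simulation sketch to illustrate Burke's theorem: in steady state
# the departure process is again Poisson with the arrival rate λ. The rates
# below (λ = 0.5, μ = 1.0) are illustrative assumptions.

def mm1_departures(lam=0.5, mu=1.0, n_arrivals=200000, seed=0):
    rng = random.Random(seed)
    t = 0.0
    departures = []
    prev_departure = 0.0
    for _ in range(n_arrivals):
        t += rng.expovariate(lam)                     # next arrival time
        start = max(t, prev_departure)                # service starts when server is free
        prev_departure = start + rng.expovariate(mu)  # departure time of this job
        departures.append(prev_departure)
    return departures

if __name__ == "__main__":
    d = mm1_departures()
    d = d[len(d) // 10:]                              # discard a warm-up portion
    gaps = [b - a for a, b in zip(d, d[1:])]
    print(f"mean inter-departure time ≈ {sum(gaps)/len(gaps):.3f} (1/λ = 2.0)")
```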
In probability theory, a product-form solution is a particularly efficient form of solution for determining some metric of a system with distinct sub-components, where the metric for the collection of components can be written as a product of the metric across the different components. Using capital Pi notation, a product-form solution has the algebraic form P(x₁, x₂, …, xₙ) = B ∏ᵢ P(xᵢ), where B is a normalizing constant.
In queueing theory, a discipline within the mathematical theory of probability, a fork–join queue is a queue where incoming jobs are split on arrival for service by numerous servers and joined before departure. The model is often used for parallel computations or systems where products need to be obtained simultaneously from different suppliers. The key quantity of interest in this model is usually the time taken to service a complete job. The model has been described as a "key model for the performance analysis of parallel and distributed systems." Few analytical results exist for fork–join queues, but various approximations are known.
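As a rough illustration of the model, the Python sketch below estimates the mean job completion time for a fork–join queue under assumed Poisson arrivals and exponential service times at each of the k servers.

```python
import random

# A rough fork-join queue sketch: each arriving job is split into k tasks, one
# per FIFO server queue, and the job completes when its slowest task finishes.
# Poisson arrivals and exponential service times are illustrative assumptions.

def fork_join_mean_time(lam=0.5, mu=1.0, k=4, n_jobs=100000, seed=0):
    rng = random.Random(seed)
    t = 0.0
    server_free = [0.0] * k        # time at which each server becomes idle
    total = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(lam)  # job arrival time
        finish_times = []
        for i in range(k):
            start = max(t, server_free[i])
            server_free[i] = start + rng.expovariate(mu)
            finish_times.append(server_free[i])
        total += max(finish_times) - t   # job done only when all k tasks are done
    return total / n_jobs

if __name__ == "__main__":
    print(f"estimated mean job completion time ≈ {fork_join_mean_time():.3f}")
```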
A two-state trajectory is a dynamical signal that fluctuates between two distinct values: ON and OFF, open and closed, etc. Mathematically, the signal has, at every time t, one of two values, for example x(t) = 0 or x(t) = 1.
In queueing theory, a discipline within the mathematical theory of probability, a fluid queue is a mathematical model used to describe the fluid level in a reservoir subject to randomly determined periods of filling and emptying. The term dam theory was used in earlier literature for these models. The model has been used to approximate discrete models, model the spread of wildfires, in ruin theory and to model high speed data networks. The model applies the leaky bucket algorithm to a stochastic source.
Mark Herbert Ainsworth Davis was Professor of Mathematics at Imperial College London. He made fundamental contributions to the theory of stochastic processes, stochastic control and mathematical finance.
In queueing theory, a discipline within the mathematical theory of probability, an M/G/k queue is a queue model where arrivals are Markovian, service times have a General distribution and there are k servers. The model name is written in Kendall's notation, and is an extension of the M/M/c queue, where service times must be exponentially distributed and of the M/G/1 queue with a single server. Most performance metrics for this queueing system are not known and remain an open problem.
In probability theory, reflected Brownian motion is a Wiener process in a space with reflecting boundaries. In the physical literature, this process describes diffusion in a confined space and it is often called confined Brownian motion. For example it can describe the motion of hard spheres in water confined between two walls.
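As a rough illustration, the Python sketch below simulates Brownian motion confined between two walls, using an assumed interval [0, 1] and folding the path back into the interval whenever a step crosses a boundary.

```python
import random
import math

# A minimal sketch of Brownian motion reflected between two walls at 0 and L,
# simulated by an Euler step followed by "folding" the path back into [0, L].
# The wall positions and step size are illustrative assumptions.

def simulate_reflected_bm(x0=0.5, L=1.0, t_end=1.0, n=1000, seed=0):
    rng = random.Random(seed)
    dt = t_end / n
    x = x0
    path = [x]
    for _ in range(n):
        x += math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # Reflect at the boundaries until the point lies inside [0, L].
        while x < 0.0 or x > L:
            if x < 0.0:
                x = -x
            else:
                x = 2.0 * L - x
        path.append(x)
    return path

if __name__ == "__main__":
    p = simulate_reflected_bm()
    print(f"final position ≈ {p[-1]:.3f} (stays within [0, 1])")
```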
In queueing theory, a discipline within the mathematical theory of probability, a fluid limit, fluid approximation or fluid analysis of a stochastic model is a deterministic real-valued process which approximates the evolution of a given stochastic process, usually subject to some scaling or limiting criteria.
Mean-field particle methods are a broad class of interacting-type Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can always be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states. A natural way to simulate these sophisticated nonlinear Markov processes is to sample a large number of copies of the process, replacing in the evolution equation the unknown distributions of the random states by the sampled empirical measures. In contrast with traditional Monte Carlo and Markov chain Monte Carlo methods, these mean-field particle techniques rely on sequential interacting samples. The terminology mean-field reflects the fact that each of the samples interacts with the empirical measures of the process. When the size of the system tends to infinity, these random empirical measures converge to the deterministic distribution of the random states of the nonlinear Markov chain, so that the statistical interaction between particles vanishes. In other words, starting with a chaotic configuration based on independent copies of the initial state of the nonlinear Markov chain model, the chaos propagates at any time horizon as the size of the system tends to infinity; that is, finite blocks of particles reduce to independent copies of the nonlinear Markov process. This result is called the propagation of chaos property. The terminology "propagation of chaos" originated with the work of Mark Kac in 1976 on a colliding mean-field kinetic gas model.
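As a rough illustration of the particle approximation described above, the Python sketch below simulates an assumed McKean–Vlasov-type diffusion, dXₜ = −(Xₜ − E[Xₜ]) dt + σ dWₜ, in which the unknown law E[Xₜ] is replaced by the empirical mean of N interacting particles; all parameter values are assumptions made for the example.

```python
import random
import math

# A minimal mean-field particle sketch: the unknown law of the nonlinear
# diffusion is replaced, at every step, by the empirical mean of N particles.
# All parameters below are illustrative assumptions.

def mean_field_particles(n_particles=1000, sigma=0.5, t_end=5.0, n_steps=500, seed=0):
    rng = random.Random(seed)
    dt = t_end / n_steps
    # "Chaotic" initial configuration: independent copies of the initial state.
    xs = [rng.gauss(2.0, 1.0) for _ in range(n_particles)]
    for _ in range(n_steps):
        emp_mean = sum(xs) / n_particles   # empirical measure stands in for the law
        xs = [x - (x - emp_mean) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
              for x in xs]
    return xs

if __name__ == "__main__":
    xs = mean_field_particles()
    print(f"empirical mean of the particle system ≈ {sum(xs)/len(xs):.3f}")
```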
Richard H. Stockbridge is a Distinguished Professor of Mathematics at the University of Wisconsin-Milwaukee. His contributions to research primarily involve stochastic control theory, optimal stopping and mathematical finance. Most notably, alongside Professors Thomas G. Kurtz, Kurt Helmes, and Chao Zhu, he developed the methodology of using linear programming to solve stochastic control problems.
Damir Filipović is a Swiss mathematician specializing in quantitative finance. He holds the Swissquote Chair in Quantitative Finance and is the director of the Swiss Finance Institute at EPFL.
Amber Lynn Puha is an American mathematician and educator at California State University San Marcos. Her research concerns probability theory and stochastic processes.
Eugene A. Feinberg is an American mathematician and distinguished professor of applied mathematics and statistics at Stony Brook University. He is noted for his work in probability theory, real analysis, and Markov decision processes.