
The word stochastic refers to a randomly determined process. [1] It first appeared in English to describe a mathematical object called a stochastic process, but in mathematics the terms stochastic process and random process are now considered interchangeable. [2] [3] [4] [5] [6] The word, with its current definition meaning random, came into English via German, but it ultimately derives from Greek στόχος (stókhos), meaning 'aim, guess'. [1]


In probability theory and related fields, a stochastic or random process is a mathematical object usually defined as a family of random variables. Historically, the random variables were associated with or indexed by a set of numbers, usually viewed as points in time, giving the interpretation of a stochastic process representing numerical values of some system randomly changing over time, such as the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. They have applications in many disciplines including sciences such as biology, chemistry, ecology, neuroscience, and physics as well as technology and engineering fields such as image processing, signal processing, information theory, computer science, cryptography and telecommunications. Furthermore, seemingly random changes in financial markets have motivated the extensive use of stochastic processes in finance.



The term stochastic is used in many different fields, particularly where stochastic or random processes are used to represent systems or phenomena that seem to change in a random way. The term is used in the physical sciences such as biology, [7] chemistry, [8] ecology, [9] neuroscience, [10] and physics [11] as well as technology and engineering fields such as image processing, signal processing, [12] information theory, [13] computer science [14] (including the field of artificial intelligence), cryptography, [15] and telecommunications. [16] It is also used in finance, due to seemingly random changes in financial markets, [17] [18] [19] as well as in medicine, linguistics, music, media, colour theory, botany, manufacturing, and geomorphology.



The word stochastic in English was originally used as an adjective with the definition "pertaining to conjecturing", stemming from a Greek word meaning "to aim at a mark, guess"; the Oxford English Dictionary gives 1662 as the year of its earliest occurrence. [1] In his work on probability, Ars Conjectandi, originally published in Latin in 1713, Jakob Bernoulli used the phrase "Ars Conjectandi sive Stochastice", which has been translated as "the art of conjecturing or stochastics". [20] This phrase was used, with reference to Bernoulli, by Ladislaus Bortkiewicz, [21] who in 1917 wrote the word Stochastik in German with a sense meaning random. The term stochastic process first appeared in English in a 1934 paper by Joseph Doob. [1] For the term and a specific mathematical definition, Doob cited another 1934 paper, in which the term stochastischer Prozeß was used in German by Aleksandr Khinchin, [22] [23] though the German term had been used earlier, in 1931, by Andrey Kolmogorov. [24]


Artificial intelligence

In artificial intelligence, stochastic programs work by using probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, genetic algorithms, and genetic programming. A problem itself may be stochastic as well, as in planning under uncertainty.
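As a sketch of the idea (the objective function, cooling schedule, and parameters below are arbitrary illustrations, not a canonical implementation), simulated annealing accepts occasional worse solutions with a temperature-dependent probability so the search can escape local minima:

```python
import math
import random

def simulated_annealing(f, x0, step=1.0, t0=10.0, cooling=0.995, iters=5000):
    """Minimize f by random local moves, sometimes accepting worse ones."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        fc = f(candidate)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta/T), which shrinks as T cools.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = candidate, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # gradually reduce the randomness
    return best, fbest

random.seed(0)
x, fx = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=-10.0)
```

With enough iterations and a slow enough cooling schedule, the returned point lands near the global minimum at 3.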


Stochastic neural networks are a type of artificial neural network built by introducing random variations into the network, either by giving the network's neurons stochastic transfer functions or by giving them stochastic weights. This makes them useful tools for optimization problems, since the random fluctuations help the network escape from local minima.


In the early 1930s, Aleksandr Khinchin gave the first mathematical definition of a stochastic process as a family of random variables indexed by the real line. [25] [22] [a] Further fundamental work on probability theory and stochastic processes was done by Khinchin as well as other mathematicians such as Andrey Kolmogorov, Joseph Doob, William Feller, Maurice Fréchet, Paul Lévy, Wolfgang Doeblin, and Harald Cramér. [27] [28] Decades later Cramér referred to the 1930s as the "heroic period of mathematical probability theory". [28]


In mathematics, specifically probability theory, the theory of stochastic processes is considered to be an important contribution to mathematics [29] and it continues to be an active topic of research for both theoretical reasons and applications. [30] [31] [32]

The word stochastic is used to describe other terms and objects in mathematics. Examples include a stochastic matrix, which describes a stochastic process known as a Markov process, and stochastic calculus, which involves differential equations and integrals based on stochastic processes such as the Wiener process, also called the Brownian motion process.
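As an illustrative sketch (the states and probabilities below are hypothetical), a stochastic matrix can be written as a table of transition probabilities whose rows each sum to 1, and the corresponding Markov process can be simulated by repeatedly sampling the next state from the current state's row:

```python
import random

# A right stochastic matrix: each row sums to 1 and gives the
# transition probabilities out of one state.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, rng):
    """Run the Markov chain: the next state depends only on the current one."""
    state, path = start, [start]
    for _ in range(steps):
        states = list(P[state])
        weights = [P[state][s] for s in states]
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

rng = random.Random(42)
path = simulate("sunny", 10, rng)
```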

Natural science

One of the simplest continuous-time stochastic processes is Brownian motion. This was first observed by botanist Robert Brown while looking through a microscope at pollen grains in water.
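Brownian motion can be approximated numerically as a cumulative sum of independent Gaussian increments with variance proportional to the time step; a minimal sketch (step count and time step are arbitrary choices):

```python
import random

def brownian_path(n_steps, dt=0.01, rng=None):
    """Approximate a Brownian path: sum independent N(0, dt) increments."""
    rng = rng or random.Random()
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)  # increment has variance dt
        path.append(w)
    return path

path = brownian_path(1000, rng=random.Random(1))
```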


The name "Monte Carlo" for the stochastic Monte Carlo method was popularized by physics researchers Stanisław Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis, among others. The name is a reference to the Monte Carlo Casino in Monaco, where Ulam's uncle would borrow money to gamble. [33] The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino. Earlier methods of simulation and statistical sampling generally did the opposite: using simulation to test a previously understood deterministic problem. Though examples of an "inverted" approach do exist historically, they were not considered a general method until the popularity of the Monte Carlo method spread.

Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. Therefore, it was only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and became popularized in the fields of physics, physical chemistry, and operations research. The RAND Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.

Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling.
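A standard toy example of the method is estimating π by pseudorandom sampling (the sample size below is an arbitrary choice): points are drawn uniformly in the unit square, and the fraction landing inside the quarter circle approximates π/4.

```python
import random

def estimate_pi(n_samples, rng):
    """Monte Carlo estimate of pi from uniform samples in the unit square."""
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

# A seeded pseudorandom generator stands in for the older tables
# of random numbers mentioned above.
rng = random.Random(2024)
pi_hat = estimate_pi(100_000, rng)
```

The standard error of the estimate shrinks as the square root of the sample size, which is why Monte Carlo methods consume random numbers in such large quantities.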


Stochastic resonance: In biological systems, introducing stochastic "noise" has been found to help improve the signal strength of the internal feedback loops for balance and other vestibular communication. [34] It has been found to help diabetic and stroke patients with balance control. [35] Many biochemical events also lend themselves to stochastic analysis. Gene expression, for example, has a stochastic component through the molecular collisions—as during binding and unbinding of RNA polymerase to a gene promoter—via the solution's Brownian motion.


Simonton (2003, Psychological Bulletin) argues that scientific creativity is a constrained stochastic behaviour, such that new theories in all sciences are, at least in part, the product of a stochastic process.

Computer science

Stochastic ray tracing is the application of Monte Carlo simulation to the computer graphics ray-tracing algorithm. "Distributed ray tracing samples the integrand at many randomly chosen points and averages the results to obtain a better approximation. It is essentially an application of the Monte Carlo method to 3D computer graphics, and for this reason is also called stochastic ray tracing."[citation needed]

Stochastic forensics analyzes computer crime by viewing computers as stochastic processes.


The financial markets use stochastic models to represent the seemingly random behaviour of assets such as stocks, commodities, relative currency prices (i.e., the price of one currency compared to that of another, such as the price of the US dollar compared to that of the euro), and interest rates. These models are then used by quantitative analysts to value options on stock prices, bond prices, and interest rates (see Markov models). Moreover, stochastic modelling is at the heart of the insurance industry.
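A common such model is geometric Brownian motion, which underlies the Black–Scholes framework; a simplified simulation (the drift, volatility, and horizon below are illustrative values, not calibrated to any market):

```python
import math
import random

def gbm_path(s0, mu, sigma, dt, n_steps, rng):
    """Simulate geometric Brownian motion: dS = mu*S dt + sigma*S dW.

    Uses the exact log-normal update so prices stay strictly positive."""
    s, path = s0, [s0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * dw)
        path.append(s)
    return path

rng = random.Random(7)
# One year of hypothetical daily prices (252 trading days).
prices = gbm_path(s0=100.0, mu=0.05, sigma=0.2, dt=1 / 252, n_steps=252, rng=rng)
```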


The formation of river meanders has been analyzed as a stochastic process.

Language and linguistics

Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure, for example, in functionalist linguistic theory, which argues that competence is based on performance. [36] [37] This distinction in functional theories of grammar should be carefully distinguished from the langue and parole distinction. To the extent that linguistic knowledge is constituted by experience with language, grammar is argued to be probabilistic and variable rather than fixed and absolute. This conception of grammar as probabilistic and variable follows from the idea that one's competence changes in accordance with one's experience with language. Though this conception has been contested, [38] it has also provided the foundation for modern statistical natural language processing [39] and for theories of language learning and change. [40]


Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for either continuous or batch manufacturing processes. Testing and monitoring of the process is recorded using a process control chart which plots a given process control parameter over time. Typically a dozen or many more parameters will be tracked simultaneously. Statistical models are used to define limit lines which define when corrective actions must be taken to bring the process back to its intended operational window.
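As a sketch of the idea (the readings and the three-sigma convention below are illustrative), control limits can be computed from the mean and standard deviation of a monitored parameter, and readings outside the limits trigger corrective action:

```python
def control_limits(samples, sigmas=3.0):
    """Center line and upper/lower control limits (mean +/- k*stddev)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    std = var ** 0.5
    return mean - sigmas * std, mean, mean + sigmas * std

# Hypothetical readings of one tracked process parameter.
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]
lcl, center, ucl = control_limits(readings)
out_of_control = [x for x in readings if not lcl <= x <= ucl]
```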

This same approach is used in the service industry where parameters are replaced by processes related to service level agreements.


Marketing and the changing movement of audience tastes and preferences, as well as the solicitation of and the scientific appeal of certain film and television debuts (i.e., their opening weekends, word-of-mouth, top-of-mind knowledge among surveyed groups, star name recognition, and other elements of social media outreach and advertising), are determined in part by stochastic modeling. A recent attempt at repeat-business analysis was done by Japanese scholars[citation needed] and is part of the Cinematic Contagion Systems patented by Geneva Media Holdings, and such modeling has been used in data collection from the time of the original Nielsen ratings to modern studio and television test audiences.


Stochastic effect, or "chance effect", is one classification of radiation effects that refers to the random, statistical nature of the damage. In contrast to the deterministic effect, the severity of a stochastic effect is independent of dose; only the probability of an effect increases with dose.


In music, mathematical processes based on probability can generate stochastic elements.

Stochastic processes may be used in music to compose a fixed piece or may be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who coined the term. Specific examples of mathematics, statistics, and physics applied to music composition are the use of the statistical mechanics of gases in Pithoprakta, statistical distribution of points on a plane in Diamorphoses, minimal constraints in Achorripsis, the normal distribution in ST/10 and Atrées, Markov chains in Analogiques, game theory in Duel and Stratégie, group theory in Nomos Alpha (for Siegfried Palm), set theory in Herma and Eonta, [41] and Brownian motion in N'Shima.[citation needed] Xenakis frequently used computers to produce his scores, such as the ST series including Morsima-Amorsima and Atrées, and founded CEMAMu. Earlier, John Cage and others had composed aleatoric or indeterminate music, which is created by chance processes but does not have the strict mathematical basis (Cage's Music of Changes, for example, uses a system of charts based on the I Ching). Lejaren Hiller and Leonard Isaacson used generative grammars and Markov chains in their 1957 Illiac Suite. Modern electronic music production techniques make these processes relatively simple to implement, and many hardware devices such as synthesizers and drum machines incorporate randomization features. Generative music techniques are therefore readily accessible to composers, performers, and producers.
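The Markov-chain idea can be illustrated with a toy note generator (the pitch set and transition probabilities below are invented for illustration and are not drawn from any Xenakis score): each note's successor is sampled from a distribution conditioned only on the current note.

```python
import random

# Hypothetical first-order Markov model over a few pitch names.
transitions = {
    "C": [("D", 0.5), ("E", 0.3), ("G", 0.2)],
    "D": [("C", 0.4), ("E", 0.6)],
    "E": [("C", 0.3), ("D", 0.3), ("G", 0.4)],
    "G": [("C", 0.7), ("E", 0.3)],
}

def generate(start, length, rng):
    """Sample a note sequence from the chain, one transition at a time."""
    notes = [start]
    while len(notes) < length:
        choices, weights = zip(*transitions[notes[-1]])
        notes.append(rng.choices(choices, weights=weights)[0])
    return notes

melody = generate("C", 16, rng=random.Random(5))
```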

Social sciences

Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only for the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' on which to situate human behavior alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.[citation needed]

The term "stochastic terrorism" has come into frequent use [42] with regard to lone-wolf terrorism. The terms "scripted violence" and "stochastic terrorism" are linked in a cause-and-effect relationship: "scripted violence" rhetoric can result in an act of "stochastic terrorism". The phrase "scripted violence" has been used in social science since at least 2002. [43]

Author David Neiwert, who wrote the book Alt-America, told Salon interviewer Chauncey Devega:

Scripted violence is where a person who has a national platform describes the kind of violence that they want to be carried out. He identifies the targets and leaves it up to the listeners to carry out this violence. It is a form of terrorism. It is an act and a social phenomenon where there is an agreement to inflict massive violence on a whole segment of society. Again, this violence is led by people in high-profile positions in the media and the government. They’re the ones who do the scripting, and it is ordinary people who carry it out.

Think of it like Charles Manson and his followers. Manson wrote the script; he didn’t commit any of those murders. He just had his followers carry them out. [44]

Subtractive color reproduction

When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, where ink is either present or not present, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional line screens, which are amplitude modulated, had problems with moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.

Notes

  a. Doob, when citing Khinchin, uses the term 'chance variable', which used to be an alternative term for 'random variable'. [26]



  1. "Stochastic". Oxford Dictionaries. Oxford University Press.
  2. Robert J. Adler; Jonathan E. Taylor (29 January 2009). Random Fields and Geometry. Springer Science & Business Media. pp. 7–8. ISBN   978-0-387-48116-6.
  3. David Stirzaker (2005). Stochastic Processes and Models. Oxford University Press. p. 45. ISBN   978-0-19-856814-8.
  4. Loïc Chaumont; Marc Yor (19 July 2012). Exercises in Probability: A Guided Tour from Measure Theory to Random Processes, Via Conditioning. Cambridge University Press. p. 175. ISBN   978-1-107-60655-5.
  5. Murray Rosenblatt (1962). Random Processes. Oxford University Press. p. 91.
  6. Olav Kallenberg (8 January 2002). Foundations of Modern Probability. Springer Science & Business Media. pp. 24 and 25. ISBN   978-0-387-95313-7.
  7. Paul C. Bressloff (22 August 2014). Stochastic Processes in Cell Biology. Springer. ISBN   978-3-319-08488-6.
  8. N.G. Van Kampen (30 August 2011). Stochastic Processes in Physics and Chemistry. Elsevier. ISBN   978-0-08-047536-3.
  9. Russell Lande; Steinar Engen; Bernt-Erik Sæther (2003). Stochastic Population Dynamics in Ecology and Conservation. Oxford University Press. ISBN   978-0-19-852525-7.
  10. Carlo Laing; Gabriel J Lord (2010). Stochastic Methods in Neuroscience. OUP Oxford. ISBN   978-0-19-923507-0.
  11. Wolfgang Paul; Jörg Baschnagel (11 July 2013). Stochastic Processes: From Physics to Finance. Springer Science & Business Media. ISBN   978-3-319-00327-6.
  12. Edward R. Dougherty (1999). Random processes for image and signal processing. SPIE Optical Engineering Press. ISBN   978-0-8194-2513-3.
  13. Thomas M. Cover; Joy A. Thomas (28 November 2012). Elements of Information Theory. John Wiley & Sons. p. 71. ISBN   978-1-118-58577-1.
  14. Michael Baron (15 September 2015). Probability and Statistics for Computer Scientists, Second Edition. CRC Press. p. 131. ISBN   978-1-4987-6060-7.
  15. Jonathan Katz; Yehuda Lindell (2007-08-31). Introduction to Modern Cryptography: Principles and Protocols. CRC Press. p. 26. ISBN   978-1-58488-586-3.
  16. François Baccelli; Bartlomiej Blaszczyszyn (2009). Stochastic Geometry and Wireless Networks. Now Publishers Inc. pp. 200–. ISBN   978-1-60198-264-3.
  17. J. Michael Steele (2001). Stochastic Calculus and Financial Applications. Springer Science & Business Media. ISBN   978-0-387-95016-7.
  18. Marek Musiela; Marek Rutkowski (21 January 2006). Martingale Methods in Financial Modelling. Springer Science & Business Media. ISBN   978-3-540-26653-2.
  19. Steven E. Shreve (3 June 2004). Stochastic Calculus for Finance II: Continuous-Time Models. Springer Science & Business Media. ISBN   978-0-387-40101-0.
  20. O. B. Sheĭnin (2006). Theory of probability and statistics as exemplified in short dictums. NG Verlag. p. 5. ISBN   978-3-938417-40-9.
  21. Oscar Sheynin; Heinrich Strecker (2011). Alexandr A. Chuprov: Life, Work, Correspondence. V&R unipress GmbH. p. 136. ISBN   978-3-89971-812-6.
  22. Doob, Joseph (1934). "Stochastic Processes and Statistics". Proceedings of the National Academy of Sciences of the United States of America. 20 (6): 376–379. doi:10.1073/pnas.20.6.376. PMC   1076423 .
  23. Khintchine, A. (1934). "Korrelationstheorie der stationären stochastischen Prozesse". Mathematische Annalen. 109 (1): 604–615. doi:10.1007/BF01449156. ISSN   0025-5831.
  24. Kolmogoroff, A. (1931). "Über die analytischen Methoden in der Wahrscheinlichkeitsrechnung". Mathematische Annalen. 104 (1): 1. doi:10.1007/BF01457949. ISSN   0025-5831.
  25. Vere-Jones, David (2006). "Khinchin, Aleksandr Yakovlevich": 4. doi:10.1002/0471667196.ess6027.pub2.
  26. Snell, J. Laurie (2005). "Obituary: Joseph Leonard Doob". Journal of Applied Probability. 42 (1): 251. doi:10.1239/jap/1110381384. ISSN   0021-9002.
  27. Bingham, N. (2000). "Studies in the history of probability and statistics XLVI. Measure into probability: from Lebesgue to Kolmogorov". Biometrika. 87 (1): 145–156. doi:10.1093/biomet/87.1.145. ISSN   0006-3444.
  28. Cramér, Harald (1976). "Half a Century with Probability Theory: Some Personal Recollections". The Annals of Probability. 4 (4): 509–546. doi:10.1214/aop/1176996025. ISSN   0091-1798.
  29. Applebaum, David (2004). "Lévy processes: From probability to finance and quantum groups". Notices of the AMS. 51 (11): 1336–1347.
  30. Jochen Blath; Peter Imkeller; Sylvie Rœlly (2011). Surveys in Stochastic Processes. European Mathematical Society. pp. 5–. ISBN   978-3-03719-072-2.
  31. Michel Talagrand (12 February 2014). Upper and Lower Bounds for Stochastic Processes: Modern Methods and Classical Problems. Springer Science & Business Media. pp. 4–. ISBN   978-3-642-54075-2.
  32. Paul C. Bressloff (22 August 2014). Stochastic Processes in Cell Biology. Springer. pp. vii–ix. ISBN   978-3-319-08488-6.
  33. Douglas Hubbard "How to Measure Anything: Finding the Value of Intangibles in Business" p. 46, John Wiley & Sons, 2007
  34. Hänggi, P. (2002). "Stochastic Resonance in Biology: How Noise Can Enhance Detection of Weak Signals and Help Improve Biological Information Processing". ChemPhysChem. 3 (3): 285–90. doi:10.1002/1439-7641(20020315)3:3<285::AID-CPHC285>3.0.CO;2-A. PMID   12503175.
  35. Priplata, A.; et al. (2006). "Noise-Enhanced Balance Control in Patients with Diabetes and Patients with Stroke" (PDF). Ann Neurol. 59: 4–12. doi:10.1002/ana.20670. PMID   16287079.
  36. Newmeyer, Frederick. 2001. "The Prague School and North American functionalist approaches to syntax" Journal of Linguistics 37, pp. 101–126. "Since most American functionalists adhere to this trend, I will refer to it and its practitioners with the initials 'USF'. Some of the more prominent USFs are Joan Bybee, William Croft, Talmy Givon, John Haiman, Paul Hopper, Marianne Mithun and Sandra Thompson. In its most extreme form (Hopper 1987, 1988), USF rejects the Saussurean dichotomies such as langue vs. parole and synchrony vs. diachrony. All adherents of this tendency feel that the Chomskyan advocacy of a sharp distinction between competence and performance is at best unproductive and obscurantist; at worst theoretically unmotivated."
  37. Bybee, Joan. "Usage-based phonology." p. 213 in Darnel, Mike (ed). 1999. Functionalism and Formalism in Linguistics: General papers. John Benjamins Publishing Company
  38. Chomsky (1959). Review of Skinner's Verbal Behavior, Language, 35: 26–58
  39. Manning and Schütze, (1999) Foundations of Statistical Natural Language Processing, MIT Press. Cambridge, MA
  40. Bybee (2007) Frequency of use and the organization of language. Oxford: Oxford University Press
  41. Ilias Chrissochoidis, Stavros Houliaras, and Christos Mitsakis, "Set theory in Xenakis' EONTA", in International Symposium Iannis Xenakis, ed. Anastasia Georgaki and Makis Solomos (Athens: The National and Kapodistrian University, 2005), 241–249.
  43. Hamamoto, Darrell Y. (2002). "Empire of Death: Militarized Society and the Rise of Serial Killing and Mass Murder". New Political Science. 24 (1): 105–120. doi:10.1080/07393140220122662.
  44. DeVega, Chauncey (1 November 2018). "Author David Neiwert on the outbreak of political violence". Salon. Retrieved 13 December 2018.

Further reading