Probabilistic programming

Probabilistic programming (PP) is a programming paradigm in which probabilistic models are specified and inference for these models is performed automatically. [1] It represents an attempt to unify probabilistic modeling and traditional general-purpose programming in order to make the former easier and more widely applicable. [2] [3] It can be used to create systems that help make decisions in the face of uncertainty.

Programming languages used for probabilistic programming are referred to as "probabilistic programming languages" (PPLs).
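
For example, the following minimal sketch in the Python library PyMC (one of the PPLs listed below; PyMC version 4 or later is assumed, and the coin-flip data are invented for illustration) specifies a model declaratively and delegates inference to a general-purpose sampler:

    import pymc as pm

    # Illustrative data: 7 heads observed in 10 coin tosses.
    heads, tosses = 7, 10

    with pm.Model():
        # Model specification: a uniform prior over the unknown bias...
        p = pm.Beta("p", alpha=1, beta=1)
        # ...and a likelihood tying the bias to the observed count.
        pm.Binomial("obs", n=tosses, p=p, observed=heads)
        # Inference is automatic: a generic MCMC sampler is chosen and
        # run by the library; the user writes no inference code.
        idata = pm.sample(1000, progressbar=False)

    print(float(idata.posterior["p"].mean()))  # approx. (7+1)/(10+2) = 0.67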

Applications

Probabilistic reasoning has been used for a wide variety of tasks, such as predicting stock prices, recommending movies, diagnosing computers, detecting cyber intrusions, and performing image detection. [4] However, until recently (partly because of limited computing power), probabilistic programming was limited in scope, and most inference algorithms had to be written manually for each task.

Nevertheless, in 2015, a 50-line probabilistic computer vision program was used to generate 3D models of human faces based on 2D images of those faces. The program used inverse graphics as the basis of its inference method, and was built using the Picture package in Julia. [4] This made possible "in 50 lines of code what used to take thousands". [5] [6]

The Gen probabilistic programming library (also written in Julia) has been applied to vision and robotics tasks. [7]

More recently, the probabilistic programming system Turing.jl has been applied in various pharmaceutical [8] and economics [9] applications.

Probabilistic programming in Julia has also been combined with differentiable programming, by using the Julia package Zygote.jl together with Turing.jl. [10]
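
A comparable combination is available in Python. The sketch below uses NumPyro (listed below), whose NUTS sampler obtains gradients of the model's log density through JAX's automatic differentiation; the toy regression model and synthetic data are illustrative assumptions, not taken from the cited work:

    import jax
    import jax.numpy as jnp
    import numpyro
    import numpyro.distributions as dist
    from numpyro.infer import MCMC, NUTS

    # Illustrative Bayesian linear regression; the gradients that drive
    # the NUTS sampler come from JAX's automatic differentiation.
    def model(x, y):
        slope = numpyro.sample("slope", dist.Normal(0.0, 10.0))
        intercept = numpyro.sample("intercept", dist.Normal(0.0, 10.0))
        sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))
        numpyro.sample("y", dist.Normal(slope * x + intercept, sigma), obs=y)

    # Synthetic data: y = 2x + 1 plus a little noise.
    x = jnp.linspace(0.0, 1.0, 50)
    y = 2.0 * x + 1.0 + 0.1 * jax.random.normal(jax.random.PRNGKey(0), (50,))

    mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
    mcmc.run(jax.random.PRNGKey(1), x, y)
    mcmc.print_summary()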

Probabilistic programming languages are also commonly used in Bayesian cognitive science to develop and evaluate models of cognition. [11]

Probabilistic programming languages

PPLs often extend an existing language. For instance, Turing.jl [12] is based on Julia, Infer.NET on the .NET Framework, [13] and PRISM on Prolog. [14] However, some PPLs, such as WinBUGS, offer a self-contained language that maps closely to the mathematical representation of statistical models, with no obvious origin in another programming language. [15] [16]

The BUGS language was created to perform Bayesian computation using Gibbs sampling and related algorithms. Although implemented in a relatively unknown programming language (Component Pascal), the language permits Bayesian inference for a wide variety of statistical models using a flexible computational approach. The same BUGS model can be used to specify Bayesian models for inference via different computational choices ("samplers") and conventions or defaults: the standalone program WinBUGS (with related R packages rbugs and R2WinBUGS) and JAGS (Just Another Gibbs Sampler), another standalone program with related R packages including rjags, R2jags, and runjags.

More recent languages for Bayesian model specification and inference, such as Stan and NIMBLE, allow different or more efficient choices for the underlying Bayesian computation (for example, Hamiltonian Monte Carlo with the No-U-Turn Sampler, NUTS) and are likewise accessible from the R data analysis and programming environment. The influence of the BUGS language is evident in these later languages, which even use the same syntax for some aspects of model specification.
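
To illustrate the kind of computation that BUGS and JAGS automate, the following minimal hand-written Gibbs sampler (in Python with NumPy; the conjugate normal model and all hyperparameter values are illustrative assumptions, not BUGS code) alternately draws each parameter from its full conditional distribution:

    import numpy as np

    # Illustrative model: y_i ~ Normal(mu, 1/tau), with conjugate priors
    # mu ~ Normal(mu0, 1/tau0) and tau ~ Gamma(a, b) (shape/rate).
    rng = np.random.default_rng(0)
    y = rng.normal(loc=2.0, scale=1.5, size=100)   # synthetic data
    n, mu0, tau0, a, b = len(y), 0.0, 1e-2, 1.0, 1.0

    mu, tau, samples = 0.0, 1.0, []
    for _ in range(5000):
        # Full conditional of mu given tau is normal.
        prec = tau0 + n * tau
        mu = rng.normal((tau0 * mu0 + tau * y.sum()) / prec, 1.0 / np.sqrt(prec))
        # Full conditional of tau given mu is gamma (NumPy takes shape/scale).
        tau = rng.gamma(a + n / 2.0, 1.0 / (b + 0.5 * ((y - mu) ** 2).sum()))
        samples.append((mu, tau))

    draws = np.array(samples[1000:])               # discard burn-in
    print("posterior mean of mu:", draws[:, 0].mean())
    print("posterior mean of sd:", (1.0 / np.sqrt(draws[:, 1])).mean())

A BUGS program describes only the model (here, the two priors and the likelihood); the sampling loop above is what such systems derive and run automatically.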

Several PPLs are in active development, including some in beta testing. Two popular tools are Stan and PyMC. [17]

Relational

A probabilistic relational programming language (PRPL) is a PPL specially designed to describe and infer with probabilistic relational models (PRMs).

A PRM is usually developed together with a set of algorithms for reducing, performing inference on, and discovering the distributions of interest; these algorithms are embedded in the corresponding PRPL.

Probabilistic logic programming

Probabilistic logic programming is a programming paradigm that extends logic programming with probabilities.

Most approaches to probabilistic logic programming are based on the distribution semantics, which splits a program into a set of probabilistic facts and a logic program. It defines a probability distribution on interpretations of the Herbrand universe of the program. [18]
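
As a minimal sketch of the distribution semantics, the following Python program (the probabilistic facts, their probabilities, and the graph-reachability query are illustrative assumptions in the style of ProbLog) enumerates every truth assignment to the probabilistic facts, runs the logical part of the program in each resulting "possible world", and sums the probabilities of the worlds in which the query holds:

    from itertools import product

    # Illustrative probabilistic facts, as in "0.6::edge(a,b)."
    facts = {("a", "b"): 0.6, ("b", "c"): 0.5, ("a", "c"): 0.4}

    def reachable(edges, src, dst):
        # Logic-program part: path/2 as the transitive closure of edge/2.
        frontier, seen = {src}, set()
        while frontier:
            node = frontier.pop()
            seen.add(node)
            frontier |= {b for (a, b) in edges if a == node and b not in seen}
        return dst in seen

    # Distribution semantics: sum the probabilities of the possible worlds
    # (truth assignments to the probabilistic facts) where the query holds.
    query_prob = 0.0
    for world in product([True, False], repeat=len(facts)):
        edges = [fact for fact, true in zip(facts, world) if true]
        p = 1.0
        for prob, true in zip(facts.values(), world):
            p *= prob if true else 1.0 - prob
        if reachable(edges, "a", "c"):
            query_prob += p

    print(f"P(path(a, c)) = {query_prob:.2f}")  # 1 - (1-0.4)*(1-0.3) = 0.58

Exhaustive enumeration is exponential in the number of probabilistic facts; practical systems such as ProbLog instead rely on techniques like knowledge compilation to perform inference efficiently.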

List of probabilistic programming languages

This list summarises the variety of PPLs that are currently available, and clarifies their origins.

Name | Extends from | Host language
Analytica [19] | | C++
bayesloop [20] [21] | Python | Python
Bean Machine [22] | PyTorch | Python
Venture [23] | Scheme | C++
BayesDB [24] | | SQLite, Python
PRISM [14] | | B-Prolog
Infer.NET [13] | .NET Framework | .NET Framework
diff-SAT [25] | Answer set programming, SAT (DIMACS CNF) |
PSQL [26] | SQL |
BUGS [15] | | Component Pascal
Dyna [27] | Prolog |
Figaro [28] | Scala | Scala
ProbLog [29] | Prolog | Python
ProBT [30] | | C++, Python
Stan [16] | BUGS | C++
Hakaru [31] | Haskell | Haskell
BAli-Phy (software) [32] | Haskell | C++
ProbCog [33] | | Java, Python
PyMC [34] | Python | Python
Rainier [35] [36] | Scala | Scala
greta [37] | TensorFlow | R
pomegranate [38] | Python | Python
Lea [39] | Python | Python
WebPPL [40] | JavaScript | JavaScript
Picture [4] | Julia | Julia
Turing.jl [12] | Julia | Julia
Gen [41] | Julia | Julia
Edward [42] | TensorFlow | Python
TensorFlow Probability [43] | TensorFlow | Python
Edward2 [44] | TensorFlow Probability | Python
Pyro [45] | PyTorch | Python
NumPyro [46] | JAX | Python
Birch [47] | | C++
PSI [48] | | D
Blang [49] | |
MultiVerse [50] | Python | Python

Difficulty

Reasoning about variables as probability distributions causes difficulties for novice programmers, but these difficulties can be addressed through the use of Bayesian network visualisations and graphs of variable distributions embedded within the source-code editor. [51]

Notes

  1. "Probabilistic programming does in 50 lines of code what used to take thousands". phys.org. April 13, 2015. Retrieved April 13, 2015.
  2. "Probabilistic Programming". probabilistic-programming.org. Archived from the original on January 10, 2016. Retrieved December 24, 2013.
  3. Pfeffer, Avrom (2014). Practical Probabilistic Programming. Manning Publications. p. 28. ISBN 978-1-61729-233-0.
  4. "Short probabilistic programming machine-learning code replaces complex programs for computer-vision tasks". KurzweilAI. April 13, 2015. Retrieved November 27, 2017.
  5. Hardesty, Larry (April 13, 2015). "Graphics in reverse".
  6. "MIT shows off machine-learning script to make CREEPY HEADS". The Register .
  7. "MIT's Gen programming system flattens the learning curve for AI projects". VentureBeat. June 27, 2019. Retrieved June 27, 2019.
  8. Semenova, Elizaveta; Williams, Dominic P.; Afzal, Avid M.; Lazic, Stanley E. (November 1, 2020). "A Bayesian neural network for toxicity prediction". Computational Toxicology. 16: 100133. doi:10.1016/j.comtox.2020.100133. ISSN 2468-1113. S2CID 225362130.
  9. Williams, Dominic P.; Lazic, Stanley E.; Foster, Alison J.; Semenova, Elizaveta; Morgan, Paul (2020). "Predicting Drug-Induced Liver Injury with Bayesian Machine Learning". Chemical Research in Toxicology. 33 (1): 239–248. doi:10.1021/acs.chemrestox.9b00264. PMID 31535850. S2CID 202689667.
  10. Innes, Mike; Edelman, Alan; Fischer, Keno; Rackauckas, Chris; Saba, Elliot; Shah, Viral B.; Tebbutt, Will (2019). "∂P: A Differentiable Programming System to Bridge Machine Learning and Scientific Computing". arXiv:1907.07587 [cs.PL].
  11. Goodman, Noah D; Tenenbaum, Joshua B; Buchsbaum, Daphna; Hartshorne, Joshua; Hawkins, Robert; O'Donnell, Timothy J; Tessler, Michael Henry. "Probabilistic Models of Cognition". Probabilistic Models of Cognition - 2nd Edition. Retrieved May 27, 2023.
  12. "The Turing language for probabilistic programming". GitHub. December 28, 2021.
  13. "Infer.NET". microsoft.com. Microsoft.
  14. "PRISM: PRogramming In Statistical Modeling". rjida.meijo-u.ac.jp. Archived from the original on March 1, 2015. Retrieved July 8, 2015.
  15. "The BUGS Project - MRC Biostatistics Unit". cam.ac.uk. Archived from the original on March 14, 2014. Retrieved January 12, 2011.
  16. "Stan". mc-stan.org. Archived from the original on September 3, 2012.
  17. "The Algorithms Behind Probabilistic Programming". Retrieved March 10, 2017.
  18. De Raedt, Luc; Kimmig, Angelika (July 1, 2015). "Probabilistic (logic) programming concepts". Machine Learning. 100 (1): 5–47. doi:10.1007/s10994-015-5494-z. ISSN   1573-0565.
  19. "Analytica-- A Probabilistic Modeling Language". lumina.com.
  20. "bayesloop - Probabilistic programming framework". bayesloop.com.
  21. "GitHub -- bayesloop". GitHub . December 7, 2021.
  22. "Bean Machine - A universal probabilistic programming language to enable fast and accurate Bayesian analysis". beanmachine.org.
  23. "Venture -- a general-purpose probabilistic programming platform". mit.edu. Archived from the original on January 25, 2016. Retrieved September 20, 2014.
  24. "BayesDB on SQLite. A Bayesian database table for querying the probable implications of data as easily as SQL databases query the data itself". GitHub. December 26, 2021.
  25. "diff-SAT (probabilistic SAT/ASP)". GitHub . October 8, 2021.
  26. Dey, Debabrata; Sarkar, Sumit (1998). "PSQL: A query language for probabilistic relational data". Data & Knowledge Engineering. 28: 107–120. doi:10.1016/S0169-023X(98)00015-9.
  27. "Dyna". www.dyna.org. Archived from the original on January 17, 2016. Retrieved January 12, 2011.
  28. "Charles River Analytics - Probabilistic Modeling Services". cra.com. February 9, 2017.
  29. "ProbLog: Probabilistic Programming". dtai.cs.kuleuven.be.
  30. ProbaYes. "ProbaYes - Together, we add value to your data [Ensemble, nous valorisons vos données]". probayes.com. Archived from the original on March 5, 2016. Retrieved November 26, 2013.
  31. "Hakaru Home Page". hakaru-dev.github.io/.
  32. "BAli-Phy Home Page". bali-phy.org.
  33. "ProbCog". GitHub.
  34. PyMC devs. "PyMC". pymc-devs.github.io.
  35. "stripe/rainier". Stripe. August 19, 2020. Retrieved August 26, 2020.
  36. "Rainier · Bayesian inference for Scala". samplerainier.com. Retrieved August 26, 2020.
  37. "greta: simple and scalable statistical modelling in R". GitHub. Retrieved October 2, 2018.
  38. "Home — pomegranate 0.10.0 documentation". pomegranate.readthedocs.io. Retrieved October 2, 2018.
  39. "Lea Home Page". bitbucket.org.
  40. "WebPPL Home Page". github.com/probmods/webppl.
  41. "Gen: A General Purpose Probabilistic Programming Language with Programmable Inference" . Retrieved June 11, 2024.
  42. "Edward – Home". edwardlib.org. Retrieved January 17, 2017.
  43. TensorFlow (April 11, 2018). "Introducing TensorFlow Probability". TensorFlow. Retrieved October 2, 2018.
  44. "'Edward2' TensorFlow Probability module". GitHub. Retrieved June 11, 2024.
  45. "Pyro". pyro.ai. Retrieved February 9, 2018.
  46. "NumPyro". pyro.ai. Retrieved July 23, 2021.
  47. "Probabilistic Programming in Birch". birch-lang.org. Retrieved April 20, 2018.
  48. "PSI Solver - Exact inference for probabilistic programs". psisolver.org. Retrieved August 18, 2019.
  49. "Home". www.stat.ubc.ca.
  50. Perov, Yura; Graham, Logan; Gourgoulias, Kostis; Richens, Jonathan G.; Lee, Ciarán M.; Baker, Adam; Johri, Saurabh (January 28, 2020). "MultiVerse: Causal Reasoning using Importance Sampling in Probabilistic Programming". arXiv:1910.08091.
  51. Gorinova, Maria I.; Sarkar, Advait; Blackwell, Alan F.; Syme, Don (January 1, 2016). "A Live, Multiple-Representation Probabilistic Programming Environment for Novices". Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. CHI '16. New York, NY, USA: ACM. pp. 2533–2537. doi:10.1145/2858036.2858221. ISBN 9781450333627. S2CID 3201542.
