Radical probabilism

Radical probabilism is a doctrine in philosophy, in particular epistemology, and in probability theory that holds that no facts are known for certain. The view has profound implications for statistical inference. The philosophy is particularly associated with Richard Jeffrey, who wittily characterised it with the dictum "It's probabilities all the way down."

Background

Bayes' theorem states a rule relating a conditional probability to other probabilities. In 1967, Ian Hacking argued that, in its static form, Bayes' theorem only connects probabilities that are held simultaneously; it does not tell the learner how to update those probabilities when new evidence becomes available over time, contrary to what contemporary Bayesians suggested.[1]

According to Hacking, adopting Bayes' theorem is a temptation. Suppose that a learner forms probabilities Pold(A & B) = p and Pold(B) = q. If the learner subsequently learns that B is true, nothing in the axioms of probability or the results derived therefrom tells him how to behave. He might be tempted to adopt Bayes' theorem by analogy and set his Pnew(A) = Pold(A | B) = p/q.
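
A minimal numerical sketch of this tempting step, in Python, with illustrative values for p and q that are not drawn from Hacking's paper:

```python
# Updating by strict conditioning: on learning that B is true for certain,
# set Pnew(A) = Pold(A | B) = Pold(A & B) / Pold(B).
# The numbers p and q are illustrative assumptions.

p = 0.3  # Pold(A & B)
q = 0.5  # Pold(B)

p_new_A = p / q  # the tempted learner's Pnew(A) = Pold(A | B)
print(f"Pnew(A) = {p_new_A}")  # 0.6
```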

In fact, that step, Bayes' rule of updating, can be justified as necessary and sufficient through a dynamic Dutch book argument that is additional to the arguments used to justify the probability axioms. David Lewis first put the argument forward in the 1970s, though he never published it.[2] The dynamic Dutch book argument for Bayesian updating has been criticised by Hacking,[1] Kyburg,[3] Christensen,[4] and Maher.[5][6] It was defended by Brian Skyrms.[2]
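
A small numerical sketch of Lewis's construction may make the argument concrete. All the numbers, and the learner's planned deviant update, are invented for illustration: a learner who announces in advance that on learning B he will set Pnew(A) = y, where y differs from x = Pold(A | B), can be sold a package of bets, each fair by his own lights at the time it is placed, that loses in every outcome.

```python
# A Lewis-style dynamic Dutch book against a learner who plans to adopt
# Pnew(A) = y after learning B, with y < x = Pold(A | B).
# (For y > x the bookie simply reverses the direction of each bet.)
# All values are illustrative assumptions, not taken from the sources cited.

x = 0.6  # Pold(A | B)
q = 0.5  # Pold(B)
y = 0.5  # the learner's planned post-B probability for A

def learner_net(a: bool, b: bool) -> float:
    """The learner's total payoff over the bookie's three bets."""
    # Bet 1: a conditional bet on A given B, bought at price x.
    # It pays 1 if A & B; the price is refunded if not-B.
    bet1 = ((1.0 if a else 0.0) - x) if b else 0.0
    # Bet 2: a bet paying (x - y) if B, bought at its fair price q*(x - y).
    bet2 = ((x - y) if b else 0.0) - q * (x - y)
    # Bet 3: placed only after B is learned: the learner sells a bet on A
    # at his new price y (he receives y and pays 1 if A).
    bet3 = (y - (1.0 if a else 0.0)) if b else 0.0
    return bet1 + bet2 + bet3

for a in (True, False):
    for b in (True, False):
        print(f"A={a!s:<5} B={b!s:<5} net={learner_net(a, b):+.3f}")
# Every outcome yields net = -q*(x - y) = -0.050: a sure loss.
```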

Certain and uncertain knowledge

Updating by Bayes' rule works when the new evidence is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain".[7] There must, on Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart perhaps from a logical law, can ever be known for certain. Jeffrey famously rejected Lewis' dictum.[8] He later quipped, "It's probabilities all the way down," a reference to the "turtles all the way down" metaphor for the infinite regress problem. He called this position radical probabilism.[9]

Conditioning on an uncertainty – probability kinematics

When the evidence is itself uncertain, Bayes' rule cannot capture a mere subjective change in the probability of some critical fact. The new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was.[10]

Pnew(A) = Pold(A | B)Pnew(B) + Pold(A | not-B)Pnew(not-B)

Adopting such a rule is sufficient to avoid a Dutch book but not necessary.[2] Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
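
A short Python sketch of the rule, with an illustrative prior over the four atoms A & B, A & not-B, not-A & B, not-A & not-B, and an assumed shift of the probability of B to 0.8:

```python
# Jeffrey conditioning (probability kinematics) on the partition {B, not-B}.
# The prior and the new probability of B are illustrative assumptions.

p_old = {("A", "B"): 0.30, ("A", "notB"): 0.10,
         ("notA", "B"): 0.20, ("notA", "notB"): 0.40}

p_old_B = p_old[("A", "B")] + p_old[("notA", "B")]     # 0.5
p_A_given_B = p_old[("A", "B")] / p_old_B              # 0.6
p_A_given_notB = p_old[("A", "notB")] / (1 - p_old_B)  # 0.2

p_new_B = 0.8  # experience raises the probability of B, short of certainty

# Pnew(A) = Pold(A | B) Pnew(B) + Pold(A | not-B) Pnew(not-B)
p_new_A = p_A_given_B * p_new_B + p_A_given_notB * (1 - p_new_B)
print(f"Pnew(A) = {p_new_A:.3f}")  # 0.520
```

As p_new_B approaches 1, the rule reduces to ordinary conditioning on B, so Bayes' rule of updating is recovered as a limiting case.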

Alternatives to probability kinematics

Probability kinematics is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Skyrms' principle of reflection. It turns out that probability kinematics is a special case of maximum entropy inference. However, maximum entropy is not a generalisation of all such sufficient updating rules.[11]
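
The special-case relationship can be checked numerically. The sketch below, under the same illustrative prior as above, treats maximum entropy inference in its minimum relative entropy form: among all distributions satisfying the single constraint Pnew(B) = 0.8, the one closest to the prior in relative entropy coincides with the Jeffrey update.

```python
import math
import random

# Numerical check that Jeffrey conditioning is the minimum relative entropy
# update under the single constraint Pnew(B) = b_new.
# Atoms: (A&B, A&notB, notA&B, notA&notB); the numbers are illustrative.

prior = [0.30, 0.10, 0.20, 0.40]  # Pold over the four atoms
b_new = 0.8                       # constraint: Pnew(B) = 0.8

def kl(p, q):
    """Relative entropy D(p || q), with the convention 0 log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Jeffrey update: rescale the prior within B and within not-B.
b_old = prior[0] + prior[2]
jeffrey = [prior[0] * b_new / b_old, prior[1] * (1 - b_new) / (1 - b_old),
           prior[2] * b_new / b_old, prior[3] * (1 - b_new) / (1 - b_old)]

# Random search over distributions satisfying the constraint, for comparison.
best = min(
    ([b_new * t, (1 - b_new) * s, b_new * (1 - t), (1 - b_new) * (1 - s)]
     for t, s in ((random.random(), random.random()) for _ in range(100_000))),
    key=lambda p: kl(p, prior),
)

print([round(x, 3) for x in jeffrey])                # [0.48, 0.04, 0.32, 0.16]
print([round(x, 3) for x in best])                   # approximately the same
print(kl(jeffrey, prior) <= kl(best, prior) + 1e-9)  # True
```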


References

  1. Hacking, Ian (1967). "Slightly more realistic personal probability". Philosophy of Science. 34 (4): 311–325. doi:10.1086/288169. S2CID 14344339.
  2. Skyrms, Brian (1987a). "Dynamic coherence and probability kinematics". Philosophy of Science. 54: 1–20. doi:10.1086/289350. S2CID 120881078.
  3. Kyburg, H. (1978). "Subjective probability: Criticisms, reflections and problems". Journal of Philosophical Logic. 7: 157–180. doi:10.1007/bf00245926. S2CID 36972950.
  4. Christensen, D. (1991). "Clever bookies and coherent beliefs". Philosophical Review. 100 (2): 229–247. doi:10.2307/2185301. JSTOR 2185301.
  5. Maher, P. (1992a). Betting on Theories. Cambridge: Cambridge University Press.
  6. Maher, Patrick (1992b). "Diachronic rationality". Philosophy of Science. 59: 120–141. doi:10.1086/289657. S2CID 224830300.
  7. Lewis, C. I. (1946). An Analysis of Knowledge and Valuation. La Salle, Illinois: Open Court. p. 186.
  8. Jeffrey, Richard C. (2004). "Chapter 3". Subjective Probability: The Real Thing. Cambridge: Cambridge University Press.
  9. Skyrms, B. (1996). "The structure of radical probabilism". Erkenntnis. 45 (2–3): 285–297. doi:10.1007/BF00276795.
  10. Jeffrey, Richard (1987). "Alias Smith and Jones: The testimony of the senses". Erkenntnis. 26 (3): 391–399. doi:10.1007/bf00167725. S2CID 121478331.
  11. Skyrms, B. (1987b). "Updating, supposing and MAXENT". Theory and Decision. 22 (3): 225–246. doi:10.1007/bf00134086. S2CID 121847242.
