Author | John Maynard Keynes
---|---
Language | English
Published | 1921
Publication place | England
A Treatise on Probability, [1] published by John Maynard Keynes in 1921, provides a much more general logic of uncertainty than the more familiar and straightforward 'classical' theories of probability. [notes 1] [3] [notes 2] This approach has since become known as "logical-relationist", [5] [notes 3] and the book has come to be regarded as the seminal and still classic account of the logical interpretation of probability (or probabilistic logic), a view of probability continued by such later works as Carnap's Logical Foundations of Probability and E. T. Jaynes's Probability Theory: The Logic of Science. [8]
Keynes's conception of this generalised notion of probability is that it is a strictly logical relation between evidence and hypothesis, a degree of partial implication. His conception was in part anticipated by Bertrand Russell's use of an unpublished version of the work. [9] [notes 4]
In a 1922 review, Bertrand Russell, the co-author of Principia Mathematica, called it "undoubtedly the most important work on probability that has appeared for a very long time," and said that the "book as a whole is one which it is impossible to praise too highly." [17] [notes 5]
With recent developments in machine learning to enable 'artificial intelligence', and in behavioural economics, the need for a logical approach that neither assumes some unattainable 'objectivity' nor relies on the subjective views of its designers or policy-makers has become more widely appreciated, and there has been renewed interest in Keynes's work. [20] [21]
Here Keynes generalises the conventional concept of numerical probabilities to expressions of uncertainty that are not necessarily quantifiable or even comparable. [notes 6] [26]
In Chapter 1 'The Meaning of Probability' Keynes notes that one needs to consider the probability of propositions, not events. [notes 7]
In Chapter 2 'Probability in Relation to the Theory of Knowledge' Keynes considers 'knowledge', 'rational belief' and 'argument' in relation to probability. [29]
In Chapter 3 'The Measurement of Probabilities' he considers probability as a not necessarily precise normalised measure [notes 8] and uses the example of taking an umbrella in case of rain to illustrate the idea that generalised probabilities cannot always be compared.
Is our expectation of rain, when we start out for a walk, always more likely than not, or less likely than not, or as likely as not? I am prepared to argue that on some occasions none of these alternatives hold, and that it will be an arbitrary matter to decide for or against the umbrella. If the barometer is high, but the clouds are black, it is not always rational that one should prevail over the other in our minds, or even that we should balance them, though it will be rational to allow caprice to determine us and to waste no time on the debate. [30]
Chapter 4 'The Principle of Indifference' summarises and develops some objections to the over-use of 'the principle of indifference' (otherwise known as 'the principle of insufficient reason') to justify treating some probabilities as necessarily equal. [notes 9]
In Chapter 5 'Other Methods of Determining Probabilities' Keynes gives some examples of common fallacies, including:
It might plausibly be supposed that evidence would be favourable to our conclusion which is favourable to favourable evidence ... Whilst, however, this argument is frequently employed under conditions, which, if explicitly stated, would justify it, there are also conditions in which this is not so, so that it is not necessarily valid. For the very deceptive fallacy involved in the above supposition, Mr. Johnson has suggested to me the name of the Fallacy of the Middle Term. [33]
He also presents some arguments to justify the use of 'direct judgement' to determine that one probability is greater than another in particular cases. [notes 10]
Chapter 6 'Weight of Argument' develops the idea of 'weight of argument' introduced in Chapter 3 and discusses the relevance of the 'amount' of evidence supporting a given probability judgement. [notes 11] Chapter 3 had already noted the importance of the 'weight' of evidence in addition to any probability:
This comparison turns upon a balance, not between the favourable and the unfavourable evidence, but between the absolute amounts of relevant knowledge and of relevant ignorance respectively.
As the relevant evidence at our disposal increases, the magnitude of the probability of the argument may either decrease or increase, according as the new knowledge strengthens the unfavourable or the favourable evidence; but something seems to have increased in either case, we have a more substantial basis upon which to rest our conclusion. I express this by saying that an accession of new evidence increases the weight of an argument. New evidence will sometimes decrease the probability of an argument, but it will always increase its 'weight.' [37]
Chapter 7 provides a 'Historical Retrospect' while Chapter 8 describes 'The Frequency Theory of Probability', noting some limitations and caveats. In particular, he notes difficulties in establishing 'relevance' [38] and, further, the lack of support that the theory gives for common uses of induction and statistics. [39] [notes 12]
Part 1 concludes with Chapter 9 'The Constructive Theory of Part I. Summarised.' Keynes notes the ground to be covered by the subsequent parts.
This part has been likened to an appendix to Russell and Whitehead's Principia Mathematica. [41] According to Whitehead, Chapter 12, 'The Definition and Axioms of Inference and Probability',
'has the great merit that accompanies good symbolism, that essential points which without it are subtle and easily lost sight of, with it become simple and obvious. Also the axioms are good ... The very certainty and ease by which he is enabled to solve difficult questions and to detect ambiguities and errors in the work of his predecessors exemplifies and at the same time almost conceals that advance which he has made.' [42]
Chapter 14 'The Fundamental Theorems of Probable Inference' gives the main results on the addition, multiplication, independence and relevance of conditional probabilities, leading up to an exposition of the 'inverse principle' (now known as Bayes' rule), incorporating some previously unpublished work by W. E. Johnson that corrects some common textbook errors of formulation and fallacies of interpretation, including 'the fallacy of the middle term'. [43]
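The 'inverse principle' is the now-familiar rule for reversing a conditional probability. A minimal numeric sketch (the diagnostic-test figures below are invented for illustration and are not from the Treatise):

```python
def posterior(prior, likelihood, marginal):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / marginal

# Hypothetical figures: a test with 90% sensitivity, a 5% false-positive
# rate, and a 1% base rate for the condition being tested.
prior = 0.01
sensitivity = 0.90
false_positive = 0.05

# Total probability of a positive result, P(E).
marginal = sensitivity * prior + false_positive * (1 - prior)

p_h_given_e = posterior(prior, sensitivity, marginal)  # about 0.154
```

The small posterior despite the positive result is exactly the kind of 'textbook error of interpretation' (neglecting the prior) that Johnson's formulation guards against.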
In Chapter 15 'Numerical Measurement and Approximation of Probabilities' Keynes develops the formalism of interval estimates as examples of generalised probabilities: intervals that overlap are neither greater than, less than, nor equal to each other. [notes 13]
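This partial ordering can be made concrete in a short sketch (the interval endpoints are invented for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    """A probability known only to lie somewhere within [lo, hi]."""
    lo: float
    hi: float

def compare(a, b):
    """Return '<', '>', '=', or None when the intervals are incomparable."""
    if a.lo == b.lo and a.hi == b.hi:
        return '='
    if a.hi < b.lo:
        return '<'
    if a.lo > b.hi:
        return '>'
    return None  # overlapping intervals: no order relation holds

rain = IntervalProb(0.2, 0.6)
shine = IntervalProb(0.4, 0.8)
verdict = compare(rain, shine)  # None: the two cannot be ranked
```

Only a strict ordering of whole intervals licenses a comparison; overlap leaves the two probabilities incomparable, which is Keynes's point.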
Part 2 concludes with Chapter 17 'Some Problems in Inverse Probability, including Averages'. Keynes's concept of probability is significantly more subject to variation with evidence than the more conventional quantified classical probability. [notes 14]
Here Keynes considers under what circumstances conventional inductive reasoning might be applicable to both conventional and generalised probabilities, and how the results might be interpreted. He concludes that inductive arguments affirm only that 'relative to certain evidence there is a probability in its favour'. [45] [notes 15]
Chapter 21 'The Nature of Inductive Argument Continued' discusses the practical application of induction, particularly within the sciences.
The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be much less simple than the bare principle of Uniformity. They appear to assume something much more like what mathematicians call the principle of the superposition of small effects, or, as I prefer to call it, in this connection, the atomic character of natural law. ... ... Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts. In this case natural law would be organic and not, as it is generally supposed, atomic. [46] [notes 16]
Part 3 concludes with Chapter 23 'Some Historical Notes on Induction'. This notes that Francis Bacon and John Stuart Mill had implicitly made assumptions similar to those Keynes criticised above, but that nevertheless their arguments provide useful insights. [48]
Here Keynes considers some broader issues of application and interpretation. He concludes this part with Chapter 26 'The Application of Probability to Conduct'. Here Keynes notes that the conventional notion of utility as 'mathematical expectation' (summing value times probability) is derived from gambling. He doubts that value is 'subject to the laws of arithmetic' and in any case cites Part 1 as denying that probabilities are. He further notes that often 'weights' are relevant and that in any case it 'assumes that an even chance of heaven or hell is precisely as much to be desired as the certain attainment of a state of mediocrity'. [49] He goes on to expand on these objections to what economists know as the expected utility hypothesis, particularly with regard to extreme cases. [notes 17]
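The 'mathematical expectation' rule that Keynes criticises can be sketched numerically; the payoff values below are invented stand-ins (+100 for 'heaven', -100 for 'hell', 0 for 'a state of mediocrity'):

```python
def expectation(lottery):
    """Mathematical expectation: sum of probability times value
    over a list of (probability, value) pairs."""
    return sum(p * v for p, v in lottery)

# An even chance of heaven or hell ...
even_chance = [(0.5, 100.0), (0.5, -100.0)]

# ... versus the certain attainment of a state of mediocrity.
mediocrity = [(1.0, 0.0)]

# The rule ranks the two as exactly equally desirable,
# which is precisely the consequence Keynes objects to.
equal = expectation(even_chance) == expectation(mediocrity)
```

The equality holds whatever stand-in magnitudes are chosen, so long as they are symmetric; the rule is blind to the spread of outcomes and to the 'weight' of the evidence behind the probabilities.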
Keynes ends by noting:
The chance that a man of 56 taken at random will die within a day ... is practically disregarded by a man of 56 who knows his health to be good. [notes 18]
and
To a stranger the probability that I shall send a letter to the post unstamped may be derived from the statistics of the Post Office; for me those figures would have not the slightest bearing on the situation. [51] [notes 19]
Keynes goes beyond induction to consider statistical inference, particularly as then used by the sciences.
In Chapter 28 'The Law of Great Numbers' Keynes attributes to Poisson the view that 'in the long run ... each class of events does eventually occur in a definite proportion of cases.' [53] He goes on:
The existence of numerous instances of the Law of Great Numbers, or of something of the kind, is absolutely essential for the importance of Statistical Induction. Apart from this the more precise parts of statistics, the collection of facts for the prediction of future frequencies and associations, would be nearly useless. But the 'Law of Great Numbers' is not at all a good name for the principle which underlies Statistical Induction. The 'Stability of Statistical Frequencies' would be a much better name for it. The former suggests, as perhaps Poisson intended to suggest, but what is certainly false, that every class of event shows statistical regularity of occurrence if only one takes a sufficient number of instances of it. It also encourages the method of procedure, by which it is thought legitimate to take any observed degree of frequency or association, which is shown in a fairly numerous set of statistics and to assume with insufficient investigation that, because the statistics are numerous, the observed degree of frequency is therefore stable. Observation shows that some statistical frequencies are, within narrower or wider limits, stable. But stable frequencies are not very common, and cannot be assumed lightly. [54]
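Keynes's contrast between stable and unstable frequencies can be illustrated with a small simulation; the processes and all parameters below are invented for illustration:

```python
import random

def block_frequencies(draws, block):
    """Relative frequency of successes in consecutive blocks of a 0/1 sequence."""
    return [sum(draws[i:i + block]) / block
            for i in range(0, len(draws), block)]

random.seed(0)
n, block = 10_000, 2_500

# A stationary process: every draw has the same chance of success,
# so block frequencies settle near a common value.
stable = [1 if random.random() < 0.5 else 0 for _ in range(n)]

# A drifting process: the chance of success changes part-way through,
# so a large sample gives no single stable frequency.
drifting = ([1 if random.random() < 0.1 else 0 for _ in range(n // 2)]
            + [1 if random.random() < 0.9 else 0 for _ in range(n // 2)])

stable_freqs = block_frequencies(stable, block)
drifting_freqs = block_frequencies(drifting, block)
```

Both sequences are 'fairly numerous', yet only the first exhibits the stability that statistical induction requires; sample size alone, as Keynes insists, does not establish it.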
The key chapter is Chapter 32 'The Inductive Use of Statistical Frequencies for the Determination of Probability a posteriori - The Method of Lexis'. After citing Lexis' observations on both 'subnormal' and 'supernormal' dispersion, he notes that 'a supernormal dispersion [can] also arise out of connexite or organic connection between the successive terms'. [55]
He concludes with Chapter 33, ‘An Outline of a Constructive Theory’. He notes a significant limitation of conventional statistical methods, as then used:
Where there is no stability at all and the frequencies are chaotic, the resulting series can be described as 'non-statistical.' Amongst 'statistical series' we may term 'independent series' those of which the instances are independent and the stability normal, and 'organic series', those of which the instances are mutually dependent and the stability abnormal, whether in excess or in defect. [56]
Keynes also deals with the special case where the conventional notion of probability seems reasonable:
There is a great difference between the proposition "It is probable that every instance of this generalisation is true" and the proposition "It is probable of any instance of this generalisation taken at random that it is true." The latter proposition may remain valid, even if it is certain that some instances of the generalisation are false. It is more likely than not, for example, that any number will be divisible either by two or by three, but it is not more likely than not that all numbers are divisible either by two or by three.
The first type of proposition has been discussed in Part III. under the name of Universal Induction. The latter belongs to Inductive Correlation or Statistical Induction, an attempt at the logical analysis of which must be my final task.
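Keynes's arithmetical illustration can be checked directly; in every run of six consecutive integers, exactly four (those of the form 2k, 3k, 4k, 6k) are divisible by two or three, so the proportion is exactly 2/3 (the sample bound of 6000 below is an arbitrary multiple of six):

```python
# Count integers in 1..N divisible by 2 or by 3; choosing N as a
# multiple of 6 makes the proportion exact rather than approximate.
N = 6000
count = sum(1 for k in range(1, N + 1) if k % 2 == 0 or k % 3 == 0)
proportion = count / N  # exactly 2/3, which exceeds 1/2
```

So 'any instance taken at random' is more likely than not to satisfy the generalisation, while the universal claim is certainly false (1 and 5, for example, are counterexamples).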
His final paragraph reveals Keynes's views on the significance of his findings, based on the then conventional view of classical science as traditionally understood at Cambridge:
In laying the foundations of the subject of Probability, I have departed a good deal from the conception of it which governed the minds of Laplace and Quetelet and has dominated through their influence the thought of the past century, though I believe that Leibniz and Hume might have read what I have written with sympathy. But in taking leave of Probability, I should like to say that, in my judgment, the practical usefulness of those modes of inference, here termed Universal and Statistical Induction, on the validity of which the boasted knowledge of modern science depends, can only exist (and I do not now pause to inquire again whether such an argument must be circular) if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appear more and more clearly as the ultimate result to which material science is tending ... Here, though I have complained sometimes at their want of logic, I am in fundamental sympathy with the deep underlying conceptions of the statistical theory of the day. If the contemporary doctrines of Biology and Physics remain tenable, we may have a remarkable, if undeserved, justification of some of the methods of the traditional Calculus of Probabilities. [notes 20]
The above assumptions of non-organic 'characteristics of atomism and limited variety', and hence the applicability of the then conventional statistical methods, were not to remain credible for long, even for the natural sciences. [58] [59] [60] Some economists, notably in the US, applied some of his ideas in the interwar years, [61] [62] although some philosophers continued to find the work 'very puzzling indeed'. [63] [notes 21] [notes 22]
Keynes had also noted in Chapter 21 the limitations of 'mathematical expectation' for 'rational' decision-making. [67] [68] He developed this point in his better-known General Theory of Employment, Interest and Money and subsequently, specifically in his thinking on the nature and role of long-term expectation in economics, [69] notably on animal spirits. [70] [notes 23]
Keynes's ideas found practical application by Turing and Good at Bletchley Park during WWII, a practice that formed the basis for the subsequent development of 'modern Bayesian probability', [73] and the notion of imprecise probabilities is now well established in statistics, with a wide range of important applications. [74] [notes 24]
The significance of 'true' uncertainty beyond precise probabilities had already been highlighted by Frank Knight, [76] and the additional insights of Keynes tended to be overlooked. [notes 25] From the late 1960s onwards even this limited aspect began to be less appreciated by economists, and was even disregarded or discounted by many 'Keynesian' economists. [78] After the financial crashes of 2007-9, 'mainstream economics' was regarded as having been 'further away' from Keynes's ideas than ever before. [79] But subsequently there was a partial 'return of the master', [3] leading to calls for a 'paradigm shift' building further on Keynes's insights into 'the nature of behaviour under conditions of uncertainty'. [80]
The centenary event organised by the University of Oxford and supported by The Alan Turing Institute for the Treatise and Frank Knight's Risk, Uncertainty, and Profit noted: [81]
In Risk, Uncertainty, and Profit, Knight put forward the vital difference between risk, where empirical evaluation of unknown outcomes can still be applicable, and uncertainty, where no quantified measurement is valid but subjective estimate. In A Treatise on Probability, Keynes argued that the concept of probability should be about the logical implication from premises to hypotheses, in contrast to the classical quantified perspective of probability.
The fundamental uncertainty proposed in both works has then deeply influenced the development of economic and probability theory in the past century and it still resonates with our lives today, considering the ups and downs that the world economy is experiencing.
However, it has often been regarded as more philosophical in nature, despite its extensive mathematical formulations and implications for practice. [82] [83] [8]