In economics, game theory, and decision theory, the expected utility hypothesis—concerning people's preferences with regard to choices that have uncertain outcomes (gambles)—states that the subjective value associated with an individual's gamble is the statistical expectation of that individual's valuations of the outcomes of that gamble, where these valuations may differ from the dollar value of those outcomes. Daniel Bernoulli's treatment of the St. Petersburg paradox in 1738 is considered the beginning of the hypothesis. The hypothesis has proven useful in explaining some popular choices that seem to contradict the expected value criterion (which takes into account only the sizes of the payouts and the probabilities of occurrence), such as occur in the contexts of gambling and insurance.
Economics is the social science that studies the production, distribution, and consumption of goods and services.
Game theory is the study of mathematical models of strategic interaction among rational decision-makers. It has applications in all fields of social science, as well as in logic, systems science, and computer science. Originally, it addressed zero-sum games, in which each participant's gains or losses are exactly balanced by those of the other participants. Today, game theory applies to a wide range of behavioral relations, and is now an umbrella term for the science of logical decision making in humans, animals, and computers.
Decision theory is the study of an agent's choices. Decision theory can be broken into two branches: normative decision theory, which analyzes the outcomes of decisions or determines the optimal decisions given constraints and assumptions, and descriptive decision theory, which analyzes how agents actually make the decisions they do.
The von Neumann–Morgenstern utility theorem provides necessary and sufficient conditions under which the expected utility hypothesis holds. From relatively early on, it was accepted that some of these conditions would be violated by real decision-makers in practice but that the conditions could be interpreted nonetheless as 'axioms' of rational choice.
In decision theory, the von Neumann-Morgenstern utility theorem shows that, under certain axioms of rational behavior, a decision-maker faced with risky (probabilistic) outcomes of different choices will behave as if he or she is maximizing the expected value of some function defined over the potential outcomes at some specified point in the future. This function is known as the von Neumann-Morgenstern utility function. The theorem is the basis for expected utility theory.
An axiom or postulate is a statement that is taken to be true, to serve as a premise or starting point for further reasoning and arguments. The word comes from the Greek axíōma (ἀξίωμα) 'that which is thought worthy or fit' or 'that which commends itself as evident.'
Rationality is the quality or state of being rational – that is, being based on or agreeable to reason. Rationality implies the conformity of one's beliefs with one's reasons to believe, and of one's actions with one's reasons for action. "Rationality" has different specialized meanings in philosophy, economics, sociology, psychology, evolutionary biology, game theory and political science.
Until the mid-twentieth century, the standard term for the expected utility was the moral expectation, contrasted with "mathematical expectation" for the expected value.
Bernoulli encountered expected utility through the St. Petersburg paradox. In this game, a fair coin is flipped until it first lands heads; if the first heads appears on the nth flip, the player receives 2^n dollars. The game highlighted the gap between what people were willing to pay to play and what they could expect to gain from it.
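The game described above can be sketched in a short simulation (a minimal illustration, not from the original text):

```python
import random

def st_petersburg_payout(rng=random.Random(0)):
    """Play one round: flip a fair coin until the first heads appears;
    the payout is 2**n dollars, where n is the number of flips needed."""
    n = 1
    while rng.random() < 0.5:  # tails with probability 1/2, so flip again
        n += 1
    return 2 ** n

# Most individual payouts are small, yet the theoretical expected value
# is infinite -- the heart of the paradox.
payouts = [st_petersburg_payout() for _ in range(100_000)]
```

Averaging the simulated payouts converges very slowly (if at all), which is one empirical face of the infinite expected value.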
When the entity x whose value affects a person's utility takes on one of a set of discrete values, the formula for expected utility, which is assumed to be maximized, is

U(p) = \sum_i u(x_i)\, p_i

where the left side U(p) is the subjective valuation of the gamble as a whole, x_i is the ith possible outcome, u(x_i) is its valuation, and p_i is its probability. There could be either a finite set of possible values, in which case the right side of this equation has a finite number of terms, or an infinite set of discrete values, in which case the right side has an infinite number of terms.
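As a small sketch of the discrete formula (the square-root valuation is a hypothetical choice, not from the original):

```python
import math

def expected_utility(outcomes, probs, u):
    """Expected utility of a discrete gamble: the sum of u(x_i) * p_i."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(u(x) * p for x, p in zip(outcomes, probs))

# Hypothetical gamble: $0 with probability 0.5, $100 with probability 0.5,
# valued with the concave utility u(x) = sqrt(x).
eu = expected_utility([0, 100], [0.5, 0.5], math.sqrt)  # 0.5 * 10 = 5.0
```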
When x can take on any of a continuous range of values, the expected utility is given by

E[u(x)] = \int_{-\infty}^{\infty} u(x)\, f(x)\, dx

where f(x) is the probability density function of x.
In probability theory, a probability density function (PDF), or density of a continuous random variable, is a function whose value at any given sample in the sample space can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample. In other words, while the absolute likelihood for a continuous random variable to take on any particular value is 0, the value of the PDF at two different samples can be used to infer, in any particular draw of the random variable, how much more likely it is that the random variable would equal one sample compared to the other sample.
In the presence of risky outcomes, a human decision maker does not always choose the option with the higher expected value. For example, suppose there is a choice between a guaranteed payment of $1.00 and a gamble in which the probability of getting a $100 payment is 1 in 80 and the alternative, far more likely outcome (79 out of 80) is receiving $0. The expected value of the first alternative is $1.00 and the expected value of the second alternative is $1.25. According to expected value theory, people should choose the $100-or-nothing gamble; however, as stressed by expected utility theory, some people are risk averse enough to prefer the sure thing, despite its lower expected value. People with less risk aversion would choose the riskier, higher-expected-value gamble. Choices like these motivated the development of utility theory.
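The numbers in the example above can be checked directly; the square-root utility used here is one hypothetical concave valuation, chosen only for illustration:

```python
import math

# The two alternatives from the example above.
p_win, prize, sure = 1 / 80, 100.0, 1.00

ev_sure = sure              # $1.00
ev_gamble = p_win * prize   # 100/80 = $1.25, higher expected value

# Under a concave (risk-averse) utility such as u(x) = sqrt(x),
# the sure dollar nevertheless carries higher expected utility.
eu_sure = math.sqrt(sure)             # 1.0
eu_gamble = p_win * math.sqrt(prize)  # (1/80) * 10 = 0.125
```

So an expected-value maximizer takes the gamble, while this risk-averse agent takes the sure payment.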
Nicolas Bernoulli described the St. Petersburg paradox (involving infinite expected values) in 1713, prompting two Swiss mathematicians to develop expected utility theory as a solution. The theory can also more accurately describe more realistic scenarios (where expected values are finite) than expected value alone. In 1728, Gabriel Cramer, in a letter to Nicolas Bernoulli, wrote, "the mathematicians estimate money in proportion to its quantity, and men of good sense in proportion to the usage that they may make of it."
In 1738, Nicolas' cousin Daniel Bernoulli published the canonical 18th-century description of this solution in Specimen theoriae novae de mensura sortis (Exposition of a New Theory on the Measurement of Risk). Daniel Bernoulli proposed that a nonlinear function of the utility of an outcome should be used instead of the expected value of the outcome, accounting for risk aversion, where the risk premium is higher for low-probability events than the difference between the payout level of a particular outcome and its expected value. Bernoulli further proposed that the gambler's goal was not to maximize his expected gain but instead to maximize the logarithm of his gain.
Bernoulli's paper was the first formalization of marginal utility, which has broad application in economics in addition to expected utility theory. He used this concept to formalize the idea that the same amount of additional money was less useful to an already-wealthy person than it would be to a poor person.
The St. Petersburg paradox (named after the journal in which Bernoulli's paper was published) arises when there is no upper bound on the potential rewards from very low probability events. Because some probability distribution functions have an infinite expected value, an expected-wealth maximizing person would pay an arbitrarily large finite amount to take this gamble. In real life, people do not do this.
Bernoulli proposed a solution to this paradox in his paper: the utility function used in real life means that the expected utility of the gamble is finite, even if its expected value is infinite. (Thus he hypothesized diminishing marginal utility of increasingly larger amounts of money.) It has also been resolved differently by other economists by proposing that very low probability events are neglected, by taking into account the finite resources of the participants, or by noting that one simply cannot buy that which is not sold (and that sellers would not produce a lottery whose expected loss to them were unacceptable).
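A short numerical sketch of Bernoulli's resolution: with logarithmic utility, the divergent expected value of the St. Petersburg game becomes a finite expected utility (the series sum of n/2^n equals 2, so the limit is 2 ln 2):

```python
import math

# Expected payoff diverges: sum over n of (1/2)**n * 2**n = 1 + 1 + 1 + ...
# With log utility the series converges:
#   sum over n of (1/2)**n * ln(2**n) = ln(2) * sum of n / 2**n = 2 * ln(2).
eu = sum((0.5 ** n) * (n * math.log(2)) for n in range(1, 200))
```

Truncating at n = 199 already agrees with the limit 2 ln 2 to many decimal places, since the tail terms shrink geometrically.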
In the 1950s Leonard Jimmie Savage, an American statistician, derived a framework for comprehending expected utility. At that point, it was considered the first and most thorough foundation to understanding the concept. Savage's framework involved proving that expected utility could be used to make an optimal choice among several acts through seven postulates (notated as P1-P7).
Savage's framework has since been used in neo-Bayesian statistics (see Bayesian probability) and the field of applied statistics.
There are four axioms of the expected utility theory that define a rational decision maker. They are completeness, transitivity, independence and continuity.
Completeness assumes that an individual has well-defined preferences and can always decide between any two alternatives.
This means that the individual either prefers A to B, or is indifferent between A and B, or prefers B to A.
Transitivity assumes that, as an individual decides according to the completeness axiom, the individual also decides consistently.
Independence of irrelevant alternatives pertains to well-defined preferences as well. It assumes that two gambles mixed with an irrelevant third one will maintain the same order of preference as when the two are presented independently of the third one. The independence axiom is the most controversial of the four.
Continuity assumes that when there are three lotteries (A, B and C) and the individual prefers A to B and B to C, then there should be a possible combination of A and C in which the individual is then indifferent between this mix and the lottery B.
If all these axioms are satisfied, then the individual is said to be rational and the preferences can be represented by a utility function, i.e. one can assign numbers (utilities) to each outcome of the lottery such that choosing the best lottery according to the preference amounts to choosing the lottery with the highest expected utility. This result is called the von Neumann–Morgenstern utility representation theorem.
In other words, if an individual's behavior always satisfies the above axioms, then there is a utility function such that the individual will choose one gamble over another if and only if the expected utility of one exceeds that of the other. The expected utility of any gamble may be expressed as a linear combination of the utilities of the outcomes, with the weights being the respective probabilities. Utility functions are also normally continuous functions. Such utility functions are also referred to as von Neumann–Morgenstern (vNM) utility functions. This is a central theme of the expected utility hypothesis in which an individual chooses not the highest expected value, but rather the highest expected utility. The expected utility maximizing individual makes decisions rationally based on the axioms of the theory.
The von Neumann–Morgenstern formulation is important in the application of set theory to economics because it was developed shortly after the Hicks–Allen "ordinal revolution" of the 1930s, and it revived the idea of cardinal utility in economic theory. However, while in this context the utility function is cardinal, in that implied behavior would be altered by a non-linear monotonic transformation of utility, the expected utility function is ordinal because any monotonic increasing transformation of expected utility gives the same behavior.
The expected utility theory takes into account that individuals may be risk-averse, meaning that the individual would refuse a fair gamble (a fair gamble has an expected value of zero). Risk aversion implies that their utility functions are concave and show diminishing marginal wealth utility. The risk attitude is directly related to the curvature of the utility function: risk neutral individuals have linear utility functions, while risk seeking individuals have convex utility functions and risk averse individuals have concave utility functions. The degree of risk aversion can be measured by the curvature of the utility function.
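The link between curvature and risk attitude can be illustrated with a fair gamble around a wealth level (the specific gamble and the three utility functions here are illustrative choices, not from the original):

```python
import math

def eu(outcomes, probs, u):
    """Expected utility of a discrete gamble."""
    return sum(u(x) * p for x, p in zip(outcomes, probs))

# A fair gamble around current wealth w = 100: win or lose 50 with equal odds.
w = 100.0
gamble = ([w - 50, w + 50], [0.5, 0.5])

concave = math.log           # risk averse: rejects the fair gamble
linear = lambda x: x         # risk neutral: indifferent
convex = lambda x: x ** 2    # risk seeking: accepts the fair gamble
```

By Jensen's inequality the concave agent's expected utility of the gamble lies below the utility of keeping w for sure, and the convex agent's lies above it.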
Since risk attitudes are unchanged under affine transformations of u, the second derivative u'' is not an adequate measure of the risk aversion of a utility function. Instead, it needs to be normalized. This leads to the definition of the Arrow–Pratt measure of absolute risk aversion:

\mathrm{ARA}(w) = -\frac{u''(w)}{u'(w)}

where w is wealth.
The Arrow–Pratt measure of relative risk aversion is:

\mathrm{RRA}(w) = -w\,\frac{u''(w)}{u'(w)} = w \cdot \mathrm{ARA}(w)
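As a quick check of these definitions, applying them to Bernoulli's logarithmic utility gives:

```latex
u(w) = \ln w, \qquad u'(w) = \frac{1}{w}, \qquad u''(w) = -\frac{1}{w^2},
\\[4pt]
\mathrm{ARA}(w) = -\frac{u''(w)}{u'(w)} = \frac{1}{w}, \qquad
\mathrm{RRA}(w) = w \cdot \mathrm{ARA}(w) = 1 .
```

So logarithmic utility has decreasing absolute risk aversion but constant relative risk aversion, consistent with its classification below.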
Special classes of utility functions are the CRRA (constant relative risk aversion) functions, where RRA(w) is constant, and the CARA (constant absolute risk aversion) functions, where ARA(w) is constant. They are often used in economics for simplification.
A decision that maximizes expected utility also maximizes the probability of the decision's consequences being preferable to some uncertain threshold (Castagnoli and LiCalzi, 1996; Bordley and LiCalzi, 2000; Bordley and Kirkwood). In the absence of uncertainty about the threshold, expected utility maximization simplifies to maximizing the probability of achieving some fixed target. If the uncertainty is uniformly distributed, then expected utility maximization becomes expected value maximization. Intermediate cases lead to increasing risk aversion above some fixed threshold and increasing risk seeking below a fixed threshold.
The utility function u(w) = log(w) was originally suggested by Bernoulli (see above). It has relative risk aversion constant and equal to one, and is still sometimes assumed in economic analyses. The utility function

u(w) = -e^{-aw}

exhibits constant absolute risk aversion, and for this reason is often avoided, although it has the advantage of offering substantial mathematical tractability when asset returns are normally distributed. Note that, as per the affine transformation property alluded to above, the utility function 1 - e^{-aw} gives exactly the same preference orderings as does -e^{-aw}; thus it is irrelevant that the values of -e^{-aw} and its expected value are always negative: what matters for preference ordering is which of two gambles gives the higher expected utility, not the numerical values of those expected utilities.
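The affine-invariance point can be verified numerically; the CARA coefficient and the two gambles below are hypothetical choices for illustration:

```python
import math

a = 0.01  # hypothetical CARA coefficient

u = lambda w: -math.exp(-a * w)      # always negative
v = lambda w: 1 - math.exp(-a * w)   # affine transform: v = u + 1

def eu(gamble, util):
    """Expected utility of a gamble given as (wealth, probability) pairs."""
    return sum(util(w) * p for w, p in gamble)

sure = [(100.0, 1.0)]
risky = [(0.0, 0.5), (250.0, 0.5)]

# Both indices rank the two gambles identically, negative values or not.
prefers_sure_u = eu(sure, u) > eu(risky, u)
prefers_sure_v = eu(sure, v) > eu(risky, v)
```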
The class of constant relative risk aversion utility functions contains three categories. Bernoulli's utility function

u(w) = \log(w)

has relative risk aversion equal to 1. The functions

u(w) = w^{\alpha}

for \alpha \in (0, 1) have relative risk aversion equal to 1 - \alpha. And the functions

u(w) = -w^{-\alpha}

for \alpha > 0 have relative risk aversion equal to 1 + \alpha.
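Differentiating confirms the stated coefficients; for example, for the power-utility case:

```latex
u(w) = w^{\alpha}: \qquad
u'(w) = \alpha w^{\alpha-1}, \qquad
u''(w) = \alpha(\alpha-1)\, w^{\alpha-2},
\\[4pt]
\mathrm{RRA}(w) = -\,w\,\frac{u''(w)}{u'(w)}
= -\,w \cdot \frac{\alpha-1}{w} = 1 - \alpha .
```

An analogous computation for u(w) = -w^{-\alpha} yields RRA(w) = 1 + \alpha; in every case RRA is independent of w, which is what "constant relative risk aversion" means.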
See also the discussion of utility functions having hyperbolic absolute risk aversion (HARA).
Often people refer to "risk" in the sense of a potentially quantifiable entity. In the context of mean-variance analysis, variance is used as a risk measure for portfolio return; however, this is only valid if returns are normally distributed or otherwise jointly elliptically distributed, or in the unlikely case in which the utility function has a quadratic form. However, David E. Bell proposed a measure of risk which follows naturally from a certain class of von Neumann–Morgenstern utility functions. Let utility of wealth be given by

u(w) = w - b\,e^{-aw}

for individual-specific positive parameters a and b. Then expected utility is given by

E[u(w)] = E[w] - b\,E[e^{-aw}].
Thus the risk measure is E[e^{-aw}], which differs between two individuals if they have different values of the parameter a, allowing different people to disagree about the degree of risk associated with any given portfolio. Individuals sharing a given risk measure (based on a given value of a) may choose different portfolios because they may have different values of b. See also Entropic risk measure.
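A small numerical sketch of Bell's decomposition; the parameter values and the discrete gamble are hypothetical:

```python
import math

# Hypothetical individual-specific parameters and a discrete wealth gamble.
a, b = 0.02, 5.0
gamble = [(50.0, 0.25), (100.0, 0.5), (150.0, 0.25)]

expected_wealth = sum(w * p for w, p in gamble)
risk = sum(math.exp(-a * w) * p for w, p in gamble)  # the risk measure E[exp(-a w)]

# Bell's utility u(w) = w - b*exp(-a*w) separates expected utility into
# an expected-wealth term and a risk term:
eu = expected_wealth - b * risk
direct = sum((w - b * math.exp(-a * w)) * p for w, p in gamble)
```

The two computations agree, showing how preferences split into an expected-value component and a single scalar risk component under this utility class.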
For general utility functions, however, expected utility analysis does not permit the expression of preferences to be separated into two parameters with one representing the expected value of the variable in question and the other representing its risk.
Expected utility theory is a theory about how to make optimal decisions under risk. It has a normative interpretation, which economists once thought applied in all situations to rational agents but which they now tend to regard as a useful and insightful first-order approximation. In empirical applications, a number of violations have been shown to be systematic, and these falsifications have deepened understanding of how people actually decide. In 1979, Daniel Kahneman and Amos Tversky presented their prospect theory, which showed empirically, among other things, how preferences of individuals are inconsistent among the same choices, depending on how those choices are presented.
Like any mathematical model, expected utility theory is an abstraction and simplification of reality. The mathematical correctness of expected utility theory and the salience of its primitive concepts do not guarantee that expected utility theory is a reliable guide to human behavior or optimal practice.
The mathematical clarity of expected utility theory has helped scientists design experiments to test its adequacy, and to distinguish systematic departures from its predictions. This has led to the field of behavioral finance, which has produced deviations from expected utility theory to account for the empirical facts.
It is well established that humans find logic hard, mathematics harder, and probability even more challenging. Psychologists have discovered systematic violations of probability calculations and behavior by humans. Consider, for example, the Monty Hall problem.
In updating probability distributions using evidence, a standard method uses conditional probability, namely the rule of Bayes. An experiment on belief revision has suggested that humans change their beliefs faster when using Bayesian methods than when using informal judgment.
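The rule of Bayes mentioned above can be illustrated for a binary hypothesis; the base rate and error rates below are hypothetical numbers chosen for illustration:

```python
# Bayes' rule for a binary hypothesis H given evidence E:
#   P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|not H) * P(not H).
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Hypothetical numbers: a 1% base rate, a test with 90% sensitivity
# and a 5% false-positive rate.
posterior = bayes_update(0.01, 0.90, 0.05)
```

Even a fairly accurate test yields a posterior well under 20% here, the kind of result informal judgment tends to get wrong (base-rate neglect).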
Behavioral finance has produced several generalized expected utility theories to account for instances where people's choices deviate from those predicted by expected utility theory. These deviations are described as "irrational" because they can depend on the way the problem is presented, not on the actual costs, rewards, or probabilities involved.
Particular theories include prospect theory, rank-dependent expected utility and cumulative prospect theory and SP/A theory.
Starting with studies such as Lichtenstein & Slovic (1971), it was discovered that subjects sometimes exhibit signs of preference reversals with regard to their certainty equivalents of different lotteries. Specifically, when eliciting certainty equivalents, subjects tend to value "p bets" (lotteries with a high chance of winning a low prize) lower than "$ bets" (lotteries with a small chance of winning a large prize). When subjects are asked which lotteries they prefer in direct comparison, however, they frequently prefer the "p bets" over the "$ bets". Many studies have examined this "preference reversal", from both an experimental (e.g., Plott & Grether, 1979) and theoretical (e.g., Holt, 1986) standpoint, indicating that this behavior can be brought into accordance with neoclassical economic theory under specific assumptions.
If one is using the frequentist notion of probability, where probabilities are considered to be fixed values, then applying expected value and expected utility to decision-making requires knowing the probabilities of various outcomes. However, in practice there will be many situations where the probabilities are unknown, and one is operating under uncertainty. In economics, Knightian uncertainty or ambiguity may occur. Thus one must make assumptions about the probabilities, but then the expected values of various decisions can be very sensitive to the assumptions. This is particularly a problem when the expectation is dominated by rare extreme events, as in a long-tailed distribution.
Alternative decision techniques are robust to uncertainty of probability of outcomes, either not depending on probabilities of outcomes and only requiring scenario analysis (as in minimax or minimax regret), or being less sensitive to assumptions.
Bayesian approaches to probability treat it as a degree of belief and thus they do not draw a distinction between risk and a wider concept of uncertainty: they deny the existence of Knightian uncertainty. They would model uncertain probabilities with hierarchical models, i.e. where the uncertain probabilities are modelled as distributions whose parameters are themselves drawn from a higher-level distribution (hyperpriors).
Within economics, the concept of utility is used to model worth or value. Its usage has evolved significantly over time. The term was introduced initially as a measure of pleasure or satisfaction within the theory of utilitarianism by moral philosophers such as Jeremy Bentham and John Stuart Mill. The term has been adapted and reapplied within neoclassical economics, which dominates modern economic theory, as a utility function that represents a consumer's preference ordering over a choice set. It is devoid of its original interpretation as a measurement of the pleasure or satisfaction obtained by the consumer from that choice.
Prospect theory is an economic theory developed by Daniel Kahneman and Amos Tversky in 1979. It challenges the expected utility theory developed by John von Neumann and Oskar Morgenstern in 1944, and it earned Daniel Kahneman the Nobel Memorial Prize in Economics in 2002. It is the founding theory of behavioral economics and of behavioral finance, and it constitutes one of the first economic theories built using experimental methods.
In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its negative, in which case it is to be maximized.
The St. Petersburg paradox or St. Petersburg lottery is a paradox related to probability and decision theory in economics. It is based on a particular (theoretical) lottery game that leads to a random variable with infinite expected value but nevertheless seems to be worth only a very small amount to the participants. The St. Petersburg paradox is a situation where a naive decision criterion which takes only the expected value into account predicts a course of action that presumably no actual person would be willing to take. Several resolutions are possible.
In decision theory, subjective expected utility is the attractiveness of an economic opportunity as perceived by a decision-maker in the presence of risk. Characterizing the behavior of decision-makers as using subjective expected utility was promoted and axiomatized by L. J. Savage in 1954 following previous work by Ramsey and von Neumann. The theory of subjective expected utility combines two subjective concepts: first, a personal utility function, and second a personal probability distribution.
In economics, a cardinal utility function or scale is a utility index that preserves preference orderings uniquely up to positive affine transformations. Two utility indices are related by an affine transformation if for the value of one index u, occurring at any quantity of the goods bundle being evaluated, the corresponding value of the other index v satisfies a relationship of the form

v = a u + b,

where a and b are constants with a > 0.
The Ellsberg paradox is a paradox in decision theory in which people's choices violate the postulates of subjective expected utility. It is generally taken to be evidence for ambiguity aversion. The paradox was popularized by Daniel Ellsberg, although a version of it was noted considerably earlier by John Maynard Keynes.
In probability theory and intertemporal portfolio choice, the Kelly criterion, Kelly strategy, Kelly formula, or Kelly bet is a formula for bet sizing that leads almost surely to higher wealth compared to any other strategy in the long run. The Kelly bet size is found by maximizing the expected value of the logarithm of wealth, which is equivalent to maximizing the expected geometric growth rate. The Kelly Criterion is to bet a predetermined fraction of assets, and it can be counterintuitive. It was described by J. L. Kelly, Jr, a researcher at Bell Labs, in 1956. The practical use of the formula has been demonstrated.
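For the simplest case of a binary bet, maximizing the expected logarithm of wealth gives a closed-form Kelly fraction; the numbers below are a hypothetical example:

```python
# Kelly fraction for a binary bet: win b per unit staked with probability p,
# lose the stake with probability q = 1 - p. Maximizing E[log(wealth)]
# over the stake fraction f gives the closed form f* = p - q / b.
def kelly_fraction(p, b):
    q = 1 - p
    return p - q / b

# Hypothetical even-money bet (b = 1) with a 60% win probability.
f = kelly_fraction(0.60, 1.0)  # 0.60 - 0.40 = 0.20
```

Note the counterintuitive consequence mentioned above: even with a clear edge, the formula prescribes staking only a fixed fraction (here 20%) of wealth, never the whole bankroll.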
The Allais paradox is a choice problem designed by Maurice Allais (1953) to show an inconsistency of actual observed choices with the predictions of expected utility theory.
Stochastic dominance is a partial order between random variables. It is a form of stochastic ordering. The concept arises in decision theory and decision analysis in situations where one gamble can be ranked as superior to another gamble for a broad class of decision-makers. It is based on shared preferences regarding sets of possible outcomes and their associated probabilities. Only limited knowledge of preferences is required for determining dominance. Risk aversion is a factor only in second order stochastic dominance.
In decision theory and economics, ambiguity aversion is a preference for known risks over unknown risks. An ambiguity-averse individual would rather choose an alternative where the probability distribution of the outcomes is known over one where the probabilities are unknown. This behavior was first introduced through the Ellsberg paradox.
Cumulative prospect theory (CPT) is a model for descriptive decisions under risk and uncertainty which was introduced by Amos Tversky and Daniel Kahneman in 1992. It is a further development and variant of prospect theory. The difference between this version and the original version of prospect theory is that weighting is applied to the cumulative probability distribution function, as in rank-dependent expected utility theory but not applied to the probabilities of individual outcomes. In 2002, Daniel Kahneman received the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel for his contributions to behavioral economics, in particular the development of Cumulative Prospect Theory (CPT).
In decision theory, economics, and finance, a two-moment decision model is a model that describes or prescribes the process of making decisions in a context in which the decision-maker is faced with random variables whose realizations cannot be known in advance, and in which choices are made based on knowledge of two moments of those random variables. The two moments are almost always the mean—that is, the expected value, which is the first moment about zero—and the variance, which is the second moment about the mean.
In finance, economics, and decision theory, hyperbolic absolute risk aversion (HARA) refers to a type of risk aversion that is particularly convenient to model mathematically and to obtain empirical predictions from. It refers specifically to a property of von Neumann–Morgenstern utility functions, which are typically functions of final wealth, and which describe a decision-maker's degree of satisfaction with the outcome for wealth. The final outcome for wealth is affected both by random variables and by decisions. Decision-makers are assumed to make their decisions so as to maximize the expected value of the utility function.
In economics and other social sciences, preference is the order that a person gives to alternatives based on their relative utility, a process which results in an optimal "choice". Instead of the prices of goods, personal income, or availability of goods, the character of the preferences is determined purely by a person's tastes. However, persons are still expected to act in their best interest.
In expected utility theory, a lottery is a discrete distribution of probability on a set of states of nature. The elements of a lottery correspond to the probabilities that each of the states of nature will occur. Much of the theoretical analysis of choice under uncertainty involves characterizing the available choices in terms of lotteries.
Risk aversion is a preference for a sure outcome over a gamble with higher or equal expected value. Conversely, the rejection of a sure thing in favor of a gamble of lower or equal expected value is known as risk-seeking behavior.
In decision theory, a multi-attribute utility function is used to represent the preferences of an agent over bundles of goods either under conditions of certainty about the results of any potential choice, or under conditions of uncertainty.