In economics, a random utility model (RUM), [1] [2] also called a stochastic utility model, [3] is a mathematical description of the preferences of a person whose choices are not deterministic, but depend on a random state variable.
A basic assumption in classic economics is that the choices of a rational person are guided by a preference relation, which can usually be described by a utility function. When faced with several alternatives, the person will choose the alternative with the highest utility. The utility function is not visible; however, by observing the choices made by the person, we can "reverse-engineer" their utility function. This is the goal of revealed preference theory.[ citation needed ]
In practice, however, people are not always rational: ample empirical evidence shows that, when faced with the same set of alternatives, people may make different choices. [4] [5] [6] [7] [8] To an outside observer, their choices may appear random.
One way to model this behavior is called stochastic rationality. It is assumed that each agent has an unobserved state, which can be considered a random variable. Given that state, the agent behaves rationally. In other words, each agent has not a single preference relation, but a probability distribution over preference relations (or utility functions).[ citation needed ]
Block and Marschak [9] presented the following problem. Suppose we are given as input a set of choice probabilities P_{a,B}, describing the probability that an agent chooses alternative a from the set B. We want to rationalize the agent's behavior by a probability distribution over preference relations. That is: we want to find a distribution such that, for all pairs a,B given in the input, P_{a,B} = Prob[a is weakly preferred to all alternatives in B]. What conditions on the set of probabilities P_{a,B} guarantee the existence of such a distribution?[ citation needed ]
Falmagne [10] solved this problem for the case in which the set of alternatives is finite: he proved that a probability distribution exists iff a set of polynomials derived from the choice probabilities, called the Block–Marschak polynomials, are all nonnegative. His solution is constructive, and provides an algorithm for computing the distribution.
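Concretely, one standard statement of the result defines, for every alternative a and every menu A containing a, the polynomial q(a, A) = Σ_{B ⊇ A} (−1)^{|B∖A|} P_{a,B}; a rationalizing distribution exists iff all these values are nonnegative. A minimal sketch of this check (the helper names are illustrative, and the formulation follows one common statement of the theorem):

```python
from itertools import combinations

def powerset(xs):
    """All subsets of xs, as frozensets."""
    xs = list(xs)
    return [frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

def choice_probs(rankings, X):
    """Choice probabilities P_{a,B} induced by a distribution over strict
    rankings (each ranking is a tuple, best alternative first)."""
    p = {}
    for B in powerset(X):
        if not B:
            continue
        for a in B:
            # a is chosen from B under ranking r iff a comes first in r among B
            p[(a, B)] = sum(pr for r, pr in rankings.items()
                            if min(B, key=r.index) == a)
    return p

def block_marschak(p, X):
    """Block-Marschak polynomials q(a, A): alternating sum of P_{a,B}
    over all menus B that contain A."""
    nonempty = [B for B in powerset(X) if B]
    q = {}
    for A in nonempty:
        for a in A:
            q[(a, A)] = sum((-1) ** len(B - A) * p[(a, B)]
                            for B in nonempty if A <= B)
    return q

# A distribution over two rankings of {x, y, z}; since these choice
# probabilities come from a random utility model by construction,
# every Block-Marschak polynomial should be nonnegative.
X = frozenset({'x', 'y', 'z'})
rankings = {('x', 'y', 'z'): 0.5, ('z', 'y', 'x'): 0.5}
p = choice_probs(rankings, X)
q = block_marschak(p, X)
assert min(q.values()) >= -1e-12
```

Running the check in the opposite direction, i.e. recovering the distribution itself from nonnegative polynomials, is the constructive part of Falmagne's proof and is not shown here.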
Barbera and Pattanaik [11] extended this result to settings in which the agent may choose sets of alternatives, rather than just singletons.
Block and Marschak [9] proved that, when there are at most 3 alternatives, the random utility model is unique ("identified"); however, when there are 4 or more alternatives, the model may be non-unique. [11] For example, [12] we can compute the probability that the agent prefers w to x (w>x), and the probability that y>z, but may not be able to know the probability that both w>x and y>z. There are even pairs of distributions with disjoint supports that induce the same set of choice probabilities.
Some conditions for uniqueness were given by Falmagne. [10] Turansick [13] presents two characterizations for the existence of a unique random utility representation.
There are various RUMs, which differ in the assumptions on the probability distributions of the agent's utility. A popular RUM was developed by Luce [14] and Plackett. [15]
The Plackett–Luce model has been applied in econometrics, [16] for example, to analyze automobile prices in market equilibrium. [17] It has also been applied in machine learning and information retrieval, [18] and in social choice, to analyze an opinion poll conducted during the Irish presidential election. [19] Efficient methods for expectation maximization and expectation propagation exist for the Plackett–Luce model. [20] [21] [22]
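In the Plackett–Luce model, each alternative i has a positive "worth" w_i, and a ranking is built top-down: at each step, the next item is chosen with probability proportional to its worth among the items still unranked. A minimal sketch, with illustrative worth values:

```python
import math

def plackett_luce_logprob(ranking, w):
    """Log-probability of observing `ranking` (best first) under a
    Plackett-Luce model with positive worth parameters `w`."""
    remaining = list(ranking)
    logp = 0.0
    while len(remaining) > 1:
        top = remaining.pop(0)
        # each factor: worth of the chosen item over the total remaining worth
        logp += math.log(w[top]) - math.log(w[top] + sum(w[i] for i in remaining))
    return logp

# With worths 2:1:1, P(a first) = 2/4 and, given a, P(b next) = 1/2,
# so P(ranking a, b, c) = 1/4.
w = {'a': 2.0, 'b': 1.0, 'c': 1.0}
p = math.exp(plackett_luce_logprob(('a', 'b', 'c'), w))
```

Working in log space is the usual choice here, since likelihoods of long rankings underflow quickly when multiplied directly.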
RUMs can be used not only for modeling the behavior of a single agent, but also for decision-making among a society of agents. [23] One approach to social choice, going back to Condorcet's jury theorem, assumes that there is a "ground truth" - a true ranking of the alternatives. Each agent in society receives a noisy signal of this true ranking. The best way to approximate the ground truth is maximum likelihood estimation: construct a social ranking that maximizes the likelihood of the observed set of individual rankings.
Condorcet's original model assumes that the probabilities of agents' mistakes in pairwise comparisons are independent and identically distributed: all mistakes have the same probability p. This model has several drawbacks: it ignores the strength of the agents' preferences, and it does not rule out cyclic majority preferences.
RUM provides an alternative model: there is a ground-truth vector of utilities; each agent draws a utility for each alternative, based on a probability distribution whose mean value is the ground-truth. This model captures the strength of preferences, and rules out cyclic preferences. Moreover, for some common probability distributions (particularly, the Plackett-Luce model), the maximum likelihood estimators can be computed efficiently.[ citation needed ]
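The noisy-signal story can be sketched as follows: each agent's utilities are the ground-truth values plus independent Gaussian noise (a Thurstone-style special case of a RUM; the alternatives, means, and noise level here are illustrative):

```python
import random

def sample_ranking(ground_truth, noise=1.0):
    """One agent's observed ranking: each alternative's utility is its
    ground-truth mean plus Gaussian noise (a Thurstone-style RUM)."""
    u = {a: m + random.gauss(0.0, noise) for a, m in ground_truth.items()}
    return tuple(sorted(u, key=u.get, reverse=True))

random.seed(0)  # reproducible draws
truth = {'x': 2.0, 'y': 1.0, 'z': 0.0}
rankings = [sample_ranking(truth) for _ in range(1000)]
# with enough samples, the most frequent ranking tends to match the
# ground-truth order ('x', 'y', 'z')
```

Under this model, individual rankings disagree with each other and with the ground truth, yet the ground-truth order remains recoverable from the sample, which is what the maximum-likelihood approach exploits.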
Walker and Ben-Akiva [25] generalized the classic RUM in several ways, aiming to improve the accuracy of forecasts.
Blavatzkyy [26] studies stochastic utility theory based on choices between lotteries. The input is a set of choice probabilities, which indicate the likelihood that the agent chooses one lottery over the other.
Independence of irrelevant alternatives (IIA), also known as binary independence or the independence axiom, is an axiom of decision theory and economics describing a necessary condition for rational behavior. The axiom says that a choice between alternatives A and B should not depend on the quality of a third, unrelated outcome C.
In mathematical optimization and decision theory, a loss function or cost function is a function that maps an event or values of one or more variables onto a real number, intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite, in which case it is to be maximized.
The expected utility hypothesis is a foundational assumption in mathematical economics concerning decision making under uncertainty. It postulates that rational agents maximize utility, meaning the subjective desirability of their actions. Rational choice theory, a cornerstone of microeconomics, builds on this postulate to model aggregate social behaviour.
Econometric models are statistical models used in econometrics. An econometric model specifies the statistical relationship that is believed to hold between the various economic quantities pertaining to a particular economic phenomenon. An econometric model can be derived from a deterministic economic model by allowing for uncertainty, or from an economic model which itself is stochastic. However, it is also possible to use econometric models that are not tied to any specific economic theory.
In decision theory, the Ellsberg paradox is a paradox in which people's decisions are inconsistent with subjective expected utility theory. John Maynard Keynes published a version of the paradox in 1921. Daniel Ellsberg popularized the paradox in his 1961 paper, "Risk, Ambiguity, and the Savage Axioms". It is generally taken to be evidence of ambiguity aversion, in which a person tends to prefer choices with quantifiable risks over those with unknown, incalculable risks.
In statistics, the method of moments is a method of estimating population parameters: sample moments are equated with the corresponding theoretical moments, and the resulting equations are solved for the parameters. The same principle is used to derive higher moments such as skewness and kurtosis.
The Allais paradox is a choice problem designed by Maurice Allais to show an inconsistency between actual observed choices and the predictions of expected utility theory. The paradox demonstrates that individuals rarely make consistently rational decisions when required to decide immediately. The independence axiom of expected utility theory, which requires that an individual's preferences should not change when two lotteries are altered by equal proportions, is violated by the choices people typically make in the paradox.
Jacob Marschak was an American economist.
Pairwise comparison generally is any process of comparing entities in pairs to judge which entity of each pair is preferred, or has a greater amount of some quantitative property, or whether or not the two entities are identical. The method of pairwise comparison is used in the scientific study of preferences, attitudes, voting systems, social choice, public choice, requirements engineering and multiagent AI systems. In psychology literature, it is often referred to as paired comparison.
In economics, discrete choice models, or qualitative choice models, describe, explain, and predict choices between two or more discrete alternatives, such as entering or not entering the labor market, or choosing between modes of transport. Such choices contrast with standard consumption models in which the quantity of each good consumed is assumed to be a continuous variable. In the continuous case, calculus methods can be used to determine the optimum amount chosen, and demand can be modeled empirically using regression analysis. On the other hand, discrete choice analysis examines situations in which the potential outcomes are discrete, such that the optimum is not characterized by standard first-order conditions. Thus, instead of examining "how much" as in problems with continuous choice variables, discrete choice analysis examines "which one". However, discrete choice analysis can also be used to examine the chosen quantity when only a few distinct quantities must be chosen from, such as the number of vehicles a household chooses to own and the number of minutes of telecommunications service a customer decides to purchase. Techniques such as logistic regression and probit regression can be used for empirical analysis of discrete choice.
Quantal response equilibrium (QRE) is a solution concept in game theory. First introduced by Richard McKelvey and Thomas Palfrey, it provides an equilibrium notion with bounded rationality. QRE is not an equilibrium refinement, and it can give significantly different results from Nash equilibrium. QRE is only defined for games with discrete strategies, although there are continuous-strategy analogues.
Choice modelling attempts to model the decision process of an individual or segment via revealed preferences or stated preferences made in a particular context or contexts. Typically, it attempts to use discrete choices in order to infer the positions of the items on some relevant latent scale. Indeed, many alternative models exist in econometrics, marketing, sociometrics and other fields, including utility maximization, optimization applied to consumer theory, and a plethora of other identification strategies which may be more or less accurate depending on the data, sample, hypothesis and the particular decision being modelled. In addition, choice modelling is often regarded as the most suitable method for estimating consumers' willingness to pay for quality improvements in multiple dimensions.
Bayesian econometrics is a branch of econometrics which applies Bayesian principles to economic modelling. Bayesianism is based on a degree-of-belief interpretation of probability, as opposed to a relative-frequency interpretation.
In economics, and in other social sciences, preference refers to an order by which an agent, while in search of an "optimal choice", ranks alternatives based on their respective utility. Preferences are evaluations that concern matters of value, in relation to practical reasoning. Individual preferences are determined by taste, need, ..., as opposed to price, availability or personal income. Classical economics assumes that people act in their best (rational) interest. In this context, rationality would dictate that, when given a choice, an individual will select an option that maximizes their self-interest. But preferences are not always transitive, both because real humans are far from always being rational and because in some situations preferences can form cycles, in which case there exists no well-defined optimal choice. An example of this is Efron dice.
Jean-Claude Falmagne is a mathematical psychologist whose scientific contributions deal with problems in reaction time theory, psychophysics, philosophy of science, measurement theory, decision theory, and educational technology. Together with Jean-Paul Doignon, he developed knowledge space theory, which is the mathematical foundation for the ALEKS software for the assessment of knowledge in various academic subjects, including K-12 mathematics, chemistry, and accounting.
Jean-François Mertens was a Belgian game theorist and mathematical economist.
In statistics and econometrics, the maximum score estimator is a nonparametric estimator for discrete choice models developed by Charles Manski in 1975. Unlike the multinomial probit and multinomial logit estimators, it makes no assumptions about the distribution of the unobservable part of utility. However, its statistical properties are more complicated than the multinomial probit and logit models, making statistical inference difficult. To address these issues, Joel Horowitz proposed a variant, called the smoothed maximum score estimator.
Stochastic transitivity models are stochastic versions of the transitivity property of binary relations studied in mathematics. Several such models exist and have been used to describe the probabilities involved in experiments of paired comparisons, specifically in scenarios where transitivity is expected but empirical observations of the binary relation are probabilistic. For example, players' skills in a sport might be expected to be transitive, i.e. "if player A is better than B and B is better than C, then player A must be better than C"; however, in any given match, a weaker player might still win with positive probability. Tightly matched players might have a higher chance of exhibiting such an inversion, while players with large skill differences might see them only seldom. Stochastic transitivity models formalize such relations between the probabilities and the underlying transitive relation.
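For instance, in the Bradley–Terry model (one common parametric model of this kind), player i beats player j with probability w_i/(w_i + w_j) for positive skill parameters w; whenever w_A ≥ w_B ≥ w_C, these win probabilities satisfy strong stochastic transitivity, i.e. P(A beats C) ≥ max(P(A beats B), P(B beats C)). A small sketch with illustrative skill values:

```python
def beats(w, i, j):
    """Bradley-Terry win probability of i over j, given positive skills w."""
    return w[i] / (w[i] + w[j])

w = {'A': 4.0, 'B': 2.0, 'C': 1.0}
p_ab = beats(w, 'A', 'B')  # 4/6
p_bc = beats(w, 'B', 'C')  # 2/3
p_ac = beats(w, 'A', 'C')  # 4/5

# strong stochastic transitivity: P(A>C) >= max(P(A>B), P(B>C))
assert p_ac >= max(p_ab, p_bc)
```

Note that even the strongest player here loses with probability 1/5 against the weakest, which is exactly the kind of probabilistic inversion described above.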
Fractional social choice is a branch of social choice theory in which the collective decision is not a single alternative, but rather a weighted sum of two or more alternatives. For example, if society has to choose among three candidates A, B, or C, then in standard social choice exactly one of these candidates is chosen, while in fractional social choice it is possible to choose "2/3 of A and 1/3 of B".