Luce's choice axiom

In probability theory, Luce's choice axiom, formulated by R. Duncan Luce (1959), [1] states that the relative odds of selecting one item over another from a pool of many items are not affected by the presence or absence of other items in the pool. Selection of this kind is said to have "independence from irrelevant alternatives" (IIA). [2]
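As a minimal numerical sketch of this ratio invariance (the weights below are hypothetical; the weight-proportional rule is formalized later in the article as the matching law), the odds of choosing one item over another do not change when a third item is added to the pool:

```python
# Hypothetical preference weights; any positive values would do.
w = {"a": 2.0, "b": 1.0, "c": 3.0}

def p(x, pool):
    """Probability of choosing x from pool under a weight-proportional rule."""
    return w[x] / sum(w[y] for y in pool)

# Relative odds of a over b, with and without the "irrelevant" item c:
odds_small = p("a", {"a", "b"}) / p("b", {"a", "b"})
odds_large = p("a", {"a", "b", "c"}) / p("b", {"a", "b", "c"})
assert abs(odds_small - odds_large) < 1e-12  # both equal w["a"] / w["b"]
```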

Overview

Consider a set $X$ of possible outcomes, and consider a selection rule $P$, such that for any $a \in A$, with $A \subseteq X$ a finite set, the selector selects $a$ from $A$ with probability $P(a \mid A)$.

Luce proposed two choice axioms. The second is usually what is meant by "Luce's choice axiom", as the first is usually called "independence from irrelevant alternatives" (IIA). [3]

Luce's choice axiom 1 (IIA): if $P(a \mid \{a, b\}) = 0$, then for any $A \subseteq X$ containing $a$ and $b$, we still have $P(a \mid A) = 0$.

Luce's choice axiom 2 ("path independence"): $P(a \mid A) = P(a \mid B)\, P(B \mid A)$ for any $a \in B \subseteq A$, where $P(B \mid A) = \sum_{b \in B} P(b \mid A)$. [4]

Luce's choice axiom 1 is implied by choice axiom 2: taking $B = \{a, b\}$ in axiom 2 gives $P(a \mid A) = P(a \mid \{a, b\})\, P(\{a, b\} \mid A) = 0$ whenever $P(a \mid \{a, b\}) = 0$.

Matching law formulation

Define the matching law selection rule $P(a \mid A) = \frac{w(a)}{\sum_{a' \in A} w(a')}$, for some "value" function $w : X \to (0, \infty)$. With $w(a) = e^{u(a)}$ for a utility function $u$, this is sometimes called the softmax function, or the Boltzmann distribution.
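The softmax connection can be sketched numerically (the utilities below are hypothetical): taking $w(a) = e^{u(a)}$ makes the matching law coincide with the softmax of the utilities.

```python
import math

# Hypothetical utilities; w(a) = exp(u(a)) turns the matching law into softmax.
u = {"a": 1.0, "b": 0.0, "c": 2.0}
A = list(u)

def matching_law(x, pool):
    """Matching-law probability with value function w = exp(u)."""
    return math.exp(u[x]) / sum(math.exp(u[y]) for y in pool)

def softmax(x, pool):
    """Numerically stable softmax over the utilities of the pool."""
    m = max(u[y] for y in pool)
    return math.exp(u[x] - m) / sum(math.exp(u[y] - m) for y in pool)

for x in A:
    assert abs(matching_law(x, A) - softmax(x, A)) < 1e-12
```

Subtracting the maximum utility before exponentiating leaves every ratio unchanged, which is why the two functions agree.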

Theorem: Any matching law selection rule satisfies Luce's choice axiom. Conversely, if $P(a \mid A) > 0$ for all $a \in A \subseteq X$, then Luce's choice axiom implies that $P$ is a matching law selection rule.
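The forward direction of the theorem can be checked exhaustively on a small example (a sketch with hypothetical weights): a matching-law rule satisfies path independence on every subset of the outcome set.

```python
from itertools import combinations

# Hypothetical positive weights defining a matching-law rule.
w = {"a": 2.0, "b": 1.0, "c": 3.0, "d": 0.5}

def P(x, A):
    """Matching-law probability of selecting x from the finite set A."""
    return w[x] / sum(w[y] for y in A)

def P_subset(B, A):
    """Probability that the item selected from A lies in the subset B."""
    return sum(P(b, A) for b in B)

# Path independence: P(a|A) = P(a|B) * P(B|A) for every a in B, B subset of A.
X = sorted(w)
for r in range(1, len(X) + 1):
    for B in combinations(X, r):
        for a in B:
            assert abs(P(a, X) - P(a, B) * P_subset(B, X)) < 1e-12
```

The identity holds because both sides reduce to $w(a)$ divided by the total weight of $A$: the subset's weight cancels between the two factors.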

Applications

In economics, it can be used to model a consumer's tendency to choose one brand of product over another.

In behavioral psychology, it is used to model response behavior in the form of the matching law.

In cognitive science, it is used to model approximately rational decision processes.


References

  1. Luce, R. Duncan (2005). Individual Choice Behavior: A Theoretical Analysis. Mineola, N.Y.: Dover Publications. ISBN 0-486-44136-9. OCLC 874031603.
  2. Luce, R. Duncan (June 1977). "The choice axiom after twenty years". Journal of Mathematical Psychology. 15 (3): 215–233. doi:10.1016/0022-2496(77)90032-3.
  3. Luce, R. Duncan (June 1977). "The choice axiom after twenty years". Journal of Mathematical Psychology. 15 (3): 215–233. doi:10.1016/0022-2496(77)90032-3.
  4. Luce, R. Duncan (2008-12-02). "Luce's choice axiom". Scholarpedia. 3 (12): 8077. Bibcode:2008SchpJ...3.8077L. doi:10.4249/scholarpedia.8077. ISSN 1941-6016.