Border's theorem

In auction theory and mechanism design, Border's theorem gives a necessary and sufficient condition for interim allocation rules (or reduced form auctions) to be implementable via an auction.

It was first proven by Kim Border in 1991,[1] expanding on work by Steven Matthews,[2] and by Eric Maskin and John Riley.[3] A similar version with different hypotheses was proven by Border in 2007.[4]

Preliminaries

Auctions

Auctions are mechanisms designed to allocate an indivisible good among bidders with private valuations for the good – that is, the auctioneer has incomplete information on the bidders' true valuations, and each bidder knows only their own valuation.

Formally, this uncertainty is represented by a family of probability spaces $(T_i, \mathcal{T}_i, \lambda_i)$, one for each bidder $i = 1, \dots, n$, in which each $t_i \in T_i$ represents a possible type (valuation) for bidder $i$ to have, $\mathcal{T}_i$ denotes a σ-algebra on $T_i$, and $\lambda_i$ is a prior and common-knowledge probability distribution on $T_i$, which assigns the probability $\lambda_i(t_i)$ that bidder $i$ is of type $t_i$. Finally, we define $T = \prod_{i=1}^{n} T_i$ as the set of type profiles $t = (t_1, \dots, t_n)$, and $T_{-i} = \prod_{j \neq i} T_j$ as the set of profiles $t_{-i}$ of the bidders other than $i$.

Bidders simultaneously report their valuation of the good,[nb 1] and an auction assigns a probability that they will receive it. In this setting, an auction is thus a function $q: T \to [0,1]^n$ satisfying, for every type profile $t \in T$,

$$\sum_{i=1}^{n} q_i(t) \leq 1,$$

where $q_i(t)$ is the $i$-th component of $q(t)$. Intuitively, this only means that the probability that some bidder receives the good is no greater than 1.
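
As a concrete illustration (not taken from the cited sources), the following Python sketch implements one feasible auction: the good goes to the bidder reporting the highest type, with ties split uniformly, and the feasibility constraint above is checked. The names `allocation` and `profile` are illustrative only.

```python
# A minimal sketch of a feasible auction q: the highest reported type wins,
# with ties split uniformly among the tied bidders.

def allocation(types):
    """Return (q_1(t), ..., q_n(t)) for the highest-type-wins auction."""
    highest = max(types)
    winners = [i for i, t in enumerate(types) if t == highest]
    return [1.0 / len(winners) if i in winners else 0.0 for i in range(len(types))]

profile = [0.3, 0.7, 0.7]
q = allocation(profile)
assert sum(q) <= 1.0 + 1e-12   # feasibility: allocation probabilities sum to at most 1
print(q)                       # [0.0, 0.5, 0.5]
```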

Interim allocation rules (reduced form auctions)

From the point of view of each bidder $i$, every auction $q$ induces some expected probability $Q_i(t_i)$ that they will win the good given their type $t_i$, which we can compute as

$$Q_i(t_i) = \int_{T_{-i}} q_i(t_i, t_{-i}) \, d\lambda_{-i}(t_{-i} \mid t_i),$$

where $\lambda_{-i}(\cdot \mid t_i)$ is the conditional probability of the other bidders having type profile $t_{-i}$ given that bidder $i$ is of type $t_i$. We refer to such probabilities as interim allocation rules, as they give the probability of winning the auction in the interim period: after each bidder knows their own type, but before they know the types of the other bidders.

The function $Q$ defined by $Q(t) = (Q_1(t_1), \dots, Q_n(t_n))$ is often referred to as a reduced form auction. Working with reduced form auctions is often much more analytically tractable for revenue maximization.[3]
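
For finite type sets with independent priors, the interim allocation rule can be computed by direct enumeration over the other bidders' types. The Python sketch below is an illustration under those assumptions, not taken from the sources; `T`, `prior`, `q`, and `interim` are illustrative names.

```python
# Compute the interim allocation rule Q_i(t_i) of an auction q by averaging
# over the other bidders' types, assuming a common finite type set and an
# independent common prior.
from itertools import product

T = [0, 1, 2]                      # common finite type set
prior = {0: 0.5, 1: 0.3, 2: 0.2}   # common independent prior
n = 3                              # number of bidders

def q(types):                      # highest-type-wins auction, ties split evenly
    highest = max(types)
    winners = [i for i, t in enumerate(types) if t == highest]
    return [1.0 / len(winners) if i in winners else 0.0 for i in range(len(types))]

def interim(i, t_i):
    """Q_i(t_i): expected probability that bidder i with type t_i wins."""
    total = 0.0
    for others in product(T, repeat=n - 1):
        profile = list(others[:i]) + [t_i] + list(others[i:])
        weight = 1.0
        for t in others:
            weight *= prior[t]
        total += weight * q(profile)[i]
    return total

print([round(interim(0, t), 4) for t in T])
```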

Implementability

Taken on its own, an allocation rule $Q$ is called implementable if there exists an auction $q: T \to [0,1]^n$ such that

$$Q_i(t_i) = \int_{T_{-i}} q_i(t_i, t_{-i}) \, d\lambda_{-i}(t_{-i} \mid t_i)$$

for every bidder $i$ and type $t_i \in T_i$.

Statement

Border proved two main versions of the theorem, with different restrictions on the auction environment. [1] [4]

i.i.d. environment

The auction environment is i.i.d. if the probability spaces $(T_i, \mathcal{T}_i, \lambda_i)$ are the same for every bidder $i$ (denote the common space by $(\bar{T}, \bar{\mathcal{T}}, \bar{\lambda})$), and types are independently distributed. In this case, one only needs to consider symmetric auctions,[nb 2][3] and thus the interim allocation rule $Q_i$ also becomes the same function $Q$ for every $i$. Border's theorem in this setting thus states:[1]

Proposition: An interim allocation rule $Q$ is implementable by a symmetric auction if and only if for each measurable set of types $A \in \bar{\mathcal{T}}$, one has the inequality

$$n \int_A Q(t) \, d\bar{\lambda}(t) \leq 1 - \left(1 - \bar{\lambda}(A)\right)^n.$$

Intuitively, the left-hand side represents the probability that the winner of the auction is of some type $t \in A$, and the right-hand side represents the probability that there exists some bidder of type $t \in A$. The fact that the inequality is necessary for implementability is intuitive; that it is also sufficient means that it fully characterizes implementable reduced forms, and represents the strength of the theorem.
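
The condition can be checked numerically when the type set is small enough to enumerate every subset $A$. The following Python sketch (an illustration, not from the cited sources) verifies the inequality for the reduced form of the symmetric highest-type-wins auction, which is implementable by construction; the names used are made up for the example.

```python
# Verify Border's i.i.d. condition
#   n * sum_{t in A} Q(t) * lambda(t) <= 1 - (1 - lambda(A))**n
# for every subset A of a small finite type set, using the reduced form of the
# symmetric highest-type-wins auction.
from itertools import combinations, product

T = [0, 1, 2]
prior = {0: 0.5, 1: 0.3, 2: 0.2}
n = 3

def Q(t_i):
    """Interim winning probability under the highest-type-wins auction."""
    total = 0.0
    for others in product(T, repeat=n - 1):
        weight = 1.0
        for t in others:
            weight *= prior[t]
        highest = max([t_i, *others])
        winners = 1 + sum(1 for t in others if t == highest)
        total += weight * (1.0 / winners if t_i == highest else 0.0)
    return total

for size in range(len(T) + 1):
    for A in combinations(T, size):
        lhs = n * sum(Q(t) * prior[t] for t in A)
        rhs = 1 - (1 - sum(prior[t] for t in A)) ** n
        assert lhs <= rhs + 1e-12, (A, lhs, rhs)
print("Border's condition holds for every subset A.")
```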

Finite sets of types

If all the type sets $T_i$ are finite, the restriction to the i.i.d. case can be relaxed: bidders' types need no longer be identically distributed, though they are still assumed independent. In this more general environment, Border thus proved:[4][5]

Proposition: An interim allocation rule $Q$ is implementable by an auction if and only if for all sets of types $A_1 \subseteq T_1, \dots, A_n \subseteq T_n$, one has the inequality

$$\sum_{i=1}^{n} \sum_{t_i \in A_i} Q_i(t_i)\, \lambda_i(t_i) \leq 1 - \prod_{i=1}^{n} \lambda_i(T_i \setminus A_i).$$

The intuition of the i.i.d. case remains: the left-hand side represents the probability that the winner of the auction is some bidder $i$ with type $t_i \in A_i$, and the right-hand side represents the probability that there exists some bidder $i$ with type $t_i \in A_i$. Once again, the strength of the result comes from the condition being sufficient to characterize implementable interim allocation rules.
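
The same kind of numerical check works in this asymmetric finite-type setting, now ranging over all products $A_1 \times \dots \times A_n$ of subsets. The Python sketch below assumes independent priors and uses illustrative names; it is not taken from the cited sources.

```python
# Check the finite-type Border condition
#   sum_i sum_{t_i in A_i} Q_i(t_i) * lambda_i(t_i) <= 1 - prod_i lambda_i(T_i \ A_i)
# over all products of subsets, for a small asymmetric example with independent priors.
from itertools import combinations, product

T = [[0, 1], [0, 1, 2]]                               # per-bidder finite type sets
prior = [{0: 0.6, 1: 0.4}, {0: 0.2, 1: 0.5, 2: 0.3}]  # independent priors lambda_i
n = len(T)

def q(profile):                                        # highest type wins, ties split evenly
    highest = max(profile)
    winners = [i for i, t in enumerate(profile) if t == highest]
    return [1.0 / len(winners) if i in winners else 0.0 for i in range(n)]

def Q(i, t_i):                                         # interim allocation rule of bidder i
    total = 0.0
    others = [T[j] for j in range(n) if j != i]
    for rest in product(*others):
        profile = list(rest[:i]) + [t_i] + list(rest[i:])
        weight = 1.0
        for j, t in zip([j for j in range(n) if j != i], rest):
            weight *= prior[j][t]
        total += weight * q(profile)[i]
    return total

def subsets(xs):
    return [set(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

for A in product(*(subsets(T[i]) for i in range(n))):
    lhs = sum(Q(i, t) * prior[i][t] for i in range(n) for t in A[i])
    miss = 1.0                                         # prod_i lambda_i(T_i \ A_i)
    for i in range(n):
        miss *= sum(prior[i][t] for t in T[i] if t not in A[i])
    assert lhs <= 1 - miss + 1e-12, (A, lhs, 1 - miss)
print("The finite-type Border condition holds for every product of subsets.")
```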

Notes

  1. More generally, bidders could report any bid, not necessarily their true valuation. But we can, without loss of generality, concentrate on direct-revelation mechanisms, and let the payment functions take care of the auction's incentive compatibility constraints.[1]
  2. An auction $q$ is symmetric if, for any permutation $\sigma$ of $\{1, \dots, n\}$ and every bidder $i$, we have $q_{\sigma(i)}(t_1, \dots, t_n) = q_i(t_{\sigma(1)}, \dots, t_{\sigma(n)})$. Intuitively, this means that a bidder $i$'s identity does not matter; only their valuation $t_i$ does.

References

  1. Border, Kim C. (1991). "Implementation of Reduced Form Auctions: A Geometric Approach". Econometrica. 59 (4): 1175–1187. doi:10.2307/2938181. ISSN 0012-9682. JSTOR 2938181.
  2. Matthews, Steven (1984). "On the Implementability of Reduced Form Auctions". Econometrica. 52 (6): 1519–1522. doi:10.2307/1913517. hdl:10419/220920. JSTOR 1913517.
  3. Maskin, Eric; Riley, John (1984). "Optimal Auctions with Risk Averse Buyers". Econometrica. 52 (6): 1473–1518. doi:10.2307/1913516. JSTOR 1913516.
  4. Border, Kim (2007). "Reduced form auctions revisited". Economic Theory. 31 (1): 167–181. doi:10.1007/s00199-006-0080-z.
  5. Gopalan, Parikshit; Nisan, Noam; Roughgarden, Tim (2015). "Public projects, Boolean functions and the borders of Border's theorem". arXiv:1504.07687 [cs.GT].