Almost ideal demand system

The Almost Ideal Demand System (AIDS) is a consumer demand model used primarily by economists to study consumer behavior. [1] The AIDS model gives an arbitrary second-order approximation to any demand system and has many desirable properties of a demand system: for instance, it satisfies the axioms of order, aggregates over consumers without invoking parallel linear Engel curves, is consistent with budget constraints, and is simple to estimate.

Model

The AIDS model starts from a specification of the cost (expenditure) function c(u, p):

\log c(u,p) = \alpha_0 + \sum_k \alpha_k \log p_k + \tfrac{1}{2} \sum_k \sum_j \gamma^{*}_{kj} \log p_k \log p_j + u \beta_0 \prod_k p_k^{\beta_k}

where p stands for the prices of the L goods and u for the utility level. This specification satisfies homogeneity of degree 1 in prices and is a second-order approximation of any cost function.

From this cost function, demand equations are derived using Shephard's lemma; they are simpler to express in terms of budget shares:

w_i = \alpha_i + \sum_j \gamma_{ij} \log p_j + \beta_i \log\!\left(\frac{x}{P}\right)

where x is total expenditure, \gamma_{ij} = \tfrac{1}{2}(\gamma^{*}_{ij} + \gamma^{*}_{ji}), and P is the price index defined by

\log P = \alpha_0 + \sum_k \alpha_k \log p_k + \tfrac{1}{2} \sum_j \sum_k \gamma_{kj} \log p_k \log p_j .

Under the relevant constraints on the parameters (adding up: \sum_i \alpha_i = 1, \sum_i \gamma_{ij} = 0, \sum_i \beta_i = 0; homogeneity: \sum_j \gamma_{ij} = 0; symmetry: \gamma_{ij} = \gamma_{ji}), these budget share equations share the properties of a demand function: the shares sum to one, demands are homogeneous of degree zero in prices and total expenditure, and the Slutsky matrix is symmetric.
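A minimal sketch of how these budget share equations can be evaluated, written in Python with NumPy; the function name and the numerical parameter values below are purely illustrative assumptions, chosen only so that the adding-up, homogeneity, and symmetry restrictions hold:

```python
import numpy as np

def aids_budget_shares(prices, expenditure, alpha, beta, gamma, alpha0=0.0):
    """Evaluate AIDS budget shares w_i = alpha_i + sum_j gamma_ij * ln p_j + beta_i * ln(x / P),
    where ln P is the translog price index defined above."""
    log_p = np.log(prices)
    log_P = alpha0 + alpha @ log_p + 0.5 * (log_p @ gamma @ log_p)
    return alpha + gamma @ log_p + beta * (np.log(expenditure) - log_P)

# Hypothetical parameters for three goods, chosen to satisfy the restrictions:
# adding up (alpha sums to 1, beta sums to 0, columns of gamma sum to 0),
# homogeneity (rows of gamma sum to 0) and symmetry (gamma is symmetric).
alpha = np.array([0.40, 0.35, 0.25])
beta  = np.array([0.05, -0.02, -0.03])
gamma = np.array([[ 0.02, -0.01, -0.01],
                  [-0.01,  0.03, -0.02],
                  [-0.01, -0.02,  0.03]])

w = aids_budget_shares(prices=np.array([1.2, 0.9, 1.1]),
                       expenditure=100.0,
                       alpha=alpha, beta=beta, gamma=gamma)
print(w, w.sum())  # the shares sum to 1 because of the adding-up restrictions
```

Because the adding-up restrictions hold, the computed shares sum to one regardless of the prices and expenditure level supplied.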

Origin

First developed by Angus Deaton and John Muellbauer, the AIDS system is derived from the "Price Invariant Generalized Logarithmic" (PIGLOG) model [2] which allows researchers to treat aggregate consumer behavior as if it were the outcome of a single maximizing consumer.

Applications

Many studies have used the AIDS system to determine the optimal allocation of expenditure among broad commodity groups, i.e., at high levels of commodity aggregation.

In addition, the AIDS system has been used as a brand demand system to determine optimal consumption rates for each brand using product category spending and brand prices alone. [3] Assuming weak separability of consumer preferences, the optimal allocation of expenditure among the brands of a given product category can be determined independently of the allocation of expenditure within other product categories. [4]

Extensions of the Almost Ideal Demand System

An extension of the almost ideal demand system is the Quadratic Almost Ideal Demand System (QUAIDS), developed by James Banks, Richard Blundell, and Arthur Lewbel. [5] It allows for nonlinear Engel curves, which the standard almost ideal demand system cannot capture.
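A minimal sketch of the QUAIDS budget share equation of Banks, Blundell, and Lewbel, which adds a quadratic term in log real expenditure to the AIDS share equation; as above, the function name is an assumption, and the extra coefficients `lam` must sum to zero for adding up:

```python
import numpy as np

def quaids_budget_shares(prices, expenditure, alpha, beta, gamma, lam, alpha0=0.0):
    """QUAIDS shares: w_i = alpha_i + sum_j gamma_ij ln p_j + beta_i z + (lambda_i / b(p)) z^2,
    with z = ln(x / a(p)), ln a(p) the translog price index and b(p) = prod_k p_k**beta_k."""
    log_p = np.log(prices)
    log_a = alpha0 + alpha @ log_p + 0.5 * (log_p @ gamma @ log_p)
    b = np.exp(beta @ log_p)          # Cobb-Douglas price aggregator b(p)
    z = np.log(expenditure) - log_a   # log real expenditure
    return alpha + gamma @ log_p + beta * z + (lam / b) * z**2

# Hypothetical quadratic coefficients; they sum to zero so the shares still add up to 1.
lam = np.array([0.010, -0.005, -0.005])
```

Setting all lambda coefficients to zero recovers the standard AIDS shares, which is why QUAIDS nests AIDS as a special case.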

References

  1. Deaton, Angus; Muellbauer, John (1980). "An Almost Ideal Demand System". The American Economic Review. 70 (3): 312–326.
  2. "The Piglog Model". USDA Web site.
  3. Baltas, George (2002). "An Applied Analysis of Brand Demand Structure". Applied Economics. 34 (9): 1171–1175. doi:10.1080/00036840110085996. S2CID 154033919.
  4. Thomas, R. L. (1987). Applied Demand Analysis. Essex: Longman.
  5. Banks, James; Blundell, Richard; Lewbel, Arthur (1997). "Quadratic Engel Curves and Consumer Demand". Review of Economics and Statistics. 79 (4): 527–539.