In economic theory and econometrics, the term heterogeneity refers to differences across the units being studied. For example, a macroeconomic model in which consumers are assumed to differ from one another is said to have heterogeneous agents.
In econometrics, statistical inferences may be erroneous if, in addition to the observed variables under study, there exist other relevant variables that are unobserved but correlated with the observed dependent and independent variables.[1]
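To illustrate the problem (the notation below is generic and not drawn from the cited source), the standard omitted-variable bias result shows how an unobserved factor that is correlated with a regressor makes ordinary least squares inconsistent:

```latex
% Suppose the true model contains an unobserved variable c correlated with x:
%   y = \beta x + \gamma c + u, with Cov(x, c) \neq 0.
% Regressing y on x alone then yields an inconsistent estimate:
\operatorname{plim}\,\hat{\beta}_{\mathrm{OLS}}
  = \beta + \gamma\,\frac{\operatorname{Cov}(x, c)}{\operatorname{Var}(x)}.
```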
Methods for obtaining valid statistical inferences in the presence of unobserved heterogeneity include the instrumental variables method; multilevel models, including fixed effects and random effects models; and the Heckman correction for selection bias.
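As a minimal illustration of one of these methods, the sketch below implements a fixed effects (within) estimator on simulated panel data using only NumPy; the data-generating process and variable names are illustrative assumptions, not taken from any referenced study.

```python
import numpy as np

def within_estimator(y, X, unit_ids):
    """Fixed effects (within) estimator: demean y and X within each unit,
    then run pooled OLS on the demeaned data. This removes any additive,
    time-invariant unobserved heterogeneity."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    y_dm = np.empty_like(y)
    X_dm = np.empty_like(X)
    for uid in np.unique(unit_ids):
        mask = unit_ids == uid
        y_dm[mask] = y[mask] - y[mask].mean()
        X_dm[mask] = X[mask] - X[mask].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X_dm, y_dm, rcond=None)
    return beta

# Toy panel: 100 units, 5 periods, unit effects correlated with the regressor.
rng = np.random.default_rng(0)
n_units, n_periods = 100, 5
alpha = rng.normal(size=n_units)                   # unobserved heterogeneity
unit_ids = np.repeat(np.arange(n_units), n_periods)
x = alpha[unit_ids] + rng.normal(size=n_units * n_periods)
y = 2.0 * x + alpha[unit_ids] + rng.normal(size=n_units * n_periods)
print(within_estimator(y, x[:, None], unit_ids))   # close to the true 2.0
```

Because the within transformation sweeps out the unit-specific intercepts, the slope estimate recovers the true value even though the regressor is correlated with the omitted unit effects.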
Economic models are often formulated by means of a representative agent. Depending on the application, individual agents can be aggregated to or represented by a single agent. For example, individual demand can be aggregated to market demand if and only if individual preferences are of the Gorman polar form (or, equivalently, satisfy linear and parallel Engel curves). Under this condition, even heterogeneous preferences can be represented by a single aggregate agent, simply by summing individual demands to obtain market demand. However, some questions in economic theory cannot be accurately addressed without considering differences across agents, requiring a heterogeneous agent model.
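A brief sketch of the aggregation condition may help (the notation is generic): under the Gorman polar form, Engel curves are linear with a common slope across consumers, so market demand depends on the income distribution only through total income.

```latex
% Gorman polar form: consumer i's indirect utility is
%   v_i(p, m_i) = \frac{m_i - f_i(p)}{g(p)},
% where f_i(p) may differ across consumers but g(p) is common to all.
% Individual demands then have identical income slopes, so aggregate demand
% depends on individual incomes only through their sum:
X\bigl(p, m_1, \dots, m_n\bigr) = \sum_{i=1}^{n} x_i(p, m_i)
  = X\Bigl(p, \textstyle\sum_{i=1}^{n} m_i\Bigr).
```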
How to solve a heterogeneous agent model depends on the assumptions that are made about the expectations of the agents in the model. Broadly speaking, models with heterogeneous agents fall into the category of agent-based computational economics (ACE) if the agents have adaptive expectations, or into the category of dynamic stochastic general equilibrium (DSGE) if the agents have rational expectations. DSGE models with heterogeneous agents are especially difficult to solve, and have only recently become a widespread topic of research; most early DSGE research instead focused on representative agent models.
In economics, general equilibrium theory attempts to explain the behavior of supply, demand, and prices in a whole economy with several or many interacting markets, by seeking to prove that the interaction of demand and supply will result in an overall general equilibrium. General equilibrium theory contrasts with the theory of partial equilibrium, which analyzes a specific part of an economy while its other factors are held constant. In general equilibrium, the influences held constant are considered to be noneconomic and therefore beyond the natural scope of economic analysis. These noneconomic influences may nevertheless change when the economic variables change, so the prediction accuracy of an equilibrium model may depend on the independence of the economic factors from noneconomic ones.
In economics, "rational expectations" are model-consistent expectations, in that agents inside the model are assumed to "know the model" and on average take the model's predictions as valid. Rational expectations ensure internal consistency in models involving uncertainty. To obtain consistency within a model, the predictions of future values of economically relevant variables from the model are assumed to be the same as that of the decision-makers in the model, given their information set, the nature of the random processes involved, and model structure. The rational expectations assumption is used especially in many contemporary macroeconomic models.
This is intended to be a complete list of economics topics.
Economists use the term representative agent to refer to the typical decision-maker of a certain type.
A macroeconomic model is an analytical tool designed to describe the operation of the economy of a country or a region. These models are usually designed to examine the comparative statics and dynamics of aggregate quantities such as the total amount of goods and services produced, total income earned, the level of employment of productive resources, and the level of prices.
Computational economics is an interdisciplinary research discipline that involves computer science, economics, and management science. It encompasses the computational modeling of economic systems (whether agent-based, general-equilibrium, macroeconomic, or rational-expectations), computational econometrics and statistics, computational finance, computational tools for the design of automated internet markets, programming tools specifically designed for computational economics, and the teaching of computational economics. Some of these areas are unique, while others extend traditional areas of economics by solving problems that are tedious to study without computers and associated numerical methods.
In economics, an agent is an actor in an economic model who typically solves an optimization or choice problem.
Economics education, or economic education, is a field within economics that focuses on two main themes: the economics curriculum, materials, and pedagogical techniques used to teach economics at all educational levels; and research into the effectiveness of alternative instructional techniques in economics.
Dynamic stochastic general equilibrium modeling is a macroeconomic method which is often employed by monetary and fiscal authorities for policy analysis, for explaining historical time-series data, and for forecasting. DSGE econometric modeling applies general equilibrium theory and microeconomic principles in a tractable manner to postulate economic phenomena, such as economic growth and business cycles, as well as policy effects and market shocks.
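A canonical ingredient of many DSGE models, shown here only as a generic illustration rather than any particular model discussed above, is the household's consumption Euler equation obtained from intertemporal utility maximization:

```latex
1 = \mathbb{E}_t\!\left[\beta\left(\frac{C_{t+1}}{C_t}\right)^{-\sigma} R_{t+1}\right],
% where \beta is the discount factor, \sigma the coefficient of relative risk
% aversion, C_t consumption, and R_{t+1} the gross real return on savings.
```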
Microfoundations are an effort to understand macroeconomic phenomena in terms of economic agents' behaviors and their interactions. Research in microfoundations explores the link between macroeconomic and microeconomic principles in order to derive the aggregate relationships used in macroeconomic models.
The Heckman correction is a statistical technique to correct bias from non-randomly selected samples or otherwise incidentally truncated dependent variables, a pervasive issue in quantitative social sciences when using observational data. Conceptually, this is achieved by explicitly modelling the individual sampling probability of each observation together with the conditional expectation of the dependent variable. The resulting likelihood function is mathematically similar to the tobit model for censored dependent variables, a connection first drawn by James Heckman in 1974. Heckman also developed a two-step control function approach to estimate this model, which avoids the computational burden of having to estimate both equations jointly, albeit at the cost of inefficiency. Heckman received the Nobel Memorial Prize in Economic Sciences in 2000 for his work in this field.
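The sketch below walks through the two-step estimator on simulated data; the data-generating process, variable names, and the use of statsmodels' Probit and OLS are illustrative assumptions rather than a canonical implementation.

```python
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

# Simulated data: the outcome is observed only when the selection equation is
# positive, and the two error terms are correlated (the source of the bias).
z = rng.normal(size=n)                             # excluded from the outcome equation
x = rng.normal(size=n)
e_sel, e_out = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n).T
selected = (0.5 + 1.0 * z - 1.0 * x + e_sel) > 0   # selection equation
y = 1.0 + 2.0 * x + e_out                          # outcome equation (true slope = 2)

# Step 1: probit of the selection indicator on the full sample, then the
# inverse Mills ratio evaluated at the estimated linear index.
W = sm.add_constant(np.column_stack([x, z]))
probit_res = sm.Probit(selected.astype(float), W).fit(disp=0)
index = W @ probit_res.params
imr = norm.pdf(index) / norm.cdf(index)

# Step 2: OLS on the selected subsample with the inverse Mills ratio added as
# a regressor; plain OLS on the selected subsample would give a biased slope.
X2 = sm.add_constant(np.column_stack([x[selected], imr[selected]]))
ols_res = sm.OLS(y[selected], X2).fit()
print(ols_res.params)   # the coefficient on x should be close to 2
```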
Macroeconomic theory has its origins in the study of business cycles and monetary theory. In general, early theorists believed monetary factors could not affect real factors such as real output. John Maynard Keynes attacked some of these "classical" theories and produced a general theory that described the whole economy in terms of aggregates rather than individual, microeconomic parts. Attempting to explain unemployment and recessions, he noticed the tendency for people and businesses to hoard cash and avoid investment during a recession. He argued that this invalidated the assumptions of classical economists who thought that markets always clear, leaving no surplus of goods and no willing labor left idle.
The methodology of econometrics is the study of the range of differing approaches to undertaking econometric analysis.
There have been many criticisms of econometrics' usefulness as a discipline and perceived widespread methodological shortcomings in econometric modelling practices.
Control functions are statistical methods to correct for endogeneity problems by modelling the endogeneity in the error term. The approach thereby differs in important ways from other models that try to account for the same econometric problem. Instrumental variables, for example, model the endogenous variable X as an (often invertible) function of a relevant and exogenous instrument Z. Panel analysis uses special data properties to difference out unobserved heterogeneity that is assumed to be fixed over time.
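A minimal two-step control function sketch, using simulated data with one endogenous regressor and one instrument (the variable names and data-generating process are illustrative assumptions), looks as follows:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# Simulated endogeneity: x depends on the instrument z and on an unobservable
# u that also drives y, so regressing y on x alone is biased.
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 1.0 * z + u + rng.normal(size=n)
y = 2.0 * x + 3.0 * u + rng.normal(size=n)        # true coefficient on x is 2

def ols(y, X):
    """OLS with an added intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Step 1 (first stage): regress the endogenous x on the instrument z and keep
# the residual v, which captures the part of x correlated with the error term.
b_first = ols(x, z)
v = x - (b_first[0] + b_first[1] * z)

# Step 2: add the residual to the outcome regression as a control function,
# which absorbs the endogeneity in the error term.
print(ols(y, np.column_stack([x, v]))[1])   # close to the true value 2
print(ols(y, x)[1])                         # naive OLS estimate, biased upward
```

In this linear setting the control function estimate coincides with two-stage least squares; the approach generalizes more readily to nonlinear outcome models.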
In applied statistics, fractional models are, to some extent, related to binary response models. However, instead of estimating the probability of being in one bin of a dichotomous variable, the fractional model typically deals with variables that take on all possible values in the unit interval. One can easily generalize this model to take on values on any other interval by appropriate transformations. Examples range from participation rates in 401(k) plans to television ratings of NBA games.
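A common way to write such a model (in generic notation, not specific to the applications mentioned above) specifies the conditional mean of the fractional response directly through a logistic link, estimated by Bernoulli quasi-maximum likelihood:

```latex
\mathbb{E}\left[y \mid x\right] = G(x\beta) = \frac{\exp(x\beta)}{1 + \exp(x\beta)},
\qquad y \in [0, 1].
```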
Issues of heterogeneity in duration models can take on different forms. On the one hand, unobserved heterogeneity can play a crucial role when it comes to different sampling methods, such as stock or flow sampling. On the other hand, duration models have also been extended to allow for different subpopulations, with a strong link to mixture models. Many of these models impose the assumptions that the heterogeneity is independent of the observed covariates, that it has a distribution depending on only a finite number of parameters, and that it enters the hazard function multiplicatively.
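For concreteness (the notation is generic), these three assumptions correspond to the mixed proportional hazard specification:

```latex
% Unobserved heterogeneity v enters the hazard multiplicatively, is independent
% of the covariates x, and follows a distribution indexed by a finite
% parameter vector \theta (for example, a gamma distribution with unit mean):
\lambda(t \mid x, v) = v \, \lambda_0(t) \, \exp(x\beta),
\qquad v \sim F_{\theta}, \quad v \perp x.
```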
Stéphane Bonhomme is a French economist currently at the University of Chicago, where he is the Ann L. and Lawrence B. Buttenwieser Professor of Economics. Bonhomme specializes in microeconometrics. His research involves latent variable modeling, modeling of unobserved heterogeneity in panel data, and its applications in labor economics, in particular the analysis of earnings inequality and dynamics.
Yingyao Hu 胡颖尧 is an econometrician, a professor of economics, and currently the Chair of the Department of Economics, Johns Hopkins University.