Macroeconomic model

A macroeconomic model is an analytical tool designed to describe the operation of the economy of a country or a region. These models are usually designed to examine the comparative statics and dynamics of aggregate quantities such as the total amount of goods and services produced, total income earned, the level of employment of productive resources, and the level of prices.

Macroeconomic models may be logical, mathematical, and/or computational; the different types of macroeconomic models serve different purposes and have different advantages and disadvantages. [1] Macroeconomic models may be used to clarify and illustrate basic theoretical principles; they may be used to test, compare, and quantify different macroeconomic theories; they may be used to produce "what if" scenarios (usually to predict the effects of changes in monetary, fiscal, or other macroeconomic policies); and they may be used to generate economic forecasts. Thus, macroeconomic models are widely used in academia in teaching and research, and are also widely used by international organizations, national governments and larger corporations, as well as by economic consultants and think tanks.

Types

Simple theoretical models

Simple textbook descriptions of the macroeconomy involving a small number of equations or diagrams are often called ‘models’. Examples include the IS-LM model and Mundell–Fleming model of Keynesian macroeconomics, and the Solow model of neoclassical growth theory. These models share several features. They are based on a few equations involving a few variables, which can often be explained with simple diagrams. [2] Many of these models are static, but some are dynamic, describing the economy over many time periods. The variables that appear in these models often represent macroeconomic aggregates (such as GDP or total employment) rather than individual choice variables, and while the equations relating these variables are intended to describe economic decisions, they are not usually derived directly by aggregating models of individual choices. They are simple enough to serve as illustrations of theoretical points in introductory explanations of macroeconomic ideas; but this same simplicity means that quantitative application to forecasting, testing, or policy evaluation is usually impossible without substantially augmenting the structure of the model.
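
For instance, the Solow model reduces to a single difference equation for capital per worker. The sketch below (parameter values are arbitrary illustrations, not calibrated values) iterates that equation and compares the result with the analytical steady state:

```python
# Solow model: k_{t+1} = s*k_t**alpha + (1 - delta)*k_t, with Cobb-Douglas
# output y = k**alpha. All parameter values are illustrative assumptions.
def solow_path(k0, s=0.3, alpha=0.3, delta=0.1, periods=200):
    path = [k0]
    for _ in range(periods):
        k = path[-1]
        path.append(s * k ** alpha + (1 - delta) * k)
    return path

path = solow_path(k0=1.0)
# Analytical steady state: k* = (s/delta)^(1/(1-alpha)).
k_star = (0.3 / 0.1) ** (1 / (1 - 0.3))
print(round(path[-1], 3), round(k_star, 3))  # the simulated path converges to k*
```

Starting below the steady state, capital grows monotonically toward it, illustrating the model's convergence property.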

Empirical forecasting models

In the 1940s and 1950s, as governments began accumulating national income and product accounting data, economists set out to construct quantitative models to describe the dynamics observed in the data. [3] These models estimated the relations between different macroeconomic variables using (mostly linear) time series analysis. Like the simpler theoretical models, these empirical models described relations between aggregate quantities, but many addressed a much finer level of detail (for example, studying the relations between output, employment, investment, and other variables in many different industries). Thus, these models grew to include hundreds or thousands of equations describing the evolution of hundreds or thousands of prices and quantities over time, making computers essential for their solution. While the choice of which variables to include in each equation was partly guided by economic theory (for example, including past income as a determinant of consumption, as suggested by the theory of adaptive expectations), variable inclusion was mostly determined on purely empirical grounds. [4]
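
As a stylized illustration of this approach (using simulated data and made-up coefficients, not the historical series these models used), one equation of such a model might regress consumption on lagged income and recover the relation by ordinary least squares:

```python
import random

random.seed(0)

# Simulate an income series and a consumption rule that depends on lagged
# income, then estimate the relation by OLS, as a toy version of one
# equation in an empirical forecasting model. All numbers are assumptions.
T = 500
income = [100.0]
for _ in range(T - 1):
    income.append(10 + 0.9 * income[-1] + random.gauss(0, 2))
consumption = [0.8 * income[t - 1] + random.gauss(0, 1) for t in range(1, T)]

# Estimate consumption_t = a + b * income_{t-1} by ordinary least squares.
x, y = income[:-1], consumption
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
print(round(a, 2), round(b, 3))  # b recovers the assumed propensity, 0.8
```

A real large-scale model stacks hundreds or thousands of equations like this one, estimated jointly on national accounts data.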

Dutch economist Jan Tinbergen developed the first comprehensive national model, which he built for the Netherlands in 1936. He later applied the same modeling structure to the economies of the United States and the United Kingdom. [3] The first global macroeconomic model, Wharton Econometric Forecasting Associates' LINK project, was initiated by Lawrence Klein. The model was cited in 1980 when Klein, like Tinbergen before him, won the Nobel Prize. Large-scale empirical models of this type, including the Wharton model, are still in use today, especially for forecasting purposes. [5] [6] [7]

The Lucas critique of empirical forecasting models

Econometric studies in the mid-20th century showed a negative correlation between inflation and unemployment called the Phillips curve. [8] Empirical macroeconomic forecasting models, being based on roughly the same data, had similar implications: they suggested that unemployment could be permanently lowered by permanently increasing inflation. However, in 1968, Milton Friedman [9] and Edmund Phelps [10] argued that this apparent tradeoff was illusory. They claimed that the historical relation between inflation and unemployment was due to the fact that past inflationary episodes had been largely unexpected. They argued that if monetary authorities permanently raised the inflation rate, workers and firms would eventually come to understand this, at which point the economy would return to its previous, higher level of unemployment, but now with higher inflation too. The stagflation of the 1970s appeared to bear out their prediction. [11]
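
The Friedman–Phelps argument can be made concrete with a small simulation (a sketch with illustrative parameter values, not a calibrated model): unemployment falls below its natural rate only while actual inflation exceeds expected inflation, and expectations adapt over time.

```python
# Expectations-augmented Phillips curve with adaptive expectations.
# The parameter values (u_nat, beta, lam) are illustrative assumptions.
def simulate(pi_policy, periods=100, u_nat=5.0, beta=0.5, lam=0.2):
    """Unemployment path after inflation is permanently raised to pi_policy."""
    pi_e = 0.0          # expected inflation, initially zero
    unemployment = []
    for _ in range(periods):
        unemployment.append(u_nat - beta * (pi_policy - pi_e))
        pi_e += lam * (pi_policy - pi_e)   # expectations catch up
    return unemployment

u = simulate(pi_policy=4.0)
print(round(u[0], 2), round(u[-1], 2))  # dips to 3.0 at first, back to 5.0 eventually
```

Unemployment initially drops because the inflation is unexpected, then returns to the natural rate as expectations adjust, leaving only higher inflation.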

In 1976, Robert Lucas, Jr., published an influential paper arguing that the failure of the Phillips curve in the 1970s was just one example of a general problem with empirical forecasting models. [12] [13] He pointed out that such models are derived from observed relationships between various macroeconomic quantities over time, and that these relations differ depending on what macroeconomic policy regime is in place. In the context of the Phillips curve, this means that the relation between inflation and unemployment observed in an economy where inflation has usually been low in the past would differ from the relation observed in an economy where inflation has been high. [14] Furthermore, this means one cannot predict the effects of a new policy regime using an empirical forecasting model based on data from previous periods when that policy regime was not in place. Lucas argued that economists would remain unable to predict the effects of new policies unless they built models based on economic fundamentals (like preferences, technology, and budget constraints) that should be unaffected by policy changes.
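
Lucas's point can be illustrated with simulated data (all parameter values and functional forms here are assumptions for illustration): the same structural economy, in which unemployment depends only on inflation surprises, produces a very different reduced-form inflation–unemployment slope depending on the policy regime that generated the data.

```python
import random

random.seed(1)

# Structural economy: u_t = u_nat - beta*(pi_t - pi_e_t), with adaptive
# expectations pi_e. An econometrician regressing u on pi estimates a
# different "tradeoff" under different policy regimes (the Lucas critique).

def economy(pi_series, u_nat=5.0, beta=0.5, lam=0.2):
    """Return the unemployment series implied by a path of inflation."""
    pi_e, unemployment = 0.0, []
    for pi in pi_series:
        unemployment.append(u_nat - beta * (pi - pi_e))
        pi_e += lam * (pi - pi_e)  # expectations adapt to observed inflation
    return unemployment

def slope(xs, ys):
    """OLS slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

T = 4000
# Regime A: erratic inflation, largely unanticipated period by period.
erratic = [random.gauss(2, 1) for _ in range(T)]
# Regime B: highly persistent inflation, which adaptive expectations track.
persistent = [2.0]
for _ in range(T - 1):
    persistent.append(2 + 0.95 * (persistent[-1] - 2) + random.gauss(0, 0.3))

slope_a = slope(erratic, economy(erratic))
slope_b = slope(persistent, economy(persistent))
print(round(slope_a, 2), round(slope_b, 2))  # the estimated slope differs by regime
```

Under the erratic regime the estimated slope is close to the structural coefficient; under the persistent regime expectations absorb most inflation movements and the apparent tradeoff largely disappears, so neither estimate would predict behavior under the other regime.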

Dynamic stochastic general equilibrium models

Partly as a response to the Lucas critique, economists of the 1980s and 1990s began to construct microfounded [15] macroeconomic models based on rational choice, which have come to be called dynamic stochastic general equilibrium (DSGE) models. These models begin by specifying the set of agents active in the economy, such as households, firms, and governments in one or more countries, as well as the preferences, technology, and budget constraint of each one. Each agent is assumed to make an optimal choice, taking into account prices and the strategies of other agents, both in the current period and in the future. Summing up the decisions of the different types of agents, it is possible to find the prices that equate supply with demand in every market. Thus these models embody a type of equilibrium self-consistency: agents choose optimally given the prices, while prices must be consistent with agents’ supplies and demands.

DSGE models often assume that all agents of a given type are identical (i.e. there is a ‘representative household’ and a ‘representative firm’) and can perform perfect calculations that forecast the future correctly on average (which is called rational expectations). However, these are only simplifying assumptions, and are not essential for the DSGE methodology; many DSGE studies aim for greater realism by considering heterogeneous agents [16] or various types of adaptive expectations. [17] Compared with empirical forecasting models, DSGE models typically have fewer variables and equations, mainly because DSGE models are harder to solve, even with the help of computers. [18] Simple theoretical DSGE models, involving only a few variables, have been used to analyze the forces that drive business cycles; this line of work has given rise to two main competing frameworks called the real business cycle model [19] [20] [21] and the New Keynesian DSGE model. [22] [23] More elaborate DSGE models are used to predict the effects of changes in economic policy and evaluate their impact on social welfare. However, economic forecasting is still largely based on more traditional empirical models, which are still widely believed to achieve greater accuracy in predicting the impact of economic disturbances over time.
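
To give a flavor of how such models are solved (a deterministic toy of the planner problem at the core of many DSGE models, not a full DSGE model; the parameters and grid are assumptions), the optimal-growth problem below, with log utility and full capital depreciation, has the known closed-form policy k' = αβk^α, which value-function iteration on a grid recovers numerically:

```python
import math

# Toy optimal-growth (planner) problem: max sum_t beta**t * log(c_t)
# subject to k_{t+1} = k_t**alpha - c_t (full depreciation).
# Solved by value-function iteration over a discrete capital grid.
alpha, beta = 0.3, 0.95
grid = [0.01 + i * (0.5 - 0.01) / 199 for i in range(200)]
V = [0.0] * len(grid)

for _ in range(250):  # Bellman iteration until numerical convergence
    V = [
        max(
            math.log(k ** alpha - kp) + beta * V[j]
            for j, kp in enumerate(grid) if kp < k ** alpha
        )
        for k in grid
    ]

# Compare the computed policy at one grid point with the closed form
# k' = alpha * beta * k**alpha.
k = grid[100]
feasible = [j for j, kp in enumerate(grid) if kp < k ** alpha]
j_best = max(feasible, key=lambda j: math.log(k ** alpha - grid[j]) + beta * V[j])
print(round(grid[j_best], 3), round(alpha * beta * k ** alpha, 3))
```

Realistic DSGE models add shocks, many agents, and many state variables, which is why they are far harder to solve than this two-dozen-line sketch.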

DSGE versus CGE models

A closely related methodology that pre-dates DSGE modeling is computable general equilibrium (CGE) modeling. Like DSGE models, CGE models are often microfounded on assumptions about preferences, technology, and budget constraints. However, CGE models focus mostly on long-run relationships, making them most suited to studying the long-run impact of permanent policies like the tax system or the openness of the economy to international trade. [24] [25] DSGE models instead emphasize the dynamics of the economy over time (often at a quarterly frequency), making them suited for studying business cycles and the cyclical effects of monetary and fiscal policy.
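
As a minimal illustration of the CGE idea (a toy two-good exchange economy with assumed Cobb-Douglas preferences and endowments, far simpler than an applied CGE model), the market-clearing relative price can be computed numerically:

```python
# Two consumers with Cobb-Douglas preferences in a pure exchange economy.
# Consumer i spends share a_i of wealth on good 1; good 2 is the numeraire,
# so wealth_i = p*e1_i + e2_i. All numbers are illustrative assumptions.
consumers = [
    {"a": 0.7, "e1": 1.0, "e2": 0.0},
    {"a": 0.2, "e1": 0.0, "e2": 1.0},
]

def excess_demand_good1(p):
    demand = sum(c["a"] * (p * c["e1"] + c["e2"]) / p for c in consumers)
    supply = sum(c["e1"] for c in consumers)
    return demand - supply

# Excess demand is decreasing in p here, so bisection finds the clearing price.
lo, hi = 1e-6, 100.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if excess_demand_good1(mid) > 0:
        lo = mid
    else:
        hi = mid
p_star = 0.5 * (lo + hi)
print(round(p_star, 4))  # analytical answer for these numbers is 2/3
```

Applied CGE models solve the same kind of market-clearing problem with many goods, sectors, and policy instruments (taxes, tariffs) layered on top.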

Agent-based computational macroeconomic models

Another modeling methodology that has developed alongside DSGE models is agent-based computational economics (ACE), a variety of agent-based modeling. [26] Like the DSGE methodology, ACE seeks to break down aggregate macroeconomic relationships into the microeconomic decisions of individual agents. ACE models also begin by defining the set of agents that make up the economy and specifying the types of interactions individual agents can have with each other or with the market as a whole. Instead of defining the preferences of those agents, ACE models often jump directly to specifying their strategies. Alternatively, preferences may be specified together with an initial strategy and a learning rule whereby the strategy is adjusted according to its past success. [27] Given these strategies, the interaction of large numbers of individual agents (who may be very heterogeneous) can be simulated on a computer, and the aggregate, macroeconomic relationships that arise from those individual actions can then be studied.
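
A minimal sketch of the ACE approach (a toy market with assumed rules and parameters, loosely in the spirit of the strategy-switching models of ref. [27]): agents forecast the price with one of two simple strategies, and a learning rule gradually shifts agents toward whichever strategy has recently been more accurate.

```python
import random

random.seed(2)

N, T = 500, 300            # number of agents, number of periods (assumptions)
fundamental = 10.0
history = [fundamental, fundamental]   # past prices

def naive(h):    # strategy 0: expect the last observed price
    return h[-1]

def trend(h):    # strategy 1: extrapolate the last price change
    return h[-1] + (h[-1] - h[-2])

rules = [naive, trend]
agents = [random.randrange(2) for _ in range(N)]   # each agent's current rule
errors = [0.0, 0.0]        # decaying squared forecast error of each rule

for t in range(T):
    avg_forecast = sum(rules[a](history) for a in agents) / N
    # Price adjusts partway toward the average forecast, plus a small shock.
    price = 0.7 * fundamental + 0.3 * avg_forecast + random.gauss(0, 0.1)
    for i in range(2):
        errors[i] = 0.9 * errors[i] + (rules[i](history) - price) ** 2
    # Learning rule: each agent occasionally adopts the currently better rule.
    for i in range(N):
        if random.random() < 0.1:
            agents[i] = 0 if errors[0] < errors[1] else 1
    history.append(price)

share_naive = agents.count(0) / N
print(round(history[-1], 2), round(share_naive, 2))
```

The aggregate price path and the evolving population shares of each strategy are emergent outcomes of the simulation rather than solutions of an equilibrium system, which is the characteristic difference from the DSGE approach.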

Strengths and weaknesses of DSGE and ACE models

DSGE and ACE models have different advantages and disadvantages due to their different underlying structures. DSGE models may exaggerate individual rationality and foresight, and understate the importance of heterogeneity, since the rational expectations, representative agent case remains the simplest and thus the most common type of DSGE model to solve. Also, unlike ACE models, it may be difficult to study local interactions between individual agents in DSGE models, which instead focus mostly on the way agents interact through aggregate prices. On the other hand, ACE models may exaggerate errors in individual decision-making, since the strategies assumed in ACE models may be very far from optimal choices unless the modeler is very careful. A related issue is that ACE models which start from strategies instead of preferences may remain vulnerable to the Lucas critique: a changed policy regime should generally give rise to changed strategies.

References

  1. Blanchard, Olivier (2017). "The need for different classes of macroeconomic models". Blog post, Jan. 12, 2017, Peterson Institute for International Economics.
  2. Blanchard, Olivier (2000). Macroeconomics, 2nd ed., Ch. 3.3, p. 47. Prentice Hall. ISBN 0-13-013306-X.
  3. Klein, Lawrence (2004). "The contribution of Jan Tinbergen to economic science". De Economist. 152 (2): 155–157. doi:10.1023/B:ECOT.0000023251.14849.4f.
  4. Koopmans, Tjalling C. (1947). "Measurement Without Theory". Review of Economics and Statistics. 29 (3): 161–172. doi:10.2307/1928627. JSTOR 1928627.
  5. Klein, Lawrence R., ed. (1991). Comparative Performance of US Econometric Models. Oxford University Press. ISBN 0-19-505772-4.
  6. Eckstein, Otto (1983). The DRI Model of the US Economy. McGraw-Hill. ISBN 0-07-018972-2.
  7. Bodkin, Ronald; Klein, Lawrence; Marwah, Kanta (1991). A History of Macroeconometric Model Building. Edward Elgar.
  8. Phillips, A. W. (1958). "The relationship between unemployment and the rate of change of money wages in the United Kingdom 1861–1957". Economica. 25 (100): 283–299. doi:10.2307/2550759. JSTOR 2550759.
  9. Friedman, Milton (1968). "The role of monetary policy". American Economic Review. 58 (1): 1–17. JSTOR 1831652.
  10. Phelps, Edmund S. (1968). "Money wage dynamics and labor market equilibrium". Journal of Political Economy. 76 (4): 678–711. doi:10.1086/259438.
  11. Blanchard, Olivier (2000), op. cit., Ch. 28, p. 540.
  12. Lucas, Robert E., Jr. (1976). "Econometric Policy Evaluation: A Critique". Carnegie-Rochester Conference Series on Public Policy. 1: 19–46. doi:10.1016/S0167-2231(76)80003-6.
  13. Hoover, Kevin D. (1988). "Econometrics and the Analysis of Policy". The New Classical Macroeconomics. Oxford: Basil Blackwell. pp. 167–209. ISBN 0-631-14605-9.
  14. Blanchard, Olivier (2000), op. cit., Ch. 28, p. 542.
  15. Phelps, Edmund S., ed. (1970). Microeconomic Foundations of Employment and Inflation Theory. New York: Norton. ISBN 0-393-09326-3.
  16. Krusell, Per; Smith, Anthony A., Jr. (1998). "Income and wealth heterogeneity in the macroeconomy". Journal of Political Economy. 106 (5): 243–277. doi:10.1086/250034.
  17. Evans, George W.; Honkapohja, Seppo (2001). Learning and Expectations in Macroeconomics. Princeton University Press. ISBN 0-691-04921-1.
  18. DeJong, D. N.; Dave, C. (2007). Structural Macroeconometrics. Princeton University Press. ISBN 0-691-12648-8.
  19. Kydland, Finn E.; Prescott, Edward C. (1982). "Time to Build and Aggregate Fluctuations". Econometrica. 50 (6): 1345–1370. doi:10.2307/1913386. JSTOR 1913386.
  20. Cooley, Thomas F. (1995). Frontiers of Business Cycle Research. Princeton University Press.
  21. Abel, Andrew; Bernanke, Ben (1995). Macroeconomics, 2nd ed., Ch. 11.1, pp. 355–362. Addison-Wesley. ISBN 0-201-54392-3.
  22. Rotemberg, Julio J.; Woodford, Michael (1997). "An optimization-based econometric framework for the evaluation of monetary policy". NBER Macroeconomics Annual. 12: 297–346. doi:10.1086/654340. JSTOR 3585236.
  23. Woodford, Michael (2003). Interest and Prices: Foundations of a Theory of Monetary Policy. Princeton University Press. ISBN 0-691-01049-8.
  24. Shoven, John B.; Whalley, John (1972). "A general equilibrium calculation of the effects of differential taxation of income from capital in the US". Journal of Public Economics. 1 (3–4): 281–321. doi:10.1016/0047-2727(72)90009-6.
  25. Kehoe, Patrick J.; Kehoe, Timothy J. (1994). "A primer on static applied general equilibrium models". Federal Reserve Bank of Minneapolis Quarterly Review. 18 (1): 2–16.
  26. Tesfatsion, Leigh (2003). "Agent-Based Computational Economics". Iowa State University Economics Working Paper #1.
  27. Brock, William; Hommes, Cars (1997). "A rational route to randomness". Econometrica. 65 (5): 1059–1095. doi:10.2307/2171879. JSTOR 2171879.