Dynamic stochastic general equilibrium

Dynamic stochastic general equilibrium modeling (abbreviated as DSGE, or DGE, or sometimes SDGE) is a method in macroeconomics that attempts to explain economic phenomena, such as economic growth and business cycles, and the effects of economic policy, through econometric models based on applied general equilibrium theory and microeconomic principles.

As a practical matter, people often use the term "DSGE models" to refer to a particular class of econometric, quantitative models of business cycles or economic growth called real business cycle (RBC) models. [1] The models proposed by Kydland & Prescott [2] and Long & Plosser [3] are considered classic quantitative DSGE models. Charles Plosser has stated that DSGE models are an "update" of RBC models. [4]

As their name indicates, DSGE models are dynamic (studying how the economy evolves over time), stochastic (taking into account the fact that the economy is affected by random shocks), general (referring to the entire economy), and of equilibrium (subscribing to the Walrasian, general equilibrium theory). [5]

RBC modeling

Early real business-cycle models postulated an economy populated by a representative consumer who operates in perfectly competitive markets. The only sources of uncertainty in these models are "shocks" in technology. [1] RBC theory builds on the neoclassical growth model, under the assumption of flexible prices, to study how real shocks to the economy might cause business cycle fluctuations. [6]

The "representative consumer" assumption can either be taken literally or reflect a Gorman aggregation of heterogeneous consumers facing idiosyncratic income shocks and complete markets in all assets. [note 1] These models took the position that fluctuations in aggregate economic activity are actually an "efficient response" of the economy to exogenous shocks.

The models were criticized on a number of issues. A prominent one is the equity premium puzzle: the inability of this class of models to explain the average premium of a well-diversified U.S. equity portfolio over U.S. Treasury bills observed for more than 100 years. Rajnish Mehra and Edward C. Prescott, who coined the term in their 1985 study The Equity Premium: A Puzzle, found that a standard general equilibrium model, calibrated to display key U.S. business cycle fluctuations, generated an equity premium of less than 1% for reasonable risk-aversion levels, in sharp contrast with the average equity premium of 6% observed during the historical period.

The Lucas critique

In a 1976 paper, [note 3] Robert Lucas argued that it is naive to try to predict the effects of a change in economic policy entirely on the basis of relationships observed in historical data, especially highly aggregated historical data. Lucas claimed that the decision rules of Keynesian models, such as the fiscal multiplier, cannot be considered as structural, in the sense that they cannot be invariant with respect to changes in government policy variables, stating:

Given that the structure of an econometric model consists of optimal decision-rules of economic agents, and that optimal decision-rules vary systematically with changes in the structure of series relevant to the decision maker, it follows that any change in policy will systematically alter the structure of econometric models. [9]

This meant that, because the parameters of the models were not structural, i.e. not indifferent to policy, they would necessarily change whenever policy was changed. The so-called Lucas critique followed similar criticism undertaken earlier by Ragnar Frisch, in his critique of Jan Tinbergen’s 1939 book Statistical Testing of Business-Cycle Theories, where Frisch accused Tinbergen of not having discovered autonomous relations, but "coflux" relations, [10] and by Jacob Marschak, in his 1953 contribution to the Cowles Commission Monograph, where he submitted that

In predicting the effect of its decisions (policies), the government...has to take account of exogenous variables, whether controlled by it (the decisions themselves, if they are exogenous variables) or uncontrolled (e.g. weather), and of structural changes, whether controlled by it (the decisions themselves, if they change the structure) or uncontrolled (e.g. sudden changes in people’s attitude). [10]

The Lucas critique is representative of the paradigm shift that occurred in macroeconomic theory in the 1970s towards attempts at establishing micro-foundations.

Response to the Lucas critique

In the 1980s, macro models emerged that attempted to directly respond to Lucas through the use of rational expectations econometrics. [11]

In 1982, Finn E. Kydland and Edward C. Prescott created a real business cycle (RBC) model to "predict the consequence of a particular policy rule upon the operating characteristics of the economy." [2] The stated, exogenous, stochastic components in their model are "shocks to technology" and "imperfect indicators of productivity." The shocks involve random fluctuations in the productivity level, which shift up or down the trend of economic growth. Examples of such shocks include innovations, the weather, sudden and significant price increases in imported energy sources, stricter environmental regulations, etc. The shocks directly change the effectiveness of capital and labour, which, in turn, affects the decisions of workers and firms, who then alter what they buy and produce. This eventually affects output. [2]
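The technology-shock process described here is usually written as a first-order autoregression in the log of productivity. A minimal sketch in Python (the persistence and volatility values below are illustrative, not Kydland and Prescott's calibration):

```python
import numpy as np

def simulate_technology(rho=0.95, sigma=0.007, periods=200, seed=0):
    """Simulate log-productivity ln z_t = rho * ln z_{t-1} + eps_t,
    with eps_t ~ N(0, sigma^2), starting from the trend level ln z = 0."""
    rng = np.random.default_rng(seed)
    log_z = np.zeros(periods)
    for t in range(1, periods):
        log_z[t] = rho * log_z[t - 1] + rng.normal(0.0, sigma)
    return np.exp(log_z)  # productivity in levels, z_t

z = simulate_technology()
print(z.min(), z.max())  # productivity wanders around its trend level of 1
```

With persistence close to one, a single shock shifts productivity for many periods, which is what allows a purely real model of this kind to generate drawn-out fluctuations.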

The authors stated that, since fluctuations in employment are central to the business cycle, the "stand-in consumer [of the model] values not only consumption but also leisure," meaning that unemployment movements essentially reflect the changes in the number of people who want to work. "Household-production theory," as well as "cross-sectional evidence" ostensibly support a "non-time-separable utility function that admits greater inter-temporal substitution of leisure, something which is needed," according to the authors, "to explain aggregate movements in employment in an equilibrium model." [2] In the Kydland & Prescott model, monetary policy is irrelevant to economic fluctuations.

The associated policy implications were clear: There is no need for any form of government intervention since, ostensibly, government policies aimed at stabilizing the business cycle are welfare-reducing. [11] Since microfoundations are based on the preferences of decision-makers in the model, DSGE models feature a natural benchmark for evaluating the welfare effects of policy changes. [12] [13] The Kydland/Prescott 1982 paper is often considered the starting point of RBC theory and of DSGE modeling in general [6] and its authors were awarded the 2004 Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel. [14]

DSGE modeling


By applying dynamic principles, dynamic stochastic general equilibrium models contrast with the static models studied in applied general equilibrium modeling and in some computable general equilibrium models.

DSGE models share a structure built around three interrelated "blocks": a demand block, a supply block, and a monetary policy equation. Formally, the equations that define these blocks are built on microfoundations and make explicit assumptions about the behavior of the main economic agents in the economy, i.e. households, firms, and the government. [15] The preferences (objectives) of the agents in the economy must be specified. For example, households might be assumed to maximize a utility function over consumption and labor effort. Firms might be assumed to maximize profits and to have a production function, specifying the amount of goods produced, depending on the amount of labor, capital and other inputs they employ. Technological constraints on firms' decisions might include costs of adjusting their capital stocks, their employment relations, or the prices of their products.
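The household and firm problems described above can be made concrete in the one special case with a known closed-form solution, the Brock–Mirman economy (log utility, Cobb–Douglas production, full depreciation). The sketch below is not a general DSGE solver, and its parameter values are illustrative:

```python
import numpy as np

def simulate_brock_mirman(alpha=0.36, beta=0.96, rho=0.95,
                          sigma=0.02, periods=200, seed=1):
    """Brock-Mirman economy: u(c) = ln c, y = z * k**alpha, full depreciation.
    The optimal policy is known in closed form: save k' = alpha*beta*y,
    consume c = (1 - alpha*beta)*y."""
    rng = np.random.default_rng(seed)
    k = np.empty(periods); y = np.empty(periods); c = np.empty(periods)
    log_z = 0.0
    k[0] = (alpha * beta) ** (1 / (1 - alpha))  # non-stochastic steady state
    for t in range(periods):
        z = np.exp(log_z)
        y[t] = z * k[t] ** alpha          # production function
        c[t] = (1 - alpha * beta) * y[t]  # optimal consumption
        if t + 1 < periods:
            k[t + 1] = alpha * beta * y[t]  # optimal capital accumulation
        log_z = rho * log_z + rng.normal(0.0, sigma)  # AR(1) technology shock
    return k, y, c

k, y, c = simulate_brock_mirman()
```

In this special case households optimally save the constant fraction alpha*beta of output each period, so consumption and investment inherit the dynamics of the technology shock.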

A DSGE model is thus built upon an explicit set of baseline assumptions about preferences, technology, and market structure, [16] to which frictions, such as the capital-adjustment and price-adjustment costs mentioned above, are then added.

The models’ general equilibrium nature is presumed to capture the interaction between policy actions and agents’ behavior, while the models specify assumptions about the stochastic shocks that give rise to economic fluctuations. Hence, the models are presumed to "trace more clearly the shocks’ transmission to the economy." [15]


Two schools of analysis form the bulk of DSGE modeling: [note 4] the classic RBC models, and the New-Keynesian DSGE models that build on a structure similar to RBC models, but instead assume that prices are set by monopolistically competitive firms, and cannot be instantaneously and costlessly adjusted. Rotemberg & Woodford introduced this framework in 1997. Introductory and advanced textbook presentations of DSGE modeling are given by Galí (2008) and Woodford (2003). Monetary policy implications are surveyed by Clarida, Galí, and Gertler (1999).

The European Central Bank (ECB) has developed [17] a DSGE model, called the Smets–Wouters model, [18] which it uses to analyze the economy of the Eurozone as a whole. [note 5] The Bank's analysts state that

developments in the construction, simulation and estimation of DSGE models have made it possible to combine a rigorous microeconomic derivation of the behavioural equations of macro models with an empirically plausible calibration or estimation which fits the main features of the macroeconomic time series. [17]

The main difference between "empirical" DSGE models and the "more traditional macroeconometric models, such as the Area-Wide Model", [19] according to the ECB, is that "both the parameters and the shocks to the structural equations are related to deeper structural parameters describing household preferences and technological and institutional constraints." [17] The Smets–Wouters model uses seven Eurozone macroeconomic series: real GDP; consumption; investment; employment; real wages; inflation; and the nominal, short-term interest rate. Using Bayesian estimation and validation techniques, the bank's modeling is ostensibly able to compete with "more standard, unrestricted time series models, such as vector autoregression, in out-of-sample forecasting." [17]
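Bayesian estimation of a full DSGE model is far beyond a short example, but the core mechanics the ECB refers to, combining a likelihood with a prior and sampling the posterior with a random-walk Metropolis algorithm, can be sketched on a toy AR(1) model. Everything below (the data-generating process, the flat prior, the tuning constants) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "macroeconomic series": an AR(1) with true persistence 0.8.
true_rho, T = 0.8, 400
y = np.zeros(T)
for t in range(1, T):
    y[t] = true_rho * y[t - 1] + rng.normal()

def log_posterior(rho):
    """Gaussian AR(1) log-likelihood plus a flat prior on (-1, 1)."""
    if not -1 < rho < 1:
        return -np.inf
    resid = y[1:] - rho * y[:-1]
    return -0.5 * np.sum(resid ** 2)

# Random-walk Metropolis: propose a local move, accept with the usual ratio.
draws, rho = [], 0.0
lp = log_posterior(rho)
for _ in range(5000):
    prop = rho + rng.normal(0.0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        rho, lp = prop, lp_prop
    draws.append(rho)

posterior = np.array(draws[1000:])  # discard burn-in
print(posterior.mean())             # concentrates near the true value 0.8
```

In actual DSGE estimation the likelihood comes from the Kalman filter applied to the model's state-space representation, and informative priors replace the flat prior used here.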


Criticism

From mainstream economists

Bank of Lithuania Deputy Chairman Raimondas Kuodis disputes the very title of DSGE analysis: The models, he claims, are neither dynamic (since they contain no evolution of stocks of financial assets and liabilities), stochastic (because we live in the world of Keynesian fundamental uncertainty and, since future outcomes or possible choices are unknown, then risk analysis or expected utility theory are not very helpful), general (they lack a full accounting framework, a stock-flow consistent framework, which would significantly reduce the number of degrees of freedom in the economy), or even about equilibrium (since markets clear only in a few quarters). [20]

Willem Buiter, Citigroup Chief Economist, has argued that DSGE models rely excessively on an assumption of complete markets, and are unable to describe the highly nonlinear dynamics of economic fluctuations, making training in 'state-of-the-art' macroeconomic modeling "a privately and socially costly waste of time and resources". [21] Narayana Kocherlakota, President of the Federal Reserve Bank of Minneapolis, wrote that

many modern macro models...do not capture an intermediate messy reality in which market participants can trade multiple assets in a wide array of somewhat segmented markets. As a consequence, the models do not reveal much about the benefits of the massive amount of daily or quarterly re-allocations of wealth within financial markets. The models also say nothing about the relevant costs and benefits of resulting fluctuations in financial structure (across bank loans, corporate debt, and equity). [5]

N. Gregory Mankiw, regarded as one of the founders of New Keynesian DSGE modeling, has argued that

New classical and New Keynesian research has had little impact on practical macroeconomists who are charged with [...] policy. [...] From the standpoint of macroeconomic engineering, the work of the past several decades looks like an unfortunate wrong turn. [22]

In the 2010 United States Congress hearings on macroeconomic modeling methods, held on 20 July 2010, and aiming to investigate why macroeconomists failed to foresee the financial crisis of 2007-2010, MIT professor of Economics Robert Solow criticized the DSGE models currently in use:

I do not think that the currently popular DSGE models pass the smell test. They take it for granted that the whole economy can be thought about as if it were a single, consistent person or dynasty carrying out a rationally designed, long-term plan, occasionally disturbed by unexpected shocks, but adapting to them in a rational, consistent way... The protagonists of this idea make a claim to respectability by asserting that it is founded on what we know about microeconomic behavior, but I think that this claim is generally phony. The advocates no doubt believe what they say, but they seem to have stopped sniffing or to have lost their sense of smell altogether. [23]

Commenting on the Congressional session, The Economist asked whether agent-based models might better predict financial crises than DSGE models. [24]

Former Chief Economist and Senior Vice President of the World Bank Paul Romer [note 6] has criticized the "mathiness" of DSGE models [25] and dismisses the inclusion of "imaginary shocks" in DSGE models that ignore "actions that people take." [26] Romer submits a simplified [note 7] presentation of real business cycle (RBC) modelling, which, as he states, essentially involves two mathematical expressions: the well-known formula of the quantity theory of money, and an identity that defines the growth accounting residual A as the difference between growth of output Y and growth of an index X of inputs in production.

Δ%A = Δ%Y − Δ%X

Romer assigned to residual A the label "phlogiston" [note 8] while he criticized the lack of consideration given to monetary policy in DSGE analysis. [26] [note 9]
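Romer's identity can be checked with a short calculation. If the input index X is Cobb–Douglas in capital and labor, the residual A is the familiar Solow residual; the growth rates and factor share below are made up for illustration:

```python
# Growth accounting: the residual Delta%A = Delta%Y - Delta%X,
# where X is an index of inputs, here Cobb-Douglas in capital and labor.
alpha = 0.3    # illustrative capital share
g_Y = 0.030    # output growth, 3.0%
g_K = 0.040    # capital growth, 4.0%
g_L = 0.010    # labor growth, 1.0%

g_X = alpha * g_K + (1 - alpha) * g_L  # growth of the input index
g_A = g_Y - g_X                        # the residual Romer labels "phlogiston"
print(round(g_A, 4))                   # 1.1% of output growth is unexplained
```

The point of Romer's label is that the residual is measured, not explained: it is whatever part of output growth the measured inputs cannot account for.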

Joseph Stiglitz finds "staggering" shortcomings in the "fantasy world" the models create and argues that "the failure [of macroeconomics] were the wrong microfoundations, which failed to incorporate key aspects of economic behavior”. He suggested the models have failed to incorporate “insights from information economics and behavioral economics" and are "ill-suited for predicting or responding to a financial crisis.” [27] Oxford University's John Muellbauer put it this way: “It is as if the information economics revolution, for which George Akerlof, Michael Spence and Joe Stiglitz shared the Nobel Prize in 2001, had not occurred. The combination of assumptions, when coupled with the trivialisation of risk and uncertainty...render money, credit and asset prices largely irrelevant… [The models] typically ignore inconvenient truths.” [28] Nobel laureate Paul Krugman asked, "Were there any interesting predictions from DSGE models that were validated by events? If there were, I’m not aware of it." [29]

From heterodox economics

Austrians reject DSGE modelling. Critique of DSGE-style macromodelling is at the core of Austrian theory: as opposed to RBC and New Keynesian models, in which capital is homogeneous, [note 10] Austrian theory treats capital as heterogeneous and multi-specific, so that production functions for the multi-specific capital are simply discovered over time. Lawrence H. White concludes [30] that present-day mainstream macroeconomics is dominated by Walrasian DSGE models, with restrictions added to generate Keynesian properties:

Mises consistently attributed the boom-initiating shock to unexpectedly expansive policy by a central bank trying to lower the market interest rate. Hayek added two alternate scenarios. [One is where] fresh producer-optimism about investment raises the demand for loanable funds, and thus raises the natural rate of interest, but the central bank deliberately prevents the market rate from rising by expanding credit. [Another is where,] in response to the same kind of increase [in] the demand for loanable funds, but without central bank impetus, the commercial banking system by itself expands credit more than is sustainable. [30]

Hayek had criticized Wicksell for the confusion of thinking that establishing a rate of interest consistent with intertemporal equilibrium [note 11] also implies a constant price level. Hayek posited that intertemporal equilibrium requires not a natural rate but the "neutrality of money," in the sense that money does not "distort" (influence) relative prices. [31]

Post-Keynesians reject the notions of macro-modelling typified by DSGE. They consider such attempts as "a chimera of authority," [32] pointing to the 2003 statement by Lucas, the pioneer of modern DSGE modelling:

Macroeconomics in [its] original sense [of preventing the recurrence of economic disasters] has succeeded. Its central problem of depression prevention has been solved, for all practical purposes, and has in fact been solved for many decades. [33]

A basic Post Keynesian presumption, which Modern Monetary Theory proponents share and which is central to Keynesian analysis, is that the future is unknowable and so, at best, we can make guesses about it based broadly on habit, custom, gut feeling, [note 12] etc. [32] In DSGE modeling, the central equation for consumption supposedly provides a way in which the consumer links decisions to consume now with decisions to consume later, and thus achieves maximum utility in each period: marginal utility from consumption today must equal marginal utility from consumption in the future, with a weighting parameter that reflects the valuation we place on the future relative to today. And since the consumer is supposed to always satisfy this equation, each of us must do so individually, if the approach is to reflect the DSGE microfoundational notions of consumption. Post-Keynesians, however, object that: no two consumers are alike in terms of random shocks and uncertainty of income (some consumers will spend every cent of any extra income they receive, while others, typically higher-income earners, spend comparatively little of any extra income); no two consumers are alike in terms of access to credit; not every consumer really considers what they will be doing at the end of their life in any coherent way, so there is no concept of a "permanent lifetime income," which is central to DSGE models; and, therefore, trying to "aggregate" all these differences into one single "representative agent" is impossible. [32] These assumptions are similar to those made in so-called Ricardian equivalence, whereby consumers are assumed to be forward-looking and to internalize the government's budget constraint when making consumption decisions, taking decisions on the basis of practically perfect evaluations of available information. [32]
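The consumption Euler equation at issue here, marginal utility today equal to discounted marginal utility tomorrow, can be verified numerically for the standard CRRA utility function (the parameter values below are illustrative):

```python
# Consumption Euler equation with CRRA utility u(c) = c**(1-g) / (1-g):
#   u'(c_t) = beta * (1 + r) * u'(c_{t+1})
# which implies optimal gross consumption growth
#   c_{t+1} / c_t = (beta * (1 + r)) ** (1 / g)
beta, r, gamma = 0.96, 0.05, 2.0  # discount factor, interest rate, risk aversion

growth = (beta * (1 + r)) ** (1 / gamma)  # optimal consumption growth factor

# Check that marginal utilities are indeed equated along this path.
c_t = 1.0
c_next = growth * c_t
mu_today = c_t ** (-gamma)
mu_future = beta * (1 + r) * c_next ** (-gamma)
print(abs(mu_today - mu_future))  # zero up to floating-point error
```

The post-Keynesian objection above is precisely that this single first-order condition is assumed to hold for every consumer, regardless of income risk, credit access, or planning horizon.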

Extrinsic unpredictability, post-Keynesians state, has "dramatic consequences" for the standard macroeconomic forecasting DSGE models used by governments and other institutions around the world. The mathematical basis of every DSGE model fails when distributions shift, since general-equilibrium theories rely heavily on ceteris paribus assumptions. [32] They point to the Bank of England's explicit admission [34] that none of the models they used and evaluated coped well during the financial crisis, which, for the Bank, "underscores the role that large structural breaks can have in contributing to forecast failure, even if they turn out to be temporary."

Evolution of viewpoints

Federal Reserve Bank of Minneapolis president Narayana Kocherlakota acknowledges that DSGE models were "not very useful" for analyzing the financial crisis of 2007-2010 but argues that the applicability of these models is "improving," and claims that there is growing consensus among macroeconomists that DSGE models need to incorporate both "price stickiness and financial market frictions." [5] Despite his criticism of DSGE modelling, he states that modern models are useful:

In the early 2000s, ...[the] problem of fit [note 13] disappeared for modern macro models with sticky prices. Using novel Bayesian estimation methods, Frank Smets and Raf Wouters [18] demonstrated that a sufficiently rich New Keynesian model could fit European data well. Their finding, along with similar work by other economists, has led to widespread adoption of New Keynesian models for policy analysis and forecasting by central banks around the world. [5]

Still, Kocherlakota observes that in "terms of fiscal policy (especially short-term fiscal policy), modern macro-modeling seems to have had little impact. ... [M]ost, if not all, of the motivation for the fiscal stimulus was based largely on the long-discarded models of the 1960s and 1970s." [5]

In 2010, Rochelle M. Edge, of the Federal Reserve Board of Governors, contended that the work of Smets & Wouters has "led DSGE models to be taken more seriously by central bankers around the world" so that "DSGE models are now quite prominent tools for macroeconomic analysis at many policy institutions, with forecasting being one of the key areas where these models are used, in conjunction with other forecasting methods." [35]

University of Minnesota professor of economics V.V. Chari has pointed out that state-of-the-art DSGE models are more sophisticated than their critics suppose:

The models have all kinds of heterogeneity in behavior and decisions... people's objectives differ, they differ by age, by information, by the history of their past experiences. [36]

Chari also argued that current DSGE models frequently incorporate frictional unemployment, financial market imperfections, and sticky prices and wages, and therefore imply that the macroeconomy behaves in a suboptimal way which monetary and fiscal policy may be able to improve. [36] Columbia University's Michael Woodford concedes [37] that policies considered by DSGE models might not be Pareto optimal [note 14] and may also fail to satisfy some other social welfare criterion. Nonetheless, in replying to Mankiw, Woodford argues that the DSGE models commonly used by central banks today, which strongly influence policy makers like Ben Bernanke, do not provide an analysis so different from traditional Keynesian analysis:

It is true that the modeling efforts of many policy institutions can reasonably be seen as an evolutionary development within the macroeconomic modeling program of the postwar Keynesians; thus if one expected, with the early New Classicals, that adoption of the new tools would require building anew from the ground up, one might conclude that the new tools have not been put to use. But in fact they have been put to use, only not with such radical consequences as had once been expected. [38]

Notes

  1. A "complete market", aka an "Arrow-Debreu market," or a "complete system of markets," is a market with two conditions: (a) negligible transaction costs, and therefore also perfect information, and (b) there is a price for every asset in every possible state of the world.
  2. In such friction-less labour markets, fluctuations in hours worked reflect movements along a given labour-supply curve or optimal movements of agents in and out of the labor force. See Chetty et al (2011).
  3. "One of the most famous papers in macroeconomics". Goutsmedt et al (2015)
  4. It has been suggested that the difference between RBC and New Keynesian models, when controlling for key supply channels, can be limited. See Cantore et al (2010)
  5. The model does not analyze individual European countries separately
  6. Romer is considered a pioneer of endogenous growth theory. See Paul Romer.
  7. In Romer's words, "stripped to its essentials". Romer (2016)
  8. The term is used "to remind ourselves of our ignorance," as Romer stated, and in honor of American economist Moses Abramovitz, whose 1956 paper had criticized the importance given to productivity increase in the modelling: "Since we know little about the causes of productivity increase, the indicated importance of this element may be taken to be some sort of measure of our ignorance about the causes of economic growth in the United States and some sort of indication of where we need to concentrate our attention." (Emphasis by Romer.) Abramovitz (1965)
  9. According to Romer, Prescott, in his University of Minnesota lectures to graduate students, was saying that "postal economics is more central to understanding the economy than monetary economics."
  10. Meaning that it is costless to switch from one investment into another
  11. The so-called "natural rate."
  12. See "animal spirits".
  13. By the term "[statistical] fit", Kocherlakota is referring to the "models of the 1960s and 1970s" that "were based on estimated supply and demand relationships, and so were specifically designed to fit the existing data well." Kocherlakota (2010)
  14. Any state of allocation of resources in which it is impossible to make any one individual better off without making at least one individual worse off is denoted as being "Pareto optimal."

Related Research Articles

New Keynesian economics is a school of contemporary macroeconomics that strives to provide microeconomic foundations for Keynesian economics. It developed partly as a response to criticisms of Keynesian macroeconomics by adherents of new classical macroeconomics.

Monetary economics is the branch of economics that studies the different competing theories of money: it provides a framework for analyzing money and considers its functions, and it considers how money, for example fiat currency, can gain acceptance purely because of its convenience as a public good. The discipline has historically prefigured, and remains integrally linked to, macroeconomics. This branch also examines the effects of monetary systems, including regulation of money and associated financial institutions and international aspects.

Neo-Keynesian economics is a school of macroeconomic thought that was developed in the post-war period from the writings of John Maynard Keynes. A group of economists attempted to interpret and formalize Keynes's writings and to synthesize them with the neoclassical models of economics. Their work became known as the neoclassical synthesis and produced the models that formed the core ideas of neo-Keynesian economics. These ideas dominated mainstream macroeconomic thought in the 1950s, 1960s and 1970s.

A macroeconomic model is an analytical tool designed to describe the operation of the economy of a country or a region. These models are usually designed to examine the comparative statics and dynamics of aggregate quantities such as the total amount of goods and services produced, total income earned, the level of employment of productive resources, and the level of prices.

John Brian Taylor is the Mary and Robert Raymond Professor of Economics at Stanford University, and the George P. Shultz Senior Fellow in Economics at Stanford University's Hoover Institution.

In economics, a menu cost is the cost to a firm resulting from changing its prices. The name stems from the cost of restaurants literally printing new menus, but economists use it to refer to the costs of changing nominal prices in general. In this broader definition, menu costs might include updating computer systems, re-tagging items, and hiring consultants to develop new pricing strategies, as well as the literal costs of printing menus. More generally, the menu cost can be thought of as resulting from costs of information, decision and implementation that produce bounded rationality. Because of this expense, firms do not always change their prices with every change in supply and demand, leading to nominal rigidity. Generally, the effect on the firm of small shifts in price is relatively minor compared to the costs of notifying the public of this new information. Therefore, the firm would rather exist in slight disequilibrium than incur the menu costs.

Michael Dean Woodford is an American macroeconomist and monetary theorist who currently teaches at Columbia University.

New classical macroeconomics, sometimes simply called new classical economics, is a school of thought in macroeconomics that builds its analysis entirely on a neoclassical framework. Specifically, it emphasizes the importance of rigorous foundations based on microeconomics, especially rational expectations.

Jordi Galí is a Spanish macroeconomist who is regarded as one of the main figures in New Keynesian macroeconomics today. He is currently the director of the Centre de Recerca en Economia Internacional at Universitat Pompeu Fabra and a Research Professor at the Barcelona Graduate School of Economics. After obtaining his doctorate from MIT in 1989 under the supervision of Olivier Blanchard, he held faculty positions at Columbia University and New York University before moving to Barcelona.

History of macroeconomic thought

Macroeconomic theory has its origins in the study of business cycles and monetary theory. In general, early theorists believed monetary factors could not affect real factors such as real output. John Maynard Keynes attacked some of these "classical" theories and produced a general theory that described the whole economy in terms of aggregates rather than individual, microeconomic parts. Attempting to explain unemployment and recessions, he noticed the tendency for people and businesses to hoard cash and avoid investment during a recession. He argued that this invalidated the assumptions of classical economists who thought that markets always clear, leaving no surplus of goods and no willing labor left idle.

The new neoclassical synthesis (NNS) or new synthesis is the fusion of the major, modern macroeconomic schools of thought, new classical and New-Keynesianism, into a consensus on the best way to explain short-run fluctuations in the economy. This new synthesis is analogous to the neoclassical synthesis that combined neoclassical economics with Keynesian macroeconomics. The new synthesis provides the theoretical foundation for much of contemporary mainstream economics. It is an important part of the theoretical foundation for the work done by the Federal Reserve and many other central banks.

Real business-cycle theory is a class of new classical macroeconomics models in which business-cycle fluctuations to a large extent can be accounted for by real shocks. Unlike other leading theories of the business cycle, RBC theory sees business cycle fluctuations as the efficient response to exogenous changes in the real economic environment. That is, the level of national output necessarily maximizes expected utility, and governments should therefore concentrate on long-run structural policy changes and not intervene through discretionary fiscal or monetary policy designed to actively smooth out economic short-term fluctuations.

The Taylor contract or staggered contract was first formulated by John B. Taylor in two articles: "Staggered Wage Setting in a Macro Model" (1979) and "Aggregate Dynamics and Staggered Contracts" (1980). In its simplest form, one can think of two equal-sized unions who set wages in an industry. Each period, one of the unions sets the nominal wage for two periods. This means that in any one period, only one of the unions can reset its wage and react to events that have just happened. When the union sets its wage, it sets it for a known and fixed period of time. Whilst it will know what is happening in the first period when it sets the new wage, it will have to form expectations about the factors in the second period that determine the optimal wage to set. Although the model was first used to model wage setting, in the new Keynesian models that followed it was also used to model price-setting by firms.
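The mechanics of the staggered timing can be illustrated with a minimal sketch. This is a deliberate simplification (each union simply adopts the current desired wage when its turn comes, ignoring the expectations about the second period described above); the function name and setup are hypothetical, not from Taylor's papers.

```python
# Simplified sketch of Taylor's staggered-contract timing: two equal-sized
# unions, A and B, alternate in resetting a nominal wage that then stays
# fixed for two periods, so only half of wages can respond to news at once.

def simulate_staggered_wages(desired, w_a0=1.0, w_b0=1.0):
    """Track the aggregate (average) wage when unions reset in alternate periods.

    `desired` lists the optimal contract wage in each period; `w_a0` and
    `w_b0` are the wages inherited from contracts signed before period 0.
    """
    w_a, w_b = w_a0, w_b0
    path = []
    for t, target in enumerate(desired):
        if t % 2 == 0:
            w_a = target  # union A resets in even periods
        else:
            w_b = target  # union B resets in odd periods
        path.append((w_a + w_b) / 2)
    return path
```

A permanent jump in the desired wage from 1 to 2 passes into the aggregate wage only gradually: `simulate_staggered_wages([2, 2, 2])` gives `[1.5, 2.0, 2.0]`, since only half of wages adjust in the period of the shock. This sluggish aggregate adjustment is the nominal rigidity the contract structure is meant to capture.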

A Calvo contract is the name given in macroeconomics to a pricing model in which, each period, a firm faces a constant probability of being able to reset its nominal price, independent of the time elapsed since the price was last reset. The model was first put forward by Guillermo Calvo in his 1983 article "Staggered Prices in a Utility-Maximizing Framework". The original article was written in a continuous-time mathematical framework, but the model is nowadays mostly used in its discrete-time version. The Calvo model is the most common way to model nominal rigidity in new Keynesian DSGE macroeconomic models.
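Two standard implications of the constant reset probability can be sketched directly. Writing θ for the per-period probability that a firm *cannot* reset (a conventional symbol; the function names below are hypothetical), the time between resets is geometrically distributed, so the expected price duration is 1/(1 − θ), and a fraction θ^k of firms still carry an unchanged price after k periods.

```python
# Sketch of the Calvo pricing assumption: each period a firm gets to reset
# its price with constant probability (1 - theta), regardless of how long
# the current price has already been in place.

def expected_price_duration(theta):
    """Expected number of periods a price stays fixed.

    The reset waiting time is geometric with success probability
    (1 - theta), so its mean is 1 / (1 - theta).
    """
    return 1.0 / (1.0 - theta)

def share_unchanged(theta, k):
    """Fraction of firms whose price has not been reset after k periods."""
    return theta ** k
```

For example, with θ = 0.75 a price lasts four periods on average, and roughly 32% of prices survive unchanged through four periods. In quarterly new Keynesian models, θ is often chosen so that this implied duration matches microdata on price-change frequency.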

Stephanie Schmitt-Grohé is a German economist who currently works as a professor of economics at Columbia University. Schmitt-Grohé's research has focused on macroeconomics as well as fiscal and monetary policy in open and closed economies. In 2004 she was awarded the Bernácer Prize for her research on monetary stabilization policies.


  1. Christiano (2018)
  2. Kydland & Prescott (1982)
  3. Long & Plosser (1983)
  4. Plosser (2012)
  5. Kocherlakota (2010)
  6. Cooley (1995)
  7. Backus et al. (1992)
  8. Mussa (1986)
  9. Lucas (1976)
  10. Goutsmedt et al. (2015)
  11. Harrison et al. (2013)
  12. Woodford (2003), pp. 11–12
  13. Tovar (2008), pp. 15–16
  14. Nobel Prize organization press release (2004)
  15. Sbordone et al. (2010)
  16. BBLM del Dipartimento del Tesoro, Microfoundations of DSGE Models: I Lecture, 7 June 2010
  17. ECB (2009)
  18. Smets & Wouters (2002)
  19. Fagan et al. (2001)
  20. Kuodis (2015)
  21. Buiter (2009)
  22. Mankiw (2006)
  23. Solow (2010)
  24. "Agents of change", The Economist, 22 July 2010
  25. Romer (2015)
  26. Romer (2016)
  27. Stiglitz (2018)
  28. Muellbauer (2010)
  29. Krugman (2016)
  30. White (2015)
  31. Storr (2016)
  32. Mitchell (2017)
  33. Lucas (2003)
  34. Fawcett et al. (2015)
  35. Edge & Gürkaynak (2010)
  36. Chari (2010)
  37. Woodford (2003), p. 12
  38. Woodford (2008)


Further reading