Discretionary policy

In macroeconomics, discretionary policy is economic policy based on the ad hoc judgment of policymakers, as opposed to policy set by predetermined rules. For instance, a central banker could decide interest rates on a case-by-case basis instead of letting a set rule, such as Friedman's k-percent rule, an inflation target following the Taylor rule, or a nominal income target, determine interest rates or the money supply. In practice, most policy actions are discretionary in nature.
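
As a concrete illustration of rule-based policy (a sketch added for clarity, not drawn from the article), the following Python snippet computes the interest rate implied by a standard Taylor-type rule with Taylor's original coefficients; the function name and the numbers in the example are assumptions chosen for illustration.

    # Hypothetical sketch: a Taylor-type rule sets the policy rate mechanically
    # from observed inflation and the output gap, in contrast with case-by-case
    # discretion. The 0.5 coefficients follow Taylor's original formulation.
    def taylor_rule_rate(inflation, output_gap,
                         neutral_real_rate=2.0, inflation_target=2.0):
        """Nominal policy rate (%) implied by a standard Taylor-type rule."""
        return (neutral_real_rate + inflation
                + 0.5 * (inflation - inflation_target)
                + 0.5 * output_gap)

    # Example: 3% inflation and a -1% output gap imply a 5% policy rate.
    print(taylor_rule_rate(inflation=3.0, output_gap=-1.0))

Under a rule like this the rate follows mechanically from the data; under discretion the policymaker may set whatever rate seems appropriate at the time.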

"Discretionary policy" can refer to decision making in both monetary policy and fiscal policy. The opposite is a commitment policy.

Arguments against

Monetarist economists in particular have opposed the use of discretionary policy. According to Milton Friedman, the dynamics of change associated with the passage of time present a timing problem for public policy, because a long and variable time lag exists between:

  1. the need for action and the recognition of that need;
  2. the recognition of a problem and the design and implementation of a policy response; and
  3. the implementation of the policy and the effect of the policy.[1]: 145

Because of these lags, Friedman argued, discretionary public policy will often be destabilizing; for this reason he made the case for general rules rather than discretionary policy.

Friedman formalized his argument in the context of monetary policy as follows.[2] The quantity equation says that

    $MV = Y,$

where M is the money supply, V is the velocity of money, and Y is nominal GDP. Expressing this in growth rates gives

    $m + v = y,$

where m, v, and y are the growth rates of the money supply, velocity, and nominal GDP respectively. Suppose that the policymaker wishes for the variance of nominal GDP to be as low as possible; that is, it defines a stabilizing approach to monetary policy as one which decreases nominal GDP variance. From the last equation we have

    $\sigma_y^2 = \sigma_m^2 + \sigma_v^2 + 2\rho_{mv}\sigma_m\sigma_v,$

where $\sigma$ refers to the standard deviation (square root of the variance) of the subscripted variable and $\rho_{mv}$ refers to the correlation coefficient between the subscripted variables. With no use of discretionary policy or of any rule giving fluctuations of the money supply, $\sigma_m$ will equal zero and the target variance will simply be the exogenous variance of velocity, $\sigma_v^2$. With the use of discretionary policy, on the other hand, all standard deviations in the above equation will be positive, and discretionary policy will have been stabilizing if and only if $\sigma_y < \sigma_v$, that is, if and only if

    $\rho_{mv} < -\frac{\sigma_m}{2\sigma_v}.$
Thus the monetary authority would have to be sufficiently astute in its policy timing, in trying to counteract anticipated fluctuations in velocity, that the correlation of its money supply changes with velocity changes is not merely negative, but sufficiently negative to overcome the inherently GDP-variance-magnifying effects of money supply variation. Friedman believed that this condition for discretionary policy to be stabilizing is unlikely to be fulfilled in practice, because of the timing problems discussed above.
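
This condition can be checked numerically. The sketch below (an illustration added here, not taken from Friedman's paper) assumes jointly normal growth rates and made-up parameter values, and compares the variance of nominal GDP growth with and without money-supply variation.

    # Minimal numerical check of the condition derived above: with y = m + v,
    # money-supply variation lowers the variance of nominal GDP growth only if
    # corr(m, v) < -sigma_m / (2 * sigma_v). Parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    sigma_m, sigma_v, rho = 0.01, 0.02, -0.6
    cov = [[sigma_m**2, rho * sigma_m * sigma_v],
           [rho * sigma_m * sigma_v, sigma_v**2]]
    m, v = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T

    var_with_policy = np.var(m + v)   # money growth varies (discretion)
    var_without = np.var(v)           # money growth held constant (sigma_m = 0)

    print(var_with_policy < var_without)   # True here, because
    print(rho < -sigma_m / (2 * sigma_v))  # -0.6 < -0.25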

A related issue is the probable existence of multiplier uncertainty: imperfect knowledge of the overall, ultimate effect of a policy action of a given size. Generally, multiplier uncertainty calls for more caution and quantitatively smaller policy actions.[3]
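
Brainard's result can be illustrated with a standard quadratic-loss setup (a hypothetical sketch with assumed names and numbers, not a derivation from the article): if the policy instrument g moves the target by (a + e) * g, where the multiplier a is known only on average and e is zero-mean noise with standard deviation sigma_a, then minimizing the expected squared gap from the goal gives g* = a * goal / (a^2 + sigma_a^2), which shrinks toward zero as multiplier uncertainty grows.

    # Illustrative sketch of the Brainard-style attenuation result under the
    # assumptions stated above; names and numbers are hypothetical.
    def optimal_action(goal, a, sigma_a):
        """Loss-minimizing policy action g* = a * goal / (a^2 + sigma_a^2)."""
        return a * goal / (a**2 + sigma_a**2)

    print(optimal_action(goal=1.0, a=2.0, sigma_a=0.0))  # 0.5  (no uncertainty)
    print(optimal_action(goal=1.0, a=2.0, sigma_a=2.0))  # 0.25 (more caution)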

Arguments for

Proponents of the use of discretionary policy, including in particular Keynesians, argue that our understanding of the workings of the economy is sufficiently advanced, and policymakers' access to detailed real-time economic data is sufficiently great, that in practice discretionary policy has been stabilizing. For example, it is widely believed[citation needed] that the extreme expansion of the monetary base by the U.S. Federal Reserve and other central banks prevented the Great Recession of the 2000s decade from becoming a full-blown depression.

References

  1. Friedman, Milton (1953). Essays in Positive Economics. University of Chicago Press.
  2. Friedman, Milton (1953). "The Effects of a Full-Employment Policy on Economic Stability: A Formal Analysis". In Essays in Positive Economics. University of Chicago Press. pp. 117–132.
  3. Brainard, William (1967). "Uncertainty and the Effectiveness of Policy". American Economic Review. 57 (2): 411–425. JSTOR 1821642.