Gravity model of trade

[Figure: Shift of the world's economic center of gravity since 1980 and projected until 2050.[1]]

The gravity model of international trade is a model in international economics that, in its traditional form, predicts bilateral trade flows based on the economic sizes of and the distance between two units.[2] Research shows that there is "overwhelming evidence that trade tends to fall with distance."[3]


The model was first introduced by Walter Isard in 1954.[4] The basic model for trade between two countries (i and j) takes the form

$$F_{ij} = G \, \frac{M_i M_j}{D_{ij}}$$

In this formula G is a constant, F stands for the trade flow, D stands for the distance, and M stands for the economic dimensions of the countries being measured. The equation can be converted into a linear form for econometric analysis by taking logarithms. The model has been used by economists to analyse the determinants of bilateral trade flows such as common borders, common languages, common legal systems, common currencies, and common colonial legacies, and it has been used to test the effectiveness of trade agreements and organizations such as the North American Free Trade Agreement (NAFTA) and the World Trade Organization (WTO) (Head and Mayer 2014). The model has also been used in international relations to evaluate the impact of treaties and alliances on trade (Head and Mayer 2014).
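For a purely numerical illustration of how the formula behaves, the following minimal sketch evaluates it directly. The scaling constant, economic sizes, and distance below are hypothetical values chosen for the example, not real data or estimates.

```python
# Minimal sketch of the basic gravity equation F_ij = G * M_i * M_j / D_ij.
# All inputs are hypothetical illustration values, not real data.

def gravity_flow(G: float, M_i: float, M_j: float, D_ij: float) -> float:
    """Predicted bilateral trade flow between countries i and j."""
    return G * (M_i * M_j) / D_ij

G = 1.0                        # scaling constant
M_i, M_j = 25_000.0, 4_000.0   # economic sizes (e.g. GDP in billions)
D_near, D_far = 3_000.0, 6_000.0

# Larger economies trade more; doubling distance halves the predicted flow.
print(gravity_flow(G, M_i, M_j, D_near))   # 33333.33...
print(gravity_flow(G, M_i, M_j, D_far))    # 16666.66...
```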

The model has also been applied to other bilateral flow data (also known as "dyadic" data) such as migration, traffic, remittances and foreign direct investment.

Theoretical justifications and research

The model has been an empirical success in that it accurately predicts trade flows between countries for many goods and services, but for a long time some scholars believed that there was no theoretical justification for the gravity equation.[5] However, a gravity relationship can arise in almost any trade model that includes trade costs that increase with distance.

The gravity model estimates the pattern of international trade. While the model's basic form consists of factors that have more to do with geography and spatiality, the gravity model has also been used to test hypotheses rooted in purer economic theories of trade. One such theory predicts that trade will be based on relative factor abundances. One of the common relative factor abundance models is the Heckscher–Ohlin model. Countries with a relative abundance of one factor would be expected to produce goods that require a relatively large amount of that factor in their production. Although Heckscher–Ohlin is a generally accepted theory of trade, many economists of the Chicago School believed that the model alone was sufficient to describe all trade, while Bertil Ohlin himself argued that the world is in fact more complicated. Investigations into real-world trading patterns have produced a number of results that do not match the expectations of comparative advantage theories. Notably, a study by Wassily Leontief found that the United States, the most capital-endowed country in the world, actually exports more in labor-intensive industries; comparative advantage in factor endowments would suggest the opposite. Other theories of trade and explanations for this relationship were proposed to explain the discrepancy between Leontief's empirical findings and economic theory. The problem has become known as the Leontief paradox.

An alternative theory, first proposed by Staffan Linder, predicts that patterns of trade will be determined by the aggregated preferences for goods within countries. Those countries with similar preferences would be expected to develop similar industries. With continued similar demand, these countries would continue to trade back and forth in differentiated but similar goods, since both countries demand and produce similar products. For instance, both Germany and the United States are industrialized countries with a high preference for automobiles. Both countries have automobile industries, and both trade cars. The empirical validity of the Linder hypothesis is somewhat unclear. Several studies have found a significant impact of the Linder effect, but others have had weaker results. Studies that do not support Linder have only counted countries that actually trade; they do not input zero values for the dyads where trade could happen but does not. This has been cited as a possible explanation for their findings. Also, Linder never presented a formal model for his theory, so different studies have tested his hypothesis in different ways.

Elhanan Helpman and Paul Krugman asserted that the theory behind comparative advantage does not predict the relationships in the gravity model. Using the gravity model, countries with similar levels of income have been shown to trade more. Helpman and Krugman see this as evidence that these countries are trading in differentiated goods because of their similarities. This casts some doubt on the impact Heckscher–Ohlin has on the real world. Jeffrey Frankel sees the Helpman–Krugman setup as distinct from Linder's proposal and from the usual interpretation of Linder, but, since Linder made no clear model, he argues that the association between the two should not be completely discounted. Alan Deardorff adds the possibility that, while not immediately apparent, the basic gravity model can be derived from Heckscher–Ohlin as well as from the Linder and Helpman–Krugman hypotheses. Deardorff concludes that, considering how many models can be tied to the gravity model equation, it is not useful for evaluating the empirical validity of theories.

Bridging economic theory with empirical tests, James Anderson and Jeffrey Bergstrand developed econometric models, grounded in the theories of differentiated goods, that measure the gains from trade liberalization and the magnitude of border barriers to trade (see Home bias in trade puzzle). A recent synthesis of empirical research using gravity equations, however, shows that the effect of border barriers on trade is relatively modest.[6]

Adding to the problem of bridging economic theory with empirical results, some economists have pointed to the possibility that intra-industry trade arises not from differentiated goods but from "reciprocal dumping". In these models, the countries involved are said to have imperfect competition and segmented markets in homogeneous goods, which leads to intra-industry trade as imperfectly competitive firms seek to expand their markets to other countries and trade goods that are not differentiated and in which they hold no comparative advantage, since there is no specialization. This model of trade is consistent with the gravity model in that it, too, predicts that trade depends on country size.

The reciprocal dumping model has held up to some empirical testing, suggesting that the specialization and differentiated goods models might not fully explain the gravity equation. Feenstra, Markusen, and Rose (2001) provided evidence for reciprocal dumping by assessing the home market effect in separate gravity equations for differentiated and homogeneous goods. The home market effect showed a relationship in the gravity estimation for differentiated goods, but showed the inverse relationship for homogeneous goods. The authors show that this result matches the theoretical predictions of reciprocal dumping playing a role in homogeneous markets.

Past research using the gravity model has also sought to evaluate the impact of various variables in addition to the basic gravity equation. Among these, price level and exchange rate variables have been shown to have a relationship in the gravity model that accounts for a significant amount of the variance not explained by the basic gravity equation. According to empirical results on price level, the effect of price level varies according to the relationship being examined. For instance, if exports are being examined, a relatively high price level on the part of the importer would be expected to increase trade with that country. A non-linear system of equations is used by Anderson and van Wincoop (2003) to account for the endogenous change in these price terms from trade liberalization.[7] A simpler method is to use a first-order log-linearization of this system of equations (Baier and Bergstrand (2009)), or exporter-country-year and importer-country-year dummy variables.[8] For counterfactual analysis, however, one would still need to account for the change in world prices.
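The dummy-variable device mentioned above is straightforward to implement with standard regression software. The following is a minimal sketch only: it simulates a small synthetic panel (the country labels, size factors, and error terms are invented for the example) and absorbs exporter-year and importer-year factors with dummies, assuming the pandas and statsmodels libraries.

```python
# Sketch: log-linear gravity with exporter-year and importer-year dummies,
# which absorb country-time-specific terms such as the endogenous price
# levels. Synthetic data; not real trade flows.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
countries, years = list("ABCDEFGH"), [2000, 2001, 2002]
rows = [(e, i, y) for e, i in itertools.permutations(countries, 2)
        for y in years]
df = pd.DataFrame(rows, columns=["exp", "imp", "year"])

s = dict(zip(countries, rng.lognormal(0, 0.5, len(countries))))  # exporter factors
m = dict(zip(countries, rng.lognormal(0, 0.5, len(countries))))  # importer factors
df["dist"] = rng.lognormal(8, 0.5, len(df))
df["trade"] = (df["exp"].map(s) * df["imp"].map(m)
               * rng.lognormal(0, 0.3, len(df)) / df["dist"])

df["exp_year"] = df["exp"] + "-" + df["year"].astype(str)
df["imp_year"] = df["imp"] + "-" + df["year"].astype(str)
fe = smf.ols("np.log(trade) ~ np.log(dist) + C(exp_year) + C(imp_year)",
             data=df).fit()
print(fe.params["np.log(dist)"])  # close to the true elasticity of -1
```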

Econometric estimation of gravity equations

Since the gravity model for trade does not hold exactly, in econometric applications it is customary to specify

$$F_{ij} = G \, \frac{M_i^{\beta_1} M_j^{\beta_2}}{D_{ij}^{\beta_3}} \, \eta_{ij}$$

where $F_{ij}$ represents the volume of trade from country $i$ to country $j$, $M_i$ and $M_j$ typically represent the GDPs of countries $i$ and $j$, $D_{ij}$ denotes the distance between the two countries, and $\eta_{ij}$ represents an error term with expectation equal to 1.

The traditional approach to estimating this equation consists in taking logs of both sides, leading to a log-log model of the form (note: the constant G becomes part of $\beta_0$):

$$\ln F_{ij} = \beta_0 + \beta_1 \ln M_i + \beta_2 \ln M_j - \beta_3 \ln D_{ij} + \varepsilon_{ij},$$

where $\varepsilon_{ij} = \ln \eta_{ij}$. However, this approach has two major problems. First, it obviously cannot be used when there are observations for which $F_{ij}$ is equal to zero. Second, Santos Silva and Tenreyro (2006) argued that estimating the log-linearized equation by least squares (OLS) can lead to significant biases if the researcher believes the true model to be nonlinear in its parameters. As an alternative, these authors have suggested that the model should be estimated in its multiplicative form, i.e.,

$$F_{ij} = \exp\left(\beta_0 + \beta_1 \ln M_i + \beta_2 \ln M_j - \beta_3 \ln D_{ij}\right) \eta_{ij},$$

using a Poisson pseudo-maximum likelihood (PPML) estimator based on the Poisson model usually used for count data. As shown by Santos Silva and Tenreyro (2006), PPML estimates of common gravity variables can be different from their OLS counterparts. In particular, they found that the trade-reducing effects of distance were smaller and that the effects of colonial ties were statistically insignificant.
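The contrast between the two estimators can be illustrated on synthetic data. The following is a minimal sketch, assuming the Python statsmodels and pandas libraries; the data are simulated from the multiplicative model above, with a fifth of the dyads set to zero trade, and are not real trade flows.

```python
# Sketch: log-log OLS versus PPML on data simulated from the
# multiplicative gravity model. Illustrative only; no real trade data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2_000
df = pd.DataFrame({
    "gdp_i": rng.lognormal(10, 1, n),
    "gdp_j": rng.lognormal(10, 1, n),
    "dist":  rng.lognormal(8, 0.5, n),
})
eta = rng.lognormal(-0.125, 0.5, n)            # E[eta] = 1
df["trade"] = df.gdp_i * df.gdp_j / df.dist * eta
df.loc[rng.random(n) < 0.2, "trade"] = 0.0     # some dyads do not trade

# OLS on logs: the zero-trade observations must be dropped.
ols = smf.ols("np.log(trade) ~ np.log(gdp_i) + np.log(gdp_j) + np.log(dist)",
              data=df[df.trade > 0]).fit()

# PPML in levels: keeps the zeros; robust standard errors as is customary.
ppml = smf.glm("trade ~ np.log(gdp_i) + np.log(gdp_j) + np.log(dist)",
               data=df, family=sm.families.Poisson()).fit(cov_type="HC0")

print(ols.params)    # true elasticities are 1, 1, and -1
print(ppml.params)
```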

Though PPML does allow the inclusion of observations where $F_{ij} = 0$, it is not necessarily a perfect solution to the "zeroes" problem. Martin and Pham (2008) argued that using PPML on gravity severely biases estimates when zero trade flows are frequent and reflect non-random selection.[9] However, their results were challenged by Santos Silva and Tenreyro (2011), who argued that the simulation results of Martin and Pham (2008) are based on misspecified models and showed that the PPML estimator performs well even when the proportion of zeros is very large.[10] The latter argument assumes that the number of trading firms can be generated via a count data model, with zero trade flows in the data reflecting the probability that no firms engage in trade. This idea was formalized further by Eaton, Kortum, and Sotelo (2012), who advocated using the bilateral expenditure share, rather than the level of bilateral trade flows, as the dependent variable.[11]

In applied work, the gravity model is often extended by including variables to account for language relationships, tariffs, contiguity, access to sea, colonial history, and exchange rate regimes. Yet the estimation of structural gravity, based on Anderson and van Wincoop (2003), requires the inclusion of importer and exporter fixed effects, thus limiting the gravity analysis to bilateral trade costs (Baldwin and Taglioni 2007). Aside from OLS and PPML, other methods for gravity estimation include gamma pseudo-maximum likelihood and the "tetrads" method of Head, Mayer, and Ries (2010). The latter involves first transforming the dependent variable in order to cancel out any country-specific factors. This provides another way of focusing only on bilateral trade costs.[12]
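To give a sense of why such a transformation cancels country-specific factors, the following sketch (with made-up numbers, not a reproduction of Head, Mayer, and Ries's procedure) shows a ratio of ratios of flows from which exporter- and importer-specific terms drop out, leaving only relative bilateral trade costs.

```python
# Sketch of the "tetrads" idea: a ratio of ratios of trade flows cancels
# any exporter- and importer-specific multiplicative factors. Values are
# illustrative only.
import numpy as np

# Hypothetical flows X[exporter, importer] generated as s_i * m_j / d_ij.
s = np.array([2.0, 5.0])           # exporter-specific factors (i, k)
m = np.array([3.0, 7.0])           # importer-specific factors (j, l)
d = np.array([[1.5, 2.0],          # bilateral trade costs d_ij
              [4.0, 1.2]])
X = np.outer(s, m) / d

tetrad = (X[0, 0] * X[1, 1]) / (X[0, 1] * X[1, 0])
cost_only = (d[0, 1] * d[1, 0]) / (d[0, 0] * d[1, 1])
print(np.isclose(tetrad, cost_only))  # True: s and m have cancelled out
```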

Notes

  1. Quah, Danny (2011). "The Global Economy's Shifting Centre of Gravity". Global Policy. 2 (1): 3–9. doi:10.1111/j.1758-5899.2010.00066.x. ISSN 1758-5880. S2CID 55154148.
  2. Bergstrand, Jeffrey H.; Egger, Peter H.; Toubal, Farid (2024). "Introduction to the special issue on: Gravity at sixty". European Economic Review: 104749. doi:10.1016/j.euroecorev.2024.104749. ISSN 0014-2921.
  3. Carrère, Céline; Mrázová, Monika; Neary, J. Peter (2020). "Gravity without Apology: The Science of Elasticities, Distance, and Trade". The Economic Journal. 130 (628): 880–910. doi:10.1093/ej/ueaa034. hdl:10419/216556.
  4. Isard, Walter (May 1954). "Location Theory and Trade Theory: Short-Run Analysis". Quarterly Journal of Economics. 68 (2): 305–320. doi:10.2307/1884452. JSTOR 1884452.
  5. Deardorff, Alan (1998). "Determinants of Bilateral Trade: Does Gravity Work in a Neoclassical World?" (PDF). The Regionalization of the World Economy.
  6. Havranek, Tomas; Irsova, Zuzana (2016). "Do Borders Really Slash Trade? A Meta-Analysis" (PDF). IMF Economic Review. 65 (2): 365–396. doi:10.1057/s41308-016-0001-5. hdl:2027.42/132988. S2CID 195331674.
  7. Anderson, J.; van Wincoop, E. (2003). "Gravity with Gravitas: A Solution to the Border Puzzle" (PDF). American Economic Review. 93: 170–192. doi:10.1257/000282803321455214. hdl:10532/3989. S2CID 7277314.
  8. Baier, SL; Bergstrand, JH (2009). "Bonus Vetus OLS: A Simple Method for Approximating International Trade-Cost Effects Using the Gravity Equation". Journal of International Economics. 77: 77–85. doi:10.1016/j.jinteco.2008.10.004.
  9. Martin, William; Pham, Cong S. (2008). "Estimating the Gravity Model When Zero Trade Flows are Frequent" (PDF). Working Paper.
  10. Santos Silva, J. M. C.; Tenreyro, Silvana (August 2011). "Further simulation evidence on the performance of the Poisson pseudo-maximum likelihood estimator". Economics Letters. 112 (2): 220–222. doi:10.1016/j.econlet.2011.05.008.
  11. Eaton, Jonathan; Kortum, Samuel; Sotelo, Sebastian (2012). "International Trade: Linking Micro and Macro". NBER Working Paper No. 17864. doi:10.3386/w17864.
  12. Head, Keith; Mayer, Thierry; Ries, John (May 2010). "The erosion of colonial trade linkages after independence". Journal of International Economics. 81 (1): 1–14. doi:10.1016/j.jinteco.2010.01.002. S2CID 8586110.
