Computational economics is an interdisciplinary research discipline that combines methods in computational science and economics to solve complex economic problems. [1] This subject encompasses computational modeling of economic systems. Some of these areas are unique to computational economics, while others extend established areas of economics by enabling robust data analytics and the solution of problems that would be arduous to research without computers and associated numerical methods. [2]
Computational methods have been applied in various fields of economics research, including but not limited to:
Econometrics: Non-parametric approaches, semi-parametric approaches, and machine learning.
Dynamic systems modeling: Optimization, dynamic stochastic general equilibrium modeling, and agent-based modeling. [3]
Computational economics developed concurrently with the mathematization of the field. During the early 20th century, pioneers such as Jan Tinbergen and Ragnar Frisch advanced the computerization of economics and the growth of econometrics. As a result of advances in econometrics, regression models, hypothesis testing, and other computational statistical methods became widely adopted in economic research. On the theoretical front, complex macroeconomic models, including the real business cycle (RBC) model and dynamic stochastic general equilibrium (DSGE) models, have propelled the development and application of numerical solution methods that rely heavily on computation. In the 21st century, the development of computational algorithms created new means for computational methods to interact with economic research. Innovative approaches such as machine learning models and agent-based modeling have been actively explored in different areas of economic research, offering economists an expanded toolkit that frequently differs in character from traditional methods.
Computational economics uses computer-based economic modeling to solve analytically and statistically formulated economic problems. A research program, to that end, is agent-based computational economics (ACE), the computational study of economic processes, including whole economies, as dynamic systems of interacting agents. [4] As such, it is an economic adaptation of the complex adaptive systems paradigm. [5] Here the "agent" refers to "computational objects modeled as interacting according to rules," not real people. [3] Agents can represent social, biological, and/or physical entities. The theoretical assumption of mathematical optimization by agents in equilibrium is replaced by the less restrictive postulate of agents with bounded rationality adapting to market forces, [6] including game-theoretical contexts. [7] Starting from initial conditions determined by the modeler, an ACE model develops forward through time driven solely by agent interactions. The scientific objective of the method is to test theoretical findings against real-world data in ways that permit empirically supported theories to cumulate over time. [8]
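As a rough, hypothetical sketch of this approach (not a model from the literature), the Python fragment below has boundedly rational sellers adjust their posted prices in response to market pressure rather than solving an optimization problem; the simulation then evolves from its initial conditions purely through these interactions:

```python
import random

# Minimal, hypothetical ACE-style sketch: boundedly rational sellers adapt
# posted prices to market feedback instead of solving an optimization problem.
# All numbers are illustrative.

random.seed(0)

NUM_AGENTS = 50
PERIODS = 100
ADJUSTMENT = 0.2          # speed at which agents adapt to market feedback
DEMAND_INTERCEPT = 100.0  # simple linear demand: quantity = intercept - slope * price
DEMAND_SLOPE = 1.0

# Initial conditions chosen by the modeler: heterogeneous starting prices.
prices = [random.uniform(10.0, 90.0) for _ in range(NUM_AGENTS)]

for t in range(PERIODS):
    avg_price = sum(prices) / NUM_AGENTS
    quantity_demanded = max(DEMAND_INTERCEPT - DEMAND_SLOPE * avg_price, 0.0)
    # Each agent supplies one unit; excess demand pushes prices up, excess supply down.
    excess_demand = quantity_demanded - NUM_AGENTS
    for i in range(NUM_AGENTS):
        # Adaptive rule: move own price in the direction of market pressure,
        # with idiosyncratic noise standing in for agent-level differences.
        prices[i] += ADJUSTMENT * excess_demand / NUM_AGENTS + random.gauss(0.0, 0.5)

print(f"Average price after {PERIODS} periods: {sum(prices) / NUM_AGENTS:.2f}")
```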
Machine learning models present a method for analyzing vast, complex, unstructured data sets. Various machine learning methods, such as the kernel method and random forests, have been developed and used in data mining and statistical analysis. These models offer superior classification and predictive capabilities and greater flexibility compared with traditional statistical models, such as the STAR model. Other methods, such as causal machine learning and causal trees, provide distinct advantages, including support for inference testing.
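As an illustration (a minimal sketch using scikit-learn and synthetic data, not results from any particular study), a random forest can recover a nonlinear relationship that a simple linear regression misses:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic, illustrative data: a nonlinear relationship between a covariate
# (e.g. income) and an outcome (e.g. consumption), plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(1000, 1))
y = np.sin(X[:, 0]) + 0.1 * X[:, 0] ** 2 + rng.normal(0, 0.3, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Flexible, nonparametric predictor versus a simple linear benchmark.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
linear = LinearRegression().fit(X_train, y_train)

print("Random forest R^2:", r2_score(y_test, forest.predict(X_test)))
print("Linear model  R^2:", r2_score(y_test, linear.predict(X_test)))
```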
There are notable advantages and disadvantages to using machine learning tools in economic research. In traditional economic research, a model is first selected on theoretical grounds, then estimated and tested against data, and finally cross-validated against alternative models. Machine learning models, by contrast, have built-in "tuning": as they conduct empirical analysis, they cross-validate, estimate, and compare many candidate specifications concurrently. This process may yield more robust estimates than the traditional approach.
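A small sketch of this built-in tuning, again assuming scikit-learn and synthetic data: candidate specifications are estimated and compared by cross-validation inside the fitting procedure itself, rather than a single model being chosen in advance.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Illustrative synthetic data.
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# Candidate specifications are estimated, cross-validated, and compared
# concurrently as part of fitting.
param_grid = {"n_estimators": [50, 200], "max_depth": [3, None]}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5)
search.fit(X, y)

print("Best specification:", search.best_params_)
print("Cross-validated score:", round(search.best_score_, 3))
```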
Traditional economics partially normalizes the data based on existing principles, while machine learning takes a more positive/empirical approach to model fitting. Although machine learning excels at classification, prediction, and evaluating goodness of fit, many models lack the capacity for statistical inference, which is of greater interest to economic researchers. This limitation means that economists using machine learning need to develop strategies for robust statistical causal inference, a core focus of modern empirical research. For example, researchers may wish to identify confounders and construct confidence intervals, quantities that are not well specified in most machine learning algorithms. [9]
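For instance, a conventional regression reports standard errors and confidence intervals for its coefficients, inferential output that most off-the-shelf machine learning predictors do not provide. A minimal sketch using statsmodels and synthetic data:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data: the outcome depends on a regressor of interest plus a confounder.
rng = np.random.default_rng(0)
confounder = rng.normal(size=500)
treatment = 0.5 * confounder + rng.normal(size=500)
outcome = 2.0 * treatment + 1.5 * confounder + rng.normal(size=500)

# Controlling for the observed confounder yields an estimate of the treatment
# effect together with standard errors and confidence intervals.
X = sm.add_constant(np.column_stack([treatment, confounder]))
result = sm.OLS(outcome, X).fit()

print(result.params)      # point estimates
print(result.conf_int())  # 95% confidence intervals
```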
Machine learning may effectively enable the development of more complicated heterogeneous economic models. Traditionally, heterogeneous models required extensive computational work. Since heterogeneity can arise from differences in tastes, beliefs, abilities, skills, or constraints, optimizing a heterogeneous model is considerably more demanding than the homogeneous (representative agent) approach. [10] The development of reinforcement learning and deep learning may significantly reduce the complexity of heterogeneous analysis, producing models that better reflect agents' behavior in the economy. [11]
The adoption of neural networks and deep learning in computational economics may reduce the redundant work of data cleaning and data analytics, significantly lowering the time and cost of large-scale data analysis and enabling researchers to collect and analyze data at much greater scale. [12] This would encourage economic researchers to explore new modeling methods. In addition, a reduced emphasis on routine data analysis would let researchers focus more on subjects such as causal inference, confounding variables, and the realism of the model. With proper guidance, machine learning models may accelerate the development of accurate, applicable economic models through large-scale empirical data analysis and computation. [13]
Dynamic modeling methods are frequently adopted in macroeconomic research to simulate economic fluctuations and test the effects of policy changes. DSGE models are one class of dynamic models that rely heavily on computational techniques and solutions. DSGE models use micro-founded economic principles to capture characteristics of a real-world economy in an environment with intertemporal uncertainty. Because of their inherent complexity, DSGE models are generally analytically intractable and are usually solved numerically with computer software. One major advantage of DSGE models is that they allow agents' dynamic choices to be estimated flexibly. However, many scholars have criticized DSGE models for their reliance on reduced-form assumptions that are largely unrealistic.
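To give a flavor of the numerical methods involved (a deliberately simplified sketch, not a full DSGE model), the deterministic neoclassical growth model that underlies RBC and DSGE analysis can be solved by value function iteration on a discretized capital grid; the parameter values below are illustrative:

```python
import numpy as np

# Minimal sketch: value function iteration for a deterministic neoclassical
# growth model (log utility, Cobb-Douglas production, full depreciation).
alpha, beta = 0.36, 0.95                  # capital share, discount factor
k_grid = np.linspace(0.05, 0.5, 200)      # discretized capital grid
V = np.zeros(len(k_grid))                 # initial guess for the value function

for _ in range(1000):
    # Consumption for every (k_today, k_tomorrow) pair; infeasible choices get -inf utility.
    consumption = k_grid[:, None] ** alpha - k_grid[None, :]
    utility = np.where(consumption > 0, np.log(np.maximum(consumption, 1e-12)), -np.inf)
    V_new = np.max(utility + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:   # stop once the Bellman operator has converged
        break
    V = V_new

policy = k_grid[np.argmax(utility + beta * V[None, :], axis=1)]
print("Steady-state capital (analytic):", (alpha * beta) ** (1 / (1 - alpha)))
print("Policy at grid midpoint:", policy[len(k_grid) // 2])
```

With log utility and full depreciation this model has a known closed-form solution, which makes it a convenient check on the numerical method; realistic DSGE models require far more elaborate solution and estimation machinery.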
Computational tools have long been a norm and foundation of economic research. They include a variety of computer software that facilitates matrix operations (e.g. matrix inversion) and the solution of systems of linear and nonlinear equations. Various programming languages are used in economic research for data analytics and modeling; typical languages in computational economics research include C++, MATLAB, Julia, Python, R, and Stata.
Among these programming languages, C++, as a compiled language, performs the fastest, while Python, as an interpreted language, is the slowest; MATLAB, Julia, and R strike a balance between performance and interpretability. As an early statistical analytics package, Stata was long the conventional choice, and economists embraced it as one of the most popular statistical programs because of its breadth, accuracy, flexibility, and repeatability.
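For instance (a small illustrative sketch in Python with synthetic data), estimating a linear regression by solving the normal equations reduces to the matrix operations mentioned above:

```python
import numpy as np

# Illustrative data: 100 observations, two regressors plus a constant.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

# Solve the normal equations (X'X) b = X'y with a linear solver rather than
# inverting X'X explicitly, which is faster and numerically more stable.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```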
The following journals specialise in computational economics: ACM Transactions on Economics and Computation, [14] Computational Economics, [1] Journal of Applied Econometrics, [15] Journal of Economic Dynamics and Control [16] and the Journal of Economic Interaction and Coordination. [17]
Econometrics is an application of statistical methods to economic data in order to give empirical content to economic relationships. More precisely, it is "the quantitative analysis of actual economic phenomena based on the concurrent development of theory and observation, related by appropriate methods of inference." An introductory economics textbook describes econometrics as allowing economists "to sift through mountains of data to extract simple relationships." Jan Tinbergen is one of the two founding fathers of econometrics. The other, Ragnar Frisch, also coined the term in the sense in which it is used today.
New Keynesian economics is a school of macroeconomics that strives to provide microeconomic foundations for Keynesian economics. It developed partly as a response to criticisms of Keynesian macroeconomics by adherents of new classical macroeconomics.
Experimental economics is the application of experimental methods to study economic questions. Data collected in experiments are used to estimate effect size, test the validity of economic theories, and illuminate market mechanisms. Economic experiments usually use cash to motivate subjects, in order to mimic real-world incentives. Experiments are used to help understand how and why markets and other exchange systems function as they do. Experimental economics has also expanded to the study of institutions and the law.
The Lucas critique argues that it is naïve to try to predict the effects of a change in economic policy entirely on the basis of relationships observed in historical data, especially highly aggregated historical data. More formally, it states that the decision rules of Keynesian models—such as the consumption function—cannot be considered as structural in the sense of being invariant with respect to changes in government policy variables. It was named after American economist Robert Lucas's work on macroeconomic policymaking.
Economists use the term representative agent to refer to the typical decision-maker of a certain type.
A macroeconomic model is an analytical tool designed to describe the operation of the economy of a country or a region. These models are usually designed to examine the comparative statics and dynamics of aggregate quantities such as the total amount of goods and services produced, total income earned, the level of employment of productive resources, and the level of prices.
Thomas John Sargent is an American economist and the W.R. Berkley Professor of Economics and Business at New York University. He specializes in the fields of macroeconomics, monetary economics, and time series econometrics. As of 2020, he ranks as the 29th most cited economist in the world. He was awarded the Nobel Memorial Prize in Economics in 2011 together with Christopher A. Sims for their "empirical research on cause and effect in the macroeconomy".
Economics education or economic education is a field within economics that focuses on two main themes: the economics curriculum and the techniques used to teach economics at all educational levels, and research into the effectiveness of that instruction and the level of economic literacy among various groups.
Complexity economics is the application of complexity science to the problems of economics. It relaxes several common assumptions in economics, including general equilibrium theory. While it does not reject the existence of an equilibrium, it sees such equilibria as "a special case of nonequilibrium", and as an emergent property resulting from complex interactions between economic agents. The complexity science approach has also been applied to computational economics.
Agent-based computational economics (ACE) is the area of computational economics that studies economic processes, including whole economies, as dynamic systems of interacting agents. As such, it falls in the paradigm of complex adaptive systems. In corresponding agent-based models, the "agents" are "computational objects modeled as interacting according to rules" over space and time, not real people. The rules are formulated to model behavior and social interactions based on incentives and information. Such rules could also be the result of optimization, realized through use of AI methods.
Dynamic stochastic general equilibrium modeling is a macroeconomic method which is often employed by monetary and fiscal authorities for policy analysis, explaining historical time-series data, as well as future forecasting purposes. DSGE econometric modelling applies general equilibrium theory and microeconomic principles in a tractable manner to postulate economic phenomena, such as economic growth and business cycles, as well as policy effects and market shocks.
John Duffy is an American economist. He is a professor of economics at the University of California, Irvine.
Mathematical economics is the application of mathematical methods to represent theories and analyze problems in economics. Often, these applied methods are beyond simple geometry, and may include differential and integral calculus, difference and differential equations, matrix algebra, mathematical programming, or other computational methods. Proponents of this approach claim that it allows the formulation of theoretical relationships with rigor, generality, and simplicity.
Macroeconomic theory has its origins in the study of business cycles and monetary theory. In general, early theorists believed monetary factors could not affect real factors such as real output. John Maynard Keynes attacked some of these "classical" theories and produced a general theory that described the whole economy in terms of aggregates rather than individual, microeconomic parts. Attempting to explain unemployment and recessions, he noticed the tendency for people and businesses to hoard cash and avoid investment during a recession. He argued that this invalidated the assumptions of classical economists who thought that markets always clear, leaving no surplus of goods and no willing labor left idle.
The methodology of econometrics is the study of the range of differing approaches to undertaking econometric analysis.
The ACEGES model is a decision support tool for energy policy by means of controlled computational experiments. The ACEGES tool is designed to be the foundation for large custom-purpose simulations of the global energy system. The ACEGES methodological framework, developed by Voudouris (2011) by extending Voudouris (2010), is based on the agent-based computational economics (ACE) paradigm. ACE is the computational study of economies modeled as evolving systems of autonomous interacting agents.
In economic theory and econometrics, the term heterogeneity refers to differences across the units being studied. For example, a macroeconomic model in which consumers are assumed to differ from one another is said to have heterogeneous agents.
There have been many criticisms of econometrics' usefulness as a discipline and perceived widespread methodological shortcomings in econometric modelling practices.
John Philip Rust is an American economist and econometrician. He received his PhD from MIT in 1983 and taught at the University of Wisconsin, Yale University, and the University of Maryland before joining Georgetown University in 2012. He was awarded the Frisch Medal in 1992 and became a fellow of the Econometric Society in 1993.
Anna Mikusheva is a Professor of Economics at the Massachusetts Institute of Technology. She was the 2012 recipient of the Elaine Bennett Research Prize, awarded every two years to recognize and celebrate research by a woman in the field of economics, and was selected as a Sloan Research Fellow in 2013. She is a co-editor of the journal Econometric Theory.