| Siddhartha Chib | |
|---|---|
| Alma mater | University of California, Santa Barbara |
| Scientific career | |
| Fields | Econometrics, Statistics |
| Institutions | Washington University in St. Louis |
| Thesis | Some Contributions to Likelihood Based Prediction Methods (1985) |
| Academic advisors | Sreenivasa Rao Jammalamadaka, Thomas F. Cooley |
| Website | apps |
Siddhartha Chib is an econometrician and statistician, the Harry C. Hartkopf Professor of Econometrics and Statistics at Washington University in St. Louis. His work is primarily in Bayesian statistics, econometrics, and Markov chain Monte Carlo methods.
Key papers include: Albert and Chib (1993), [1] which introduced a latent-variable approach to binary and categorical response models that simplifies their Bayesian analysis; Chib and Greenberg (1995), [2] which derived the Metropolis–Hastings algorithm from first principles and gave guidance on implementation and on multiple-block extensions; Chib (1995), [3] which developed a new method for calculating the marginal likelihood from the Gibbs output; Chib and Jeliazkov (2001), [4] which extended the method of Chib (1995) to the output of Metropolis–Hastings chains; Basu and Chib (2003), [5] which gave a method for finding marginal likelihoods in Dirichlet process mixture models; Carlin and Chib (1995), [6] which developed a model-space jump method for Bayesian model choice via Markov chain Monte Carlo; Chib (1998), [7] which introduced a multiple-change-point model estimated by the methods of Albert and Chib (1993) [8] and Chib (1996) [9] for hidden Markov processes; Kim, Shephard and Chib (1998), [10] which introduced an efficient inference approach for univariate and multivariate stochastic volatility models; [11] [12] and Chib and Greenberg (1998), [13] which developed the Bayesian analysis of the multivariate probit model.
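The latent-variable idea of Albert and Chib (1993) can be illustrated with a minimal Gibbs sampler for the Bayesian probit model. This is a sketch only, using a flat prior on the coefficients for simplicity (the paper covers proper priors and more general categorical models); it assumes numpy and scipy.

```python
import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(y, X, n_iter=2000, seed=0):
    """Data-augmentation Gibbs sampler for the probit model.

    Introduces latent z_i ~ N(x_i' beta, 1) with y_i = 1{z_i > 0};
    the sampler alternates between drawing z | beta, y (truncated
    normals) and beta | z (a standard normal linear-model update).
    Flat prior on beta, so beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1}).
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = np.zeros(k)
    draws = np.empty((n_iter, k))
    for t in range(n_iter):
        mu = X @ beta
        # Standardized truncation bounds: z_i in (0, inf) if y_i = 1,
        # else (-inf, 0), relative to mean mu and unit variance.
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, random_state=rng)
        b_hat = XtX_inv @ X.T @ z
        beta = rng.multivariate_normal(b_hat, XtX_inv)
        draws[t] = beta
    return draws
```

Because every conditional distribution is available in closed form, no Metropolis accept/reject step is needed, which is the practical appeal of the data-augmentation approach.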
He has also developed original methods for Bayesian inference in Tobit censored responses, [14] discretely observed diffusions, [15] univariate and multivariate ARMA processes, [16] [17] multivariate count responses, [18] causal inference, [19] [20] hierarchical models of longitudinal data, [21] nonparametric regression, [22] [23] and unconditional and conditional moment models. [24] [25]
He received a bachelor's degree from St. Stephen’s College, Delhi, in 1979, an M.B.A. from the Indian Institute of Management, Ahmedabad, in 1982, and a Ph.D. in economics from the University of California, Santa Barbara, in 1986. [26] His advisors were Sreenivasa Rao Jammalamadaka and Thomas F. Cooley.
He is a fellow of the American Statistical Association (2001), [27] the International Society of Bayesian Analysis (2012), [28] and the Journal of Econometrics (1996). [29]
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Fundamentally, Bayesian inference uses prior knowledge, in the form of a prior distribution, to estimate posterior probabilities. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
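The prior-to-posterior update is easiest to see in a conjugate example (illustrative, not from the article): a Beta prior on a coin's heads probability updated by binomial data.

```python
from scipy.stats import beta

# Prior Beta(2, 2) on a coin's heads probability; observe 7 heads, 3 tails.
a0, b0 = 2, 2
heads, tails = 7, 3

# Conjugate update: the posterior is Beta(a0 + heads, b0 + tails).
a_post, b_post = a0 + heads, b0 + tails
posterior_mean = a_post / (a_post + b_post)          # 9/14, about 0.643

# A 95% equal-tailed credible interval from the posterior quantiles.
lo, hi = beta.ppf([0.025, 0.975], a_post, b_post)
```

As more flips arrive, the same update can be applied sequentially, which is the "Bayesian updating" mentioned above.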
A Bayesian network is a probabilistic graphical model that represents a set of variables and their conditional dependencies via a directed acyclic graph (DAG). While it is one of several forms of causal notation, causal networks are special cases of Bayesian networks. Bayesian networks are ideal for taking an event that occurred and predicting the likelihood that any one of several possible known causes was the contributing factor. For example, a Bayesian network could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of the presence of various diseases.
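The disease-and-symptom example reduces, in the simplest two-node network Disease → Symptom, to a direct application of Bayes' theorem (numbers below are illustrative):

```python
# P(disease), and the conditional probabilities of the symptom.
p_disease = 0.01
p_symptom_given_disease = 0.9
p_symptom_given_healthy = 0.1

# Marginalize over the parent node to get P(symptom).
p_symptom = (p_symptom_given_disease * p_disease
             + p_symptom_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | symptom) = 0.009 / 0.108 = 1/12.
p_disease_given_symptom = p_symptom_given_disease * p_disease / p_symptom
```

Larger networks apply the same computation along the DAG, with the factorization over the graph keeping the joint distribution tractable.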
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution.
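A minimal instance of this idea is the random-walk Metropolis–Hastings sampler (a generic sketch, assuming numpy; the target is supplied as a log-density):

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + step * eps with eps ~ N(0, 1),
    accept with probability min(1, target(x') / target(x))."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        prop = x + step * rng.normal()
        # Accept/reject on the log scale for numerical stability.
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[t] = x
    return samples

# Example: sample from a standard normal via its unnormalized log-density.
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000, step=2.0, seed=1)
```

Because the acceptance ratio only involves a ratio of target densities, the normalizing constant of the target never needs to be known, which is why the method is so widely applicable.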
Christian Gouriéroux is an econometrician who holds a Doctor of Philosophy in mathematics from the University of Rouen. He holds the exceptional-class professor rank in France. Gouriéroux is now a professor at the University of Toronto and at CREST, Paris (Center for Research in Economics and Statistics).
Bayesian inference of phylogeny combines the information in the prior and in the data likelihood to create the so-called posterior probability of trees, which is the probability that the tree is correct given the data, the prior and the likelihood model. Bayesian inference was introduced into molecular phylogenetics in the 1990s by three independent groups: Bruce Rannala and Ziheng Yang in Berkeley, Bob Mau in Madison, and Shuying Li at the University of Iowa, the last two being PhD students at the time. The approach has become very popular since the release of the MrBayes software in 2001, and is now one of the most popular methods in molecular phylogenetics.
Neil Shephard, FBA, is an econometrician, currently the Frank B. Baird Jr. Professor of Science in the Department of Economics and the Department of Statistics at Harvard University.
In statistics and econometrics, the multivariate probit model is a generalization of the probit model used to estimate several correlated binary outcomes jointly. For example, if it is believed that the decisions of sending at least one child to public school and that of voting in favor of a school budget are correlated, then the multivariate probit model would be appropriate for jointly predicting these two choices on an individual-specific basis. J.R. Ashford and R.R. Sowden initially proposed an approach for multivariate probit analysis. Siddhartha Chib and Edward Greenberg extended this idea and also proposed simulation-based inference methods for the multivariate probit model which simplified and generalized parameter estimation.
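The model's latent structure can be illustrated by simulation (this is the data-generating process the multivariate probit is built on, not the Chib–Greenberg estimation algorithm itself; all numbers are illustrative):

```python
import numpy as np

# Bivariate probit DGP: two binary outcomes driven by correlated
# latent normals with correlation rho; y_j = 1 iff latent z_j > 0.
rng = np.random.default_rng(0)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Latent means 0.2 and -0.3 play the role of x'beta for each equation.
z = rng.multivariate_normal([0.2, -0.3], cov, size=100_000)
y = (z > 0).astype(int)

# Marginally each y_j is a univariate probit: P(y_1 = 1) = Phi(0.2) ~ 0.58,
# but jointly the outcomes are positively correlated through rho.
```

Estimation runs this logic in reverse: given the observed binary pairs, the latent correlation matrix and coefficients are inferred, which is what the simulation-based methods referred to above make practical.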
Bayesian econometrics is a branch of econometrics which applies Bayesian principles to economic modelling. Bayesianism is based on a degree-of-belief interpretation of probability, as opposed to a relative-frequency interpretation.
In financial econometrics, the Markov-switching multifractal (MSM) is a model of asset returns developed by Laurent E. Calvet and Adlai J. Fisher that incorporates stochastic volatility components of heterogeneous durations. MSM captures the outliers, long-memory-like volatility persistence and power variation of financial returns. In currency and equity series, MSM compares favorably with standard volatility models such as GARCH(1,1) and FIGARCH both in- and out-of-sample. MSM is used by practitioners in the financial industry to forecast volatility, compute value-at-risk, and price derivatives.
Anil K. Bera is an Indian-American econometrician. He is Professor of Economics at University of Illinois at Urbana–Champaign's Department of Economics. He is most noted for his work with Carlos Jarque on the Jarque–Bera test.
Herman Koene van Dijk is a Dutch economist, a consultant at the Research Department of Norges Bank and Professor Emeritus at the Econometric Institute of Erasmus University Rotterdam, known for his contributions to the field of Bayesian analysis.
Laurent-Emmanuel Calvet is a French economist and a professor of finance. He is Vice President Elect of the European Finance Association.
Alan Enoch Gelfand is an American statistician, and is currently the James B. Duke Professor of Statistics and Decision Sciences at Duke University. Gelfand’s research includes substantial contributions to the fields of Bayesian statistics, spatial statistics and hierarchical modeling.
Éric Moulines is a French researcher in statistical learning and signal processing. He received the silver medal of the CNRS in 2010 and the France Télécom prize, awarded in collaboration with the French Academy of Sciences, in 2011. He was appointed a Fellow of the European Association for Signal Processing in 2012 and of the Institute of Mathematical Statistics in 2016. He is General Engineer of the Corps des Mines (X81).
Stéphane Bonhomme is a French economist currently at the University of Chicago, where he is the Ann L. and Lawrence B. Buttenwieser Professor of Economics. Bonhomme specializes in microeconometrics. His research involves latent variable modeling, modeling of unobserved heterogeneity in panel data, and its applications in labor economics, in particular the analysis of earnings inequality and dynamics.
In statistics, cluster analysis is the algorithmic grouping of objects into homogeneous groups based on numerical measurements. Model-based clustering bases this on a statistical model for the data, usually a mixture model. This has several advantages, including a principled statistical basis for clustering, and ways to choose the number of clusters, to choose the best clustering model, to assess the uncertainty of the clustering, and to identify outliers that do not belong to any group.
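A minimal sketch of model-based clustering is the EM algorithm for a two-component one-dimensional Gaussian mixture (illustrative, numpy only; real implementations also handle model selection and degenerate components):

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture.

    E-step: compute each point's responsibility under each component;
    M-step: re-estimate mixture weights, means, and standard deviations
    from the responsibility-weighted data.
    """
    mu = np.array([x.min(), x.max()])       # crude initial means
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to pi_k * N(x_i; mu_k, sigma_k)
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted updates of the mixture parameters
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```

The fitted responsibilities give a soft clustering, and the same likelihood machinery is what allows criteria such as BIC to compare mixtures with different numbers of components.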