| Robert E. Kass | |
| --- | --- |
| Born | September 7, 1952, Boston, Massachusetts, USA |
| Nationality | American |
| Alma mater | University of Chicago (PhD); Antioch College (BA) |
| Known for | Computational Neuroscience, Bayesian Statistics |
| Awards | R. A. Fisher Lectureship; Elected to the National Academy of Sciences |
| Scientific career | |
| Fields | Statistics |
| Institutions | Carnegie Mellon University |
| Thesis | (1980) |
| Doctoral advisor | Stephen Stigler |
| Website | www |
Robert E. Kass is the Maurice Falk University Professor of Statistics and Computational Neuroscience in the Department of Statistics and Data Science, the Machine Learning Department, and the Neuroscience Institute at Carnegie Mellon University.
Born in Boston, Massachusetts, in 1952, Kass earned a Bachelor of Arts in mathematics from Antioch College and a PhD in statistics from the University of Chicago in 1980, where his advisor was Stephen Stigler. Kass is the son of the late Harvard medical researcher Edward H. Kass [1] and stepson of the late Amalie M. Kass. His sister is the bioethicist Nancy Kass.
Kass's early research was on differential geometry in statistics, [2] which formed the basis for his book Geometrical Foundations of Asymptotic Inference [3] (with Paul Vos), and on Bayesian methods. Since 2000 his research has focused on statistical methods in neuroscience.
Kass's best-known work includes a comprehensive re-evaluation of Bayesian hypothesis testing and model selection, [4] [5] the selection of prior distributions, [6] the relationship of Bayes and Empirical Bayes methods, [7] Bayesian asymptotics, [8] [9] the application of point process statistical models to neural spiking data, [10] [11] the challenges of multiple spike train analysis, [12] [13] the state-space approach to brain-computer interfaces, [14] and the brain's apparent ability to solve the credit assignment problem during brain-controlled robotic movement. [15] Kass's book Analysis of Neural Data [16] (with Emery Brown and Uri Eden) was published in 2014. Kass has also written on statistics education and the use of statistics, including the articles "What is Statistics?", [17] "Statistical Inference: The Big Picture", [18] and "Ten Simple Rules for Effective Statistical Practice". [19]
Kass has served as Chair of the Section for Bayesian Statistical Science of the American Statistical Association, Chair of the Statistics Section of the American Association for the Advancement of Science, founding Editor-in-Chief of the journal Bayesian Analysis, and Executive Editor (editor-in-chief) of the international review journal Statistical Science. At Carnegie Mellon University he was Department Head of Statistics from 1995 to 2004 and Interim Co-director of the joint CMU–University of Pittsburgh Center for the Neural Basis of Cognition from 2015 to 2018. [20] [21]
Kass is an elected Fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the American Association for the Advancement of Science, and an elected member of the National Academy of Sciences. [22] For his work on statistical modeling of neural synchrony, [23] in 2013 he received the Outstanding Statistical Application Award from the American Statistical Association, and in 2017 he received the R.A. Fisher Award and Lectureship, now known as the COPSS Distinguished Achievement Award and Lectureship, from the Committee of Presidents of Statistical Societies.
Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.
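The estimation side of inference can be sketched in a few lines of Python: draw a finite sample from a (here, simulated) population, then report a point estimate and an approximate confidence interval for the population mean. The population parameters and sample size below are made-up assumptions for illustration.

```python
import math
import random

# Illustrative sketch: the "larger population" is simulated as a normal
# distribution with a known mean, and inference recovers that mean from
# a finite sample. All numbers here are made-up assumptions.
random.seed(1)
true_mean = 2.0
sample = [random.gauss(true_mean, 1.0) for _ in range(100)]

n = len(sample)
estimate = sum(sample) / n  # point estimate of the population mean
sd = math.sqrt(sum((v - estimate) ** 2 for v in sample) / (n - 1))
se = sd / math.sqrt(n)      # standard error of the mean
ci = (estimate - 1.96 * se, estimate + 1.96 * se)  # approximate 95% interval
```

With a larger sample the interval shrinks around the population mean, which is the sense in which the inference targets the underlying distribution rather than the sample itself.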
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution.
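One minimal member of this class of algorithms is the random-walk Metropolis sampler: propose a small random move and accept it with a probability determined by the ratio of target densities, which leaves the target as the chain's equilibrium distribution. The target below, a standard normal known only up to a constant, and the step size are illustrative choices.

```python
import math
import random

def metropolis_hastings(log_density, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis: a simple MCMC scheme whose equilibrium
    distribution is the (unnormalized) target density."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_size)
        # Accept with probability min(1, p(proposal) / p(x)).
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, known only up to its normalizing constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

As the passage notes, the longer the chain runs, the closer the sample mean and variance get to the target's values (here 0 and 1).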
The Granger causality test is a statistical hypothesis test for determining whether one time series is useful in forecasting another, first proposed in 1969. Ordinarily, regressions reflect "mere" correlations, but Clive Granger argued that causality in economics could be tested for by measuring the ability to predict the future values of a time series using prior values of another time series. Since the question of "true causality" is deeply philosophical, and because of the post hoc ergo propter hoc fallacy of assuming that one thing preceding another can be used as proof of causation, econometricians assert that the Granger test finds only "predictive causality". Using the term "causality" alone is a misnomer, as Granger causality is better described as "precedence" or, as Granger himself later claimed in 1977, "temporally related". Rather than testing whether X causes Y, the Granger causality test tests whether X forecasts Y.
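Mechanically, the test compares two nested regressions: one predicting Y from its own past, and one that also includes the past of X; a large F-statistic indicates that X helps forecast Y. The sketch below implements this comparison from scratch with NumPy on simulated data (the lag length, coefficients, and sample size are illustrative assumptions, not any particular library's implementation).

```python
import numpy as np

def granger_f(y, x, p=1):
    """F-statistic for "x Granger-causes y": compare an AR(p) model of y
    (restricted) against one that also includes p lags of x (unrestricted)."""
    n = len(y)
    def lags(v):
        # Columns v[t-1], ..., v[t-p] for t = p, ..., n-1.
        return np.column_stack([v[p - k : n - k] for k in range(1, p + 1)])
    target = y[p:]
    X_r = np.column_stack([np.ones(n - p), lags(y)])  # y's own past only
    X_u = np.column_stack([X_r, lags(x)])             # ... plus x's past
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ beta
        return resid @ resid
    rss_r, rss_u = rss(X_r), rss(X_u)
    df = n - p - X_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df)

# Simulated data in which x genuinely helps forecast y (coefficients are
# arbitrary): x should Granger-cause y, but not the reverse.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
```

On this data `granger_f(y, x)` is large while `granger_f(x, y)` is not, illustrating that the test is directional: it asks only whether past X improves forecasts of Y.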
Stephen Elliott Fienberg was a professor emeritus in the Department of Statistics, the Machine Learning Department, Heinz College, and Cylab at Carnegie Mellon University. Fienberg was the founding co-editor of the Annual Review of Statistics and Its Application and of the Journal of Privacy and Confidentiality.
Zoubin Ghahramani FRS is a British-Iranian researcher and Professor of Information Engineering at the University of Cambridge. He holds joint appointments at University College London and the Alan Turing Institute, and has been a Fellow of St John's College, Cambridge since 2009. He was Associate Research Professor at Carnegie Mellon University School of Computer Science from 2003 to 2012. He was also the Chief Scientist of Uber from 2016 until 2020. He joined Google Brain in 2020 as senior research director. He is also Deputy Director of the Leverhulme Centre for the Future of Intelligence.
Emery Neal Brown is an American statistician, computational neuroscientist, and anesthesiologist. He is the Warren M. Zapol Professor of Anesthesia at Harvard Medical School and at Massachusetts General Hospital (MGH), and a practicing anesthesiologist at MGH. At MIT he is the Edward Hood Taplin Professor of Medical Engineering and professor of computational neuroscience, the associate director of the Institute for Medical Engineering and Science, and the Director of the Harvard–MIT Program in Health Sciences and Technology.
Marcel Just is D. O. Hebb Professor of Psychology at Carnegie Mellon University. His research uses brain imaging (fMRI) in high-level cognitive tasks to study the neuroarchitecture of cognition. Just's areas of expertise include psycholinguistics, object recognition, and autism, with particular attention to cognitive and neural substrates. Just directs the Center for Cognitive Brain Imaging and is a member of the Center for the Neural Basis of Cognition at CMU.
Bayesian approaches to brain function investigate the capacity of the nervous system to operate in situations of uncertainty in a fashion that is close to the optimal prescribed by Bayesian statistics. This term is used in behavioural sciences and neuroscience and studies associated with this term often strive to explain the brain's cognitive abilities based on statistical principles. It is frequently assumed that the nervous system maintains internal probabilistic models that are updated by neural processing of sensory information using methods approximating those of Bayesian probability.
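A standard illustration of such near-optimal behaviour is sensory cue combination: under Gaussian assumptions, the Bayesian posterior weights each cue by its precision (inverse variance), which is the benchmark perceptual studies compare against. A minimal sketch, with made-up cue values and reliabilities:

```python
# Sketch of precision-weighted cue combination under Gaussian assumptions:
# the posterior mean weights each cue by its precision (1 / variance).
# The cue values and variances below are made up for illustration.
def combine_cues(mu1, var1, mu2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2
    post_mean = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    post_var = 1.0 / (w1 + w2)
    return post_mean, post_var

# A reliable cue at 0.0 and a noisier cue at 4.0: the combined estimate
# sits much closer to the reliable cue, and is more certain than either.
post_mean, post_var = combine_cues(mu1=0.0, var1=1.0, mu2=4.0, var2=4.0)
```

The posterior variance is smaller than either cue's variance, capturing the claim that maintaining an internal probabilistic model lets the nervous system reduce uncertainty by integrating information sources.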
Larry Alan Wasserman is a Canadian-American statistician and a professor in the Department of Statistics & Data Science and the Machine Learning Department at Carnegie Mellon University.
A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.
Stuart Alan Geman is an American mathematician, known for influential contributions to computer vision, statistics, probability theory, machine learning, and the neurosciences. He and his brother, Donald Geman, are well known for proposing the Gibbs sampler, and for the first proof of convergence of the simulated annealing algorithm.
Eric Poe Xing is an American computer scientist whose research spans machine learning, computational biology, and statistical methodology. Xing is founding President of the world’s first artificial intelligence university, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI).
Alan Enoch Gelfand is an American statistician, and is currently the James B. Duke Professor of Statistics and Decision Sciences at Duke University. Gelfand’s research includes substantial contributions to the fields of Bayesian statistics, spatial statistics and hierarchical modeling.
Elizabeth H. Slate is an American statistician, interested in the Bayesian statistics of longitudinal data and applications to health. She is the Duncan McLean and Pearl Levine Fairweather Professor of Statistics at Florida State University. Some of Slate's most heavily cited work concerns the effects of selenium on cancer. Slate's research has also included work on the early detection of osteoarthritis.
Kathryn M. Roeder is an American statistician known for her development of statistical methods to uncover the genetic basis of complex disease and her contributions to mixture models, semiparametric inference, and multiple testing. Roeder holds positions as professor of statistics and professor of computational biology at Carnegie Mellon University, where she leads a project focused on discovering genes associated with autism.
Raquel Prado is a Venezuelan Bayesian statistician. She is a professor of statistics in the Jack Baskin School of Engineering of the University of California, Santa Cruz, and has been elected president of the International Society for Bayesian Analysis for the 2019 term.
Siddhartha Chib is an econometrician and statistician, the Harry C. Hartkopf Professor of Econometrics and Statistics at Washington University in St. Louis. His work is primarily in Bayesian statistics, econometrics, and Markov chain Monte Carlo methods.
Bayesian history matching is a statistical method for calibrating complex computer models. The equations inside many scientific computer models contain parameters which have a true value, but that true value is often unknown; history matching is one technique for learning what these parameters could be.
Roderick Joseph Alexander Little is an academic statistician, whose main research contributions lie in the statistical analysis of data with missing values and the analysis of complex sample survey data. Little is Richard D. Remington Distinguished University Professor of Biostatistics in the Department of Biostatistics at the University of Michigan, where he also holds academic appointments in the Department of Statistics and the Institute for Social Research.
Michael David Escobar is an American biostatistician known for his work on Bayesian nonparametrics and mixture models.