Robert Kass

Robert E. Kass
Born September 7, 1952 (age 71)
Nationality American
Alma mater University of Chicago (PhD)
Antioch College (BA)
Known for Computational Neuroscience, Bayesian Statistics
Awards R. A. Fisher Lectureship, Elected to the National Academy of Sciences
Scientific career
Fields Statistics
Institutions Carnegie Mellon University
Thesis  (1980)
Doctoral advisor Stephen Stigler
Website www.stat.cmu.edu/~kass/

Robert E. Kass is the Maurice Falk University Professor of Statistics and Computational Neuroscience in the Department of Statistics and Data Science, the Machine Learning Department, and the Neuroscience Institute at Carnegie Mellon University.

Early life and education

Born in Boston, Massachusetts, in 1952, Kass earned a Bachelor of Arts degree in mathematics from Antioch College and a PhD in statistics from the University of Chicago in 1980, where his advisor was Stephen Stigler. Kass is the son of the late Harvard medical researcher Edward H. Kass [1] and stepson of the late Amalie M. Kass. His sister is the bioethicist Nancy Kass.

Research and publications

Kass's early research was on differential geometry in statistics, [2] which formed the basis for his book Geometrical Foundations of Asymptotic Inference [3] (with Paul Vos), and on Bayesian methods. Since 2000 his research has focused on statistical methods in neuroscience.

Kass's best-known work includes a comprehensive re-evaluation of Bayesian hypothesis testing and model selection, [4] [5] the selection of prior distributions, [6] the relationship of Bayes and Empirical Bayes methods, [7] Bayesian asymptotics, [8] [9] the application of point process statistical models to neural spiking data, [10] [11] the challenges of multiple spike train analysis, [12] [13] the state-space approach to brain-computer interfaces, [14] and the brain's apparent ability to solve the credit assignment problem during brain-controlled robotic movement. [15] Kass's book Analysis of Neural Data [16] (with Emery Brown and Uri Eden) was published in 2014. Kass has also written on statistics education and the use of statistics, including the articles "What Is Statistics?", [17] "Statistical Inference: The Big Picture", [18] and "Ten Simple Rules for Effective Statistical Practice". [19]

Professional and administrative activities

Kass has served as Chair of the Section for Bayesian Statistical Science of the American Statistical Association, Chair of the Statistics Section of the American Association for the Advancement of Science, founding Editor-in-Chief of the journal Bayesian Analysis, and Executive Editor (editor-in-chief) of the international review journal Statistical Science. At Carnegie Mellon University he was Head of the Department of Statistics from 1995 to 2004 and Interim Co-director of the joint CMU–University of Pittsburgh Center for the Neural Basis of Cognition from 2015 to 2018. [20] [21]

Honors

Kass is an elected Fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the American Association for the Advancement of Science, and an elected member of the National Academy of Sciences. [22] For his work on statistical modeling of neural synchrony, [23] in 2013 he received the Outstanding Statistical Application Award from the American Statistical Association, and in 2017 he received the R.A. Fisher Award and Lectureship, now known as the COPSS Distinguished Achievement Award and Lectureship, from the Committee of Presidents of Statistical Societies.

Related Research Articles

Statistical inference

Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.

In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution.
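The construction described above can be sketched in a few lines. Below is a minimal random-walk Metropolis sampler, one standard way to build such a Markov chain; the target density, step size, chain length, and burn-in below are illustrative choices, not taken from any specific work discussed in this article:

```python
import math
import random

def metropolis_sample(log_density, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, target(proposal) / target(current))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        log_ratio = log_density(proposal) - log_density(x)
        if math.log(rng.random() or 1e-300) < log_ratio:
            x = proposal  # accept; otherwise keep the current state
        samples.append(x)
    return samples

# Target: a standard normal, known only up to its normalizing constant.
chain = metropolis_sample(lambda x: -0.5 * x * x, x0=3.0, n_steps=20000)
kept = chain[2000:]  # discard early steps as burn-in
mean = sum(kept) / len(kept)
var = sum((v - mean) ** 2 for v in kept) / len(kept)
```

As the paragraph notes, the more steps are retained, the closer the empirical mean and variance of the chain get to those of the target (here 0 and 1).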

Granger causality

The Granger causality test is a statistical hypothesis test for determining whether one time series is useful in forecasting another, first proposed in 1969. Ordinarily, regressions reflect "mere" correlations, but Clive Granger argued that causality in economics could be tested for by measuring the ability to predict the future values of a time series using prior values of another time series. Since the question of "true causality" is deeply philosophical, and because of the post hoc ergo propter hoc fallacy of assuming that one thing preceding another can be used as a proof of causation, econometricians assert that the Granger test finds only "predictive causality". Using the term "causality" alone is a misnomer, as Granger-causality is better described as "precedence", or, as Granger himself later claimed in 1977, "temporally related". Rather than testing whether X causes Y, the Granger causality test asks whether X forecasts Y.
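The mechanics of the test can be illustrated with a small sketch: regress Y on its own past, regress it again with lagged X added, and compare the residual sums of squares via an F-statistic. This is a minimal single-lag illustration, not a full econometric implementation; the simulated series, seed, and helper functions are hypothetical choices:

```python
import random

def ols_rss(y, X):
    """Residual sum of squares from ordinary least squares,
    solved via the normal equations with Gaussian elimination."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):  # elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(bi * xi for bi, xi in zip(beta, row))) ** 2
               for yi, row in zip(y, X))

def granger_f(x, y, lag=1):
    """F-statistic: does lagged x improve the forecast of y
    beyond lagged y alone? (single lag, one restriction)"""
    n = len(y) - lag
    target = y[lag:]
    restricted = [[1.0, y[t]] for t in range(n)]
    unrestricted = [[1.0, y[t], x[t]] for t in range(n)]
    rss_r, rss_u = ols_rss(target, restricted), ols_rss(target, unrestricted)
    return (rss_r - rss_u) / (rss_u / (n - 3))

# Simulate series where x leads y by one step.
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(500)]
y = [0.0]
for t in range(1, 500):
    y.append(0.3 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0, 0.5))

f_xy = granger_f(x, y)  # large: past x helps forecast y
f_yx = granger_f(y, x)  # small: past y does not help forecast x
```

The asymmetry of the two F-statistics is what the test reports: X "Granger-causes" (forecasts) Y, but not the reverse.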

Stephen Elliott Fienberg was a professor emeritus in the Department of Statistics, the Machine Learning Department, Heinz College, and Cylab at Carnegie Mellon University. Fienberg was the founding co-editor of the Annual Review of Statistics and Its Application and of the Journal of Privacy and Confidentiality.

Zoubin Ghahramani

Zoubin Ghahramani FRS is a British-Iranian researcher and Professor of Information Engineering at the University of Cambridge. He holds joint appointments at University College London and the Alan Turing Institute, and has been a Fellow of St John's College, Cambridge since 2009. He was Associate Research Professor at the Carnegie Mellon University School of Computer Science from 2003 to 2012. He was also the Chief Scientist of Uber from 2016 until 2020. He joined Google Brain in 2020 as senior research director. He is also Deputy Director of the Leverhulme Centre for the Future of Intelligence.

Emery Neal Brown is an American statistician, computational neuroscientist, and anesthesiologist. He is the Warren M. Zapol Professor of Anesthesia at Harvard Medical School and at Massachusetts General Hospital (MGH), and a practicing anesthesiologist at MGH. At MIT he is the Edward Hood Taplin Professor of Medical Engineering and professor of computational neuroscience, the associate director of the Institute for Medical Engineering and Science, and the Director of the Harvard–MIT Program in Health Sciences and Technology.

Marcel Just

Marcel Just is D. O. Hebb Professor of Psychology at Carnegie Mellon University. His research uses brain imaging (fMRI) in high-level cognitive tasks to study the neuroarchitecture of cognition. Just's areas of expertise include psycholinguistics, object recognition, and autism, with particular attention to cognitive and neural substrates. Just directs the Center for Cognitive Brain Imaging and is a member of the Center for the Neural Basis of Cognition at CMU.

Bayesian approaches to brain function investigate the capacity of the nervous system to operate in situations of uncertainty in a fashion that is close to the optimal prescribed by Bayesian statistics. This term is used in behavioural sciences and neuroscience and studies associated with this term often strive to explain the brain's cognitive abilities based on statistical principles. It is frequently assumed that the nervous system maintains internal probabilistic models that are updated by neural processing of sensory information using methods approximating those of Bayesian probability.
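A standard toy illustration of this idea is optimal cue combination: two independent, noisy Gaussian estimates of the same quantity combine into a precision-weighted average, which is exactly the Bayesian posterior. The numbers below are made up for illustration and are not drawn from any study cited here:

```python
def fuse_gaussian_cues(mu_a, var_a, mu_b, var_b):
    """Bayes-optimal fusion of two independent Gaussian cues:
    the posterior mean is a precision-weighted average, and the
    posterior variance is smaller than either cue's variance."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    mu = w_a * mu_a + (1 - w_a) * mu_b
    var = 1 / (1 / var_a + 1 / var_b)
    return mu, var

# Vision says the object is at 10.0 (reliable, variance 1);
# touch says 14.0 (noisy, variance 4).
mu, var = fuse_gaussian_cues(10.0, 1.0, 14.0, 4.0)
# The fused estimate (10.8) sits closer to the more reliable cue,
# and its variance (0.8) is below both cue variances.
```

Behavioral experiments in this literature test whether human estimates track such precision-weighted combinations.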

Larry A. Wasserman

Larry Alan Wasserman is a Canadian-American statistician and a professor in the Department of Statistics & Data Science and the Machine Learning Department at Carnegie Mellon University.

A Bayesian Confidence Propagation Neural Network (BCPNN) is an artificial neural network inspired by Bayes' theorem, which regards neural computation and processing as probabilistic inference. Neural unit activations represent probability ("confidence") in the presence of input features or categories, synaptic weights are based on estimated correlations and the spread of activation corresponds to calculating posterior probabilities. It was originally proposed by Anders Lansner and Örjan Ekeberg at KTH Royal Institute of Technology. This probabilistic neural network model can also be run in generative mode to produce spontaneous activations and temporal sequences.

Stuart Geman

Stuart Alan Geman is an American mathematician, known for influential contributions to computer vision, statistics, probability theory, machine learning, and the neurosciences. He and his brother, Donald Geman, are well known for proposing the Gibbs sampler, and for the first proof of convergence of the simulated annealing algorithm.
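The Gibbs sampler mentioned above draws from a joint distribution by repeatedly sampling each variable from its conditional distribution given the current values of the others. A minimal sketch for a bivariate normal target (a textbook illustration, not code from the Gemans' paper) follows:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_steps, seed=0):
    """Gibbs sampler for a bivariate standard normal with correlation
    rho: alternate draws from each coordinate's conditional."""
    rng = random.Random(seed)
    x = y = 0.0
    cond_sd = math.sqrt(1.0 - rho * rho)
    samples = []
    for _ in range(n_steps):
        x = rng.gauss(rho * y, cond_sd)  # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, cond_sd)  # y | x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.9, n_steps=20000)
n = len(samples)
mx = sum(x for x, _ in samples) / n
my = sum(y for _, y in samples) / n
cov = sum((x - mx) * (y - my) for x, y in samples) / n
vx = sum((x - mx) ** 2 for x, _ in samples) / n
vy = sum((y - my) ** 2 for _, y in samples) / n
corr = cov / math.sqrt(vx * vy)  # approaches 0.9 as the chain lengthens
```

Each full sweep leaves the target joint distribution invariant, which is why the empirical correlation of the chain converges to the target's correlation.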

Eric Xing

Eric Poe Xing is an American computer scientist whose research spans machine learning, computational biology, and statistical methodology. Xing is founding President of the world’s first artificial intelligence university, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI).

Alan Enoch Gelfand is an American statistician, and is currently the James B. Duke Professor of Statistics and Decision Sciences at Duke University. Gelfand’s research includes substantial contributions to the fields of Bayesian statistics, spatial statistics and hierarchical modeling.

Elizabeth H. Slate is an American statistician, interested in the Bayesian statistics of longitudinal data and applications to health. She is the Duncan McLean and Pearl Levine Fairweather Professor of Statistics at Florida State University. Some of Slate's most heavily cited work concerns the effects of selenium on cancer. Slate's research has also included work on the early detection of osteoarthritis.

Kathryn M. Roeder is an American statistician known for her development of statistical methods to uncover the genetic basis of complex disease and her contributions to mixture models, semiparametric inference, and multiple testing. Roeder holds positions as professor of statistics and professor of computational biology at Carnegie Mellon University, where she leads a project focused on discovering genes associated with autism.

Raquel Prado is a Venezuelan Bayesian statistician. She is a professor of statistics in the Jack Baskin School of Engineering of the University of California, Santa Cruz, and has been elected president of the International Society for Bayesian Analysis for the 2019 term.

Siddhartha Chib is an econometrician and statistician, the Harry C. Hartkopf Professor of Econometrics and Statistics at Washington University in St. Louis. His work is primarily in Bayesian statistics, econometrics, and Markov chain Monte Carlo methods.

Bayesian history matching is a statistical method for calibrating complex computer models. The equations inside many scientific computer models contain parameters which have a true value, but that true value is often unknown; history matching is one technique for learning what these parameters could be.

Roderick J. A. Little

Roderick Joseph Alexander Little is an academic statistician, whose main research contributions lie in the statistical analysis of data with missing values and the analysis of complex sample survey data. Little is Richard D. Remington Distinguished University Professor of Biostatistics in the Department of Biostatistics at the University of Michigan, where he also holds academic appointments in the Department of Statistics and the Institute for Social Research.

Michael David Escobar is an American biostatistician known for his work on Bayesian nonparametrics and mixture models.

References

  1. "Edward H. Kass, M.D., Eulogy".
  2. Kass, Robert E. (1989). "The Geometry of Asymptotic Inference (with discussion)". Statistical Science. 4 (3): 188–234. doi:10.1214/ss/1177012480. JSTOR 2245626. S2CID 119728605.
  3. Kass, Robert E.; Vos, Paul (1997). Geometrical Foundations of Asymptotic Inference. doi:10.1002/9781118165980. ISBN 9780471826682.
  4. Kass, Robert E.; Raftery, Adrian E. (1995). "Bayes Factors". Journal of the American Statistical Association. 90 (430): 773–795. doi:10.1080/01621459.1995.10476572.
  5. Kass, Robert E.; Wasserman, Larry A. (1995). "A Reference Bayesian Test for Nested Hypotheses and its Relationship to the Schwarz Criterion". Journal of the American Statistical Association. 90 (431): 928–934. doi:10.1080/01621459.1995.10476592. S2CID 120491167.
  6. Kass, Robert E.; Wasserman, Larry A. (1996). "The Selection of Prior Distributions by Formal Rules". Journal of the American Statistical Association. 91 (435): 1343–1370. doi:10.1080/01621459.1996.10477003. S2CID 53645083.
  7. Kass, Robert E.; Steffey, Duane (1989). "Approximate Bayesian Inference in Conditionally Independent Hierarchical Models (Parametric Empirical Bayes Models)". Journal of the American Statistical Association. 84 (407): 717–726. doi:10.1080/01621459.1989.10478825.
  8. Kass, Robert E.; Tierney, Luke (1989). "Fully Exponential Laplace Approximations to Expectations and Variances of Nonpositive Functions". Journal of the American Statistical Association. 84 (407): 710–716. doi:10.1080/01621459.1989.10478824. S2CID 16075665.
  9. Kass, Robert E.; Tierney, Luke; Kadane, Joseph B. (1990). "The validity of posterior expansions based on Laplace's method". In Geisser, S.; Hodges, J.S.; Press, S.J.; Zellner, A. (eds.). Essays in Honor of George A. Barnard. Amsterdam: North Holland. pp. 473–488.
  10. Kass, Robert E.; Ventura, Valerie (2001). "A spike-train probability model". Neural Computation. 13 (8): 1713–1720. doi:10.1162/08997660152469314. PMID 11506667. S2CID 1840562.
  11. DiMatteo, Ilaria; Genovese, Christopher R.; Kass, Robert E. (2001). "Bayesian curve-fitting with free-knot splines". Biometrika. 88 (4): 1055–1071. doi:10.1093/biomet/88.4.1055.
  12. Brown, Emery N.; Mitra, Partha P.; Kass, Robert E. (2004). "Multiple neural spike train data analysis: state-of-the-art and future challenges". Nature Neuroscience. 7 (4): 456–461. doi:10.1038/nn1228. PMID 15114358. S2CID 562815.
  13. Kass, Robert E.; Ventura, Valerie; Brown, Emery N. (2005). "Statistical Issues in the Analysis of Neuronal Data". Journal of Neurophysiology. 94 (1): 8–25. doi:10.1152/jn.00648.2004. PMID 15985692.
  14. Brockwell, Anthony E.; Rojas, A. L.; Kass, Robert E. (2004). "Recursive Bayesian Decoding of Motor Cortical Signals by Particle Filtering". Journal of Neurophysiology. 91 (4): 1899–1907. doi:10.1152/jn.00438.2003. PMID 15010499. S2CID 15092944.
  15. Jarosiewicz, Beata; Chase, Steven M.; Fraser, George W.; Velliste, Meel; Kass, Robert E.; Schwartz, Andrew B. (2008). "Functional network reorganization during learning in a brain-computer interface paradigm". PNAS. 105 (49): 19486–19491. Bibcode:2008PNAS..10519486J. doi:10.1073/pnas.0808113105. PMC 2614787. PMID 19047633.
  16. Kass, Robert E.; Brown, Emery N.; Eden, Uri (2014). Analysis of Neural Data. Springer Series in Statistics. Springer. doi:10.1007/978-1-4614-9602-1. ISBN 978-1-4614-9602-1.
  17. Kass, Robert E.; Brown, Emery N. (2009). "What Is Statistics?". The American Statistician. 63 (2): 105–110. doi:10.1198/tast.2009.0019. S2CID 120522019.
  18. Kass, Robert E. (2011). "Statistical Inference: The Big Picture". Statistical Science. 26 (1): 1–9. doi:10.1214/10-STS337. PMC 3153074. PMID 21841892.
  19. Kass, Robert E.; Caffo, Brian S.; Davidian, Marie; Meng, Xiao-Li; Reid, Nancy (2016). "Ten Simple Rules for Effective Statistical Practice". PLOS Computational Biology. 12 (6): e1004961. Bibcode:2016PLSCB..12E4961K. doi:10.1371/journal.pcbi.1004961. PMC 4900655. PMID 27281180.
  20. Carnegie Mellon University. "Robert Kass - Statistics & Data Science - Dietrich College of Humanities and Social Sciences". www.cmu.edu. Retrieved 2023-05-30.
  21. "Kass Elected to National Academy of Sciences – CNBC". Retrieved 2023-05-30.
  22. Simmons, Abby (2023-05-09). "Kass Elected to National Academy of Sciences - News - Carnegie Mellon University". www.cmu.edu. Retrieved 2023-05-30.
  23. Kass, Robert E.; Kelly, Ryan C.; Loh, Wei-Liem (2011). "Assessment of synchrony in multiple neural spike trains using loglinear point process models". The Annals of Applied Statistics. 5 (2B): 1262–1292. arXiv:1107.5872. Bibcode:2011arXiv1107.5872K. doi:10.1214/10-AOAS429. PMC 3152213. PMID 21837263.