James O. Berger

Born: James Orvis Berger, 6 April 1950 (age 74)
Nationality: American
Alma mater: Cornell University
Known for: Bayesian inference, statistical hypothesis testing, computer experiments
Awards: COPSS Presidents' Award (1985); National Academy of Sciences (2003); Guggenheim Fellowship; IMS R. A. Fisher Lectureship

Scientific career
Fields: Statistics (Bayesian)
Institutions: Purdue University; Duke University
Thesis: Admissibility in Location Parameter Problems (1974)
Doctoral advisor: Lawrence D. Brown
Doctoral students: Dipak K. Dey

James Orvis Berger (born April 6, 1950, in Minneapolis, Minnesota) [1] is an American statistician best known for his work on Bayesian statistics and decision theory. In 1985, at the age of 35, he won the COPSS Presidents' Award, one of the two highest awards in statistics. He received a Ph.D. in mathematics from Cornell University in 1974. He was a faculty member in the Department of Statistics at Purdue University until 1997, when he moved to the Institute of Statistics and Decision Sciences (now the Department of Statistical Science) at Duke University, where he is currently the Arts and Sciences Professor of Statistics. He was also director of the Statistical and Applied Mathematical Sciences Institute from 2002 to 2010, and has been a visiting professor at the University of Chicago since 2011. [1] [2] [3]

Contributions to science

Berger has worked on the decision-theoretic foundations of Bayesian inference, including advances on the Stein phenomenon [4] [5] during and after his thesis. He has also contributed substantially to the so-called objective Bayes approach, in which prior distributions are constructed from the structure of the sampling distribution and/or from frequentist properties. He is also recognized for his analysis of the conflict between Bayesian and frequentist views of statistical hypothesis testing, including his criticism of the use of p-values [6] and critical levels.
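
The p-value criticism in [6] comes with a simple calibration: for p < 1/e, the Bayes factor in favor of a precise null hypothesis is bounded below by −e·p·ln p. The Python sketch below is our own illustration of that bound (the function names are not from any of Berger's software); it also computes the implied lower bound on the posterior probability of the null, assuming equal prior odds:

```python
import math

def bayes_factor_bound(p):
    """Sellke-Bayarri-Berger lower bound -e * p * ln(p) on the Bayes
    factor in favor of a precise null hypothesis, valid for p < 1/e."""
    if not 0.0 < p < 1.0 / math.e:
        raise ValueError("the calibration applies for 0 < p < 1/e")
    return -math.e * p * math.log(p)

def null_posterior_bound(p, prior_odds=1.0):
    """Implied lower bound on the posterior probability of the null;
    equal prior odds is an assumption of this sketch."""
    b = prior_odds * bayes_factor_bound(p)
    return b / (1.0 + b)

# p = 0.05 yields a posterior null probability of at least ~0.29,
# far above the 5% a naive reading of the p-value might suggest.
print(round(null_posterior_bound(0.05), 3))  # 0.289
```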

Awards and honors

Berger has received numerous awards for his work: Guggenheim Fellowship, the COPSS Presidents' Award and the R. A. Fisher Lectureship. He was elected as a Fellow of the American Statistical Association and to the National Academy of Sciences in 2003. [7] In 2004, he was awarded an honorary Doctor of Science degree by Purdue University. [8]

Related Research Articles

Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.

<span class="mw-page-title-main">Statistical inference</span> Process of using data analysis

Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.

<span class="mw-page-title-main">Statistical hypothesis test</span> Method of statistical inference

A statistical hypothesis test is a method of statistical inference used to decide whether the data sufficiently support a particular hypothesis. A statistical hypothesis test typically involves a calculation of a test statistic. Then a decision is made, either by comparing the test statistic to a critical value or equivalently by evaluating a p-value computed from the test statistic. Roughly 100 specialized statistical tests have been defined.
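
As a minimal illustration of that recipe (a hypothetical example with made-up numbers, not drawn from the article), the following Python snippet runs a two-sided one-sample z-test, computing both the test statistic and its p-value:

```python
import math

def one_sample_z_test(sample_mean, mu0, sigma, n):
    """Two-sided z-test of H0: mean == mu0 when sigma is known.
    Returns the test statistic and its p-value."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # Standard normal tail probability via the error function.
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value

# Hypothetical data: 25 observations averaging 10.4, testing mu0 = 10.
z, p = one_sample_z_test(sample_mean=10.4, mu0=10.0, sigma=1.0, n=25)
print(z, round(p, 4))  # z = 2.0, p ~ 0.0455: reject H0 at the 5% level
```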

Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Fundamentally, Bayesian inference uses prior knowledge, in the form of a prior distribution, to estimate posterior probabilities. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
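
A small worked example of such an update (our own conjugate-prior sketch, not tied to any particular application in the article): a Beta prior on a binomial success probability is updated in closed form by observed counts:

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: a Beta(alpha, beta) prior on a success
    probability combined with binomial data gives a
    Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

# Start from a uniform Beta(1, 1) prior and observe 7 successes
# in 10 trials (made-up numbers for illustration).
a, b = beta_binomial_update(1, 1, successes=7, failures=3)
print(a, b, a / (a + b))  # posterior Beta(8, 4) with mean ~0.667
```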

<span class="mw-page-title-main">Debabrata Basu</span> Indian statistician

Debabrata Basu was an Indian statistician who made fundamental contributions to the foundations of statistics. Basu invented simple examples that displayed some difficulties of likelihood-based statistics and frequentist statistics; Basu's paradoxes were especially important in the development of survey sampling. In statistical theory, Basu's theorem established the independence of a complete sufficient statistic and an ancillary statistic.

Stephen Elliott Fienberg was a professor emeritus in the Department of Statistics, the Machine Learning Department, Heinz College, and Cylab at Carnegie Mellon University. Fienberg was the founding co-editor of the Annual Review of Statistics and Its Application and of the Journal of Privacy and Confidentiality.

<span class="mw-page-title-main">Lawrence D. Brown</span> American statistician

Lawrence David (Larry) Brown was Miers Busch Professor and Professor of Statistics at the Wharton School of the University of Pennsylvania in Philadelphia, Pennsylvania. He is known for his groundbreaking work in a broad range of fields including decision theory, recurrence and partial differential equations, nonparametric function estimation, minimax and adaptation theory, and the analysis of census data and call-center data.

The COPSS Presidents' Award is given annually by the Committee of Presidents of Statistical Societies to a young statistician in recognition of outstanding contributions to the profession of statistics. The COPSS Presidents' Award is generally regarded as one of the highest honors in the field of statistics, along with the International Prize in Statistics.

Statistics, in the modern sense of the word, began evolving in the 18th century in response to the novel needs of industrializing sovereign states.

The foundations of statistics consist of the mathematical and philosophical basis for arguments and inferences made using statistics. This includes the justification for the methods of statistical inference, estimation and hypothesis testing, the quantification of uncertainty in the conclusions of statistical arguments, and the interpretation of those conclusions in probabilistic terms. A valid foundation can be used to explain statistical paradoxes such as Simpson's paradox, provide a precise description of observed statistical laws, and guide the application of statistical conclusions in social and scientific applications.
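
To make Simpson's paradox concrete, the sketch below uses the often-cited kidney-stone treatment counts (successes, total): treatment A has the higher success rate within each stone-size subgroup, yet the lower rate once the subgroups are pooled:

```python
# (successes, total) by stone size and treatment arm.
data = {
    "small stones": {"A": (81, 87), "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    return successes / total

# Within each subgroup, A beats B (0.93 vs 0.87 and 0.73 vs 0.69)...
for group, arms in data.items():
    print(group, {arm: round(rate(*counts), 2) for arm, counts in arms.items()})

# ...but pooled over subgroups, B beats A (0.83 vs 0.78).
pooled = {
    arm: rate(sum(data[g][arm][0] for g in data),
              sum(data[g][arm][1] for g in data))
    for arm in ("A", "B")
}
print({arm: round(r, 2) for arm, r in pooled.items()})
```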

David George Clayton is a British statistician and epidemiologist. He is titular Professor of Biostatistics in the University of Cambridge and a Wellcome Trust and Juvenile Diabetes Research Foundation Principal Research Fellow in the Diabetes and Inflammation Laboratory, where he chairs the statistics group. Clayton is an ISI highly cited researcher, placing him among the top 250 most-cited scientists in mathematics over the last 20 years.

<span class="mw-page-title-main">Nancy Reid</span> Canadian statistician

Nancy Margaret Reid is a Canadian theoretical statistician. She is a professor at the University of Toronto, where she holds a Canada Research Chair in Statistical Theory. In 2015, Reid became Director of the Canadian Statistical Sciences Institute.

Raymond James Carroll is an American statistician and Distinguished Professor of statistics, nutrition and toxicology at Texas A&M University. He is a recipient of the 1988 COPSS Presidents' Award and the 2002 R. A. Fisher Lectureship. He has made fundamental contributions to measurement error models and to nonparametric and semiparametric modeling.

Roger Lee Berger is an American statistician and professor, co-author of Statistical Inference, first published in 1990 with collaborator George Casella.

<span class="mw-page-title-main">Dipak K. Dey</span> Indian-American statistician

Dipak Kumar Dey is an Indian-American statistician best known for his work on Bayesian methodologies. He is currently the Board of Trustees Distinguished Professor in the Department of Statistics at the University of Connecticut. Dey has an international reputation as a statistician as well as a data scientist. Since earning a Ph.D. degree in statistics from Purdue University in 1980, Dey has made major contributions to the development of modern statistics, especially in Bayesian analysis, decision science and model selection. Dey has published more than 10 books and edited volumes, and over 260 research articles in peer-refereed national and international journals. In addition, the statistical methodologies that he has developed have found wide application in a range of interdisciplinary and applied fields, such as biometry and bioinformatics, genetics, econometrics, environmental science, and social science. Dey has supervised 40 Ph.D. students and presented more than 200 professional talks in colloquia, seminars and conferences all over the world. During his career, Dey has been a visiting professor or scholar at many institutions and research centers, including Macquarie University, Pontificia Universidad Católica de Chile, the University of São Paulo, the University of British Columbia, and the Statistical and Applied Mathematical Sciences Institute. Dey is an elected fellow of the American Association for the Advancement of Science, the American Statistical Association, the Institute of Mathematical Statistics, the International Society for Bayesian Analysis and the International Statistical Institute.

John D. Storey is the William R. Harman '63 and Mary-Love Harman Professor in Genomics at Princeton University. His research is focused on statistical inference of high-dimensional data, particularly genomic data. Storey was the founding director of the Princeton University Center for Statistics and Machine Learning.

Robert E. Kass is the Maurice Falk Professor of Statistics and Computational Neuroscience in the Department of Statistics and Data Science, the Machine Learning Department, and the Neuroscience Institute at Carnegie Mellon University.

<span class="mw-page-title-main">Linda Zhao</span> Chinese statistician

Linda Hong Zhao is a Chinese-American statistician. She is a Professor of Statistics at the Wharton School of the University of Pennsylvania. She is a Fellow of the Institute of Mathematical Statistics. Zhao specializes in modern machine learning methods.

Guosheng Yin is a statistician, data scientist, educator and researcher in biostatistics, machine learning, and AI. He is presently Chair in Statistics in the Department of Mathematics at Imperial College London. Previously, he served as Head of Department and Patrick S C Poon Endowed Chair in Statistics and Actuarial Science at the University of Hong Kong. Before joining the University of Hong Kong, Yin worked at the University of Texas M.D. Anderson Cancer Center until 2009 as a tenured Associate Professor of Biostatistics.

Veronika Ročková is a Bayesian statistician. Born in Czechoslovakia and educated in the Czech Republic, Belgium, and the Netherlands, she works in the US as a professor of econometrics and statistics and James S. Kemper Faculty Scholar at the University of Chicago. Her research studies methods including variable selection, high-dimensional inference, non-convex optimization, likelihood-free inference, and the spike-and-slab LASSO, and also includes applications in biomedical statistics.

References

  1. Wolpert, Robert L. (2004). "A Conversation with James O. Berger". Statistical Science. 19 (1): 205–218. doi:10.1214/088342304000000053.
  2. "ISI Highly Cited: James O. Berger". ISI Web of Knowledge. 2003.
  3. "Statistical and Applied Mathematical Sciences Institute". Archived from the original on 2008-09-30.
  4. Berger, J. O. (1982). "Selecting a Minimax Estimator of a Multivariate Normal Mean". The Annals of Statistics. 10: 81–92. doi:10.1214/aos/1176345691.
  5. Brown, L. (1980). "Examples of Berger's Phenomenon in the Estimation of Independent Normal Means". The Annals of Statistics. 8 (3): 572–585. doi:10.1214/aos/1176345009.
  6. Sellke, Thomas; Bayarri, M. J.; Berger, James O. (2001). "Calibration of p Values for Testing Precise Null Hypotheses". The American Statistician. 55 (1): 62–71. doi:10.1198/000313001300339950. JSTOR 2685531. S2CID 396772.
  7. "Statistician James O. Berger Elected to National Academy of Sciences". PR Newswire. 2003.
  8. "James O. Berger: Doctor of Science". Purdue University. 2004.