George C. Casella | |
---|---|
Born | January 22, 1951 |
Died | June 17, 2012 (aged 61) |
Nationality | American |
Education | Fordham University, Purdue University |
Scientific career | |
Fields | Statistics |
Institutions | Rutgers University, Cornell University, University of Florida |
Thesis | Minimax Ridge Regression Estimation (1977) |
Doctoral advisor | Leon Jay Gleser [1] |
Doctoral students | |
George Casella (January 22, 1951 – June 17, 2012) was a Distinguished Professor in the Department of Statistics at the University of Florida. He died from multiple myeloma. [2]
Casella completed his undergraduate education at Fordham University and his graduate education at Purdue University. He served on the faculties of Rutgers University, Cornell University, and the University of Florida. His contributions spanned several areas of statistics, including Monte Carlo methods, model selection, and genomic analysis. [2] He was particularly active in Bayesian and empirical Bayes methods, with work connected to the Stein phenomenon, on assessing and accelerating the convergence of Markov chain Monte Carlo methods, as in his Rao–Blackwellization technique, [3] and on recasting the lasso as Bayesian posterior mode estimation with independent Laplace priors. [4]
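As a hedged, minimal sketch of the Rao–Blackwellization idea (illustrating only the general conditioning principle, not Casella and Robert's exact construction; the model, seed, and sample size here are invented for illustration), conditioning a Monte Carlo estimator on an auxiliary variable never increases its variance:

```python
import numpy as np
from scipy.stats import norm

# Estimate p = P(X > 1) where Y ~ N(0, 1) and X | Y ~ N(Y, 1), so that
# marginally X ~ N(0, 2). Replacing the indicator 1{X > 1} with its
# conditional expectation given Y ("Rao-Blackwellization") can only
# lower the Monte Carlo variance, by the law of total variance.
rng = np.random.default_rng(0)
n = 100_000
y = rng.standard_normal(n)
x = y + rng.standard_normal(n)

naive = (x > 1.0).astype(float)   # plain Monte Carlo terms
rb = norm.sf(1.0 - y)             # E[1{X > 1} | Y] = P(N(Y, 1) > 1)

print("exact       ", norm.sf(1.0, scale=np.sqrt(2.0)))
print("naive        mean", naive.mean(), "var", naive.var())
print("Rao-Blackwell mean", rb.mean(), "var", rb.var())
```

Both estimators are unbiased for the same probability; the conditioned version simply averages smoother terms with strictly smaller variance.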
Casella was named a Fellow of the American Statistical Association and of the Institute of Mathematical Statistics in 1988, and an Elected Fellow of the International Statistical Institute in 1989. In 2009, he was made a Foreign Member of the Spanish Royal Academy of Sciences. [5]
Statistical inference is the process of using data analysis to infer properties of an underlying probability distribution. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population.
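A minimal sketch of both activities, assuming NumPy and SciPy are available and using synthetic data: a point estimate of a population mean together with a one-sample t-test of a hypothesis about it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.normal(loc=5.2, scale=1.0, size=40)   # hypothetical sample

print("point estimate of the mean:", sample.mean())
res = stats.ttest_1samp(sample, popmean=5.0)       # H0: population mean is 5
print("t statistic:", res.statistic, "p-value:", res.pvalue)
```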
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Fundamentally, Bayesian inference uses prior knowledge, in the form of a prior distribution, to estimate posterior probabilities. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law. In the philosophy of decision theory, Bayesian inference is closely related to subjective probability, often called "Bayesian probability".
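A minimal sketch of such an update, assuming SciPy is available: a conjugate Beta prior on a coin's bias combined with hypothetical binomial data yields a Beta posterior in closed form.

```python
from scipy.stats import beta

a, b = 1.0, 1.0          # uniform Beta(1, 1) prior on the coin's bias
heads, tails = 7, 3      # hypothetical data: 10 flips
posterior = beta(a + heads, b + tails)   # conjugacy: posterior is Beta(8, 4)

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```

Each new flip can be folded in the same way, which is what makes the updating convenient for sequences of data.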
Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. The name comes from the Monte Carlo Casino in Monaco, where the primary developer of the method, physicist Stanislaw Ulam, was inspired by his uncle's gambling habits.
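A minimal sketch of the idea, assuming NumPy is available: estimating π (a deterministic quantity) by sampling uniform points in the unit square and counting how many land inside the quarter disc.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
pts = rng.random((n, 2))                   # uniform points in the unit square
inside = (pts ** 2).sum(axis=1) <= 1.0     # inside the quarter disc of radius 1

# Area ratio is pi/4, so the hit frequency times 4 approximates pi.
print("pi is approximately", 4.0 * inside.mean())
```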
The method of least squares is a parameter estimation method in regression analysis that chooses parameter values to minimize the sum of the squared residuals between the observed values and the values predicted by the fitted model.
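A minimal sketch with invented data, assuming NumPy is available: fitting a line y = a + b·x by minimizing the sum of squared residuals with NumPy's least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)   # noisy line

X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
coef, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", coef)            # close to (2.0, 0.5)
```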
In statistics, point estimation involves the use of sample data to calculate a single value which is to serve as a "best guess" or "best estimate" of an unknown population parameter. More formally, it is the application of a point estimator to the data to obtain a point estimate.
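A minimal sketch with invented data: the maximum-likelihood point estimate of an exponential rate parameter is the reciprocal of the sample mean.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=500)   # true rate is 1/2

rate_hat = 1.0 / data.mean()                  # MLE of the exponential rate
print("point estimate of the rate:", rate_hat)
```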
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution.
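A minimal sketch, assuming NumPy is available: a random-walk Metropolis sampler (one of the simplest MCMC algorithms) targeting a standard normal distribution that it only needs to know up to a normalizing constant.

```python
import numpy as np

rng = np.random.default_rng(4)
log_target = lambda x: -0.5 * x * x   # log N(0, 1) density, unnormalized

x, chain = 0.0, []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)   # symmetric random-walk proposal
    # Accept with probability min(1, target(proposal) / target(x)).
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    chain.append(x)                        # rejected moves repeat the state

chain = np.asarray(chain[5_000:])          # discard burn-in steps
print("sample mean:", chain.mean(), "sample variance:", chain.var())
```

With more steps, the empirical mean and variance approach the target's 0 and 1, illustrating convergence to the equilibrium distribution.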
Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability, where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after many trials. More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a prior distribution.
Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics. It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions.
In mathematical statistics, the Fisher information is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information.
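A standard worked example (not from the article): the Fisher information of a single Bernoulli(θ) observation, computed directly as the variance of the score.

```latex
% Log-likelihood and score of one Bernoulli(\theta) observation x:
\ell(\theta; x) = x \log\theta + (1 - x)\log(1 - \theta)
s(\theta; x) = \frac{\partial \ell}{\partial \theta}
             = \frac{x}{\theta} - \frac{1 - x}{1 - \theta}
% The score has mean zero, so the Fisher information is E[s^2]:
I(\theta) = \theta \cdot \frac{1}{\theta^{2}}
          + (1 - \theta) \cdot \frac{1}{(1 - \theta)^{2}}
          = \frac{1}{\theta(1 - \theta)}
```

The information is largest near θ = 0 and θ = 1, where a single observation is most informative about the parameter.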
Calyampudi Radhakrishna Rao was an Indian-American mathematician and statistician. He was professor emeritus at Pennsylvania State University and research professor at the University at Buffalo. Rao was honoured by numerous colloquia, honorary degrees, and festschrifts and was awarded the US National Medal of Science in 2002. The American Statistical Association has described him as "a living legend whose work has influenced not just statistics, but has had far reaching implications for fields as varied as economics, genetics, anthropology, geology, national planning, demography, biometry, and medicine." The Times of India listed Rao as one of the top 10 Indian scientists of all time.
Debabrata Basu was an Indian statistician who made fundamental contributions to the foundations of statistics. Basu invented simple examples that displayed some difficulties of likelihood-based statistics and frequentist statistics; Basu's paradoxes were especially important in the development of survey sampling. In statistical theory, Basu's theorem established the independence of a complete sufficient statistic and an ancillary statistic.
In statistics, resampling is the creation of new samples based on one observed sample. Resampling methods include permutation tests, the bootstrap, cross-validation, and the jackknife.
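A minimal sketch of one such method, the bootstrap, with invented data: resampling the observed sample with replacement to estimate the standard error of the sample median.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.exponential(scale=1.0, size=200)   # hypothetical observed sample

# Each bootstrap replicate resamples the data with replacement and
# recomputes the statistic of interest (here, the median).
boot = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(2_000)
])
print("bootstrap standard error of the median:", boot.std(ddof=1))
```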
Jefferson Morris Gill is a Distinguished Professor of Government and of Mathematics & Statistics, Director of the Center for Data Science, Editor of Political Analysis, and a member of the Center for Behavioral Neuroscience at American University, as of fall 2017.
Computational statistics, or statistical computing, is the study of the intersection of statistics and computer science, and refers to the statistical methods that are enabled by computational methods. It is the area of computational science specific to the mathematical science of statistics. This area is developing rapidly. The view that the broader concept of computing must be taught as part of general statistical education is gaining momentum.
Erich Leo Lehmann was a German-born American statistician, who made a major contribution to nonparametric hypothesis testing. He is one of the eponyms of the Lehmann–Scheffé theorem and of the Hodges–Lehmann estimator of the median of a population.
James Orvis Berger is an American statistician best known for his work on Bayesian statistics and decision theory. He won the COPSS Presidents' Award, one of the two highest awards in statistics, in 1985 at the age of 35. He received a Ph.D. in mathematics from Cornell University in 1974. He was a faculty member in the Department of Statistics at Purdue University until 1997, at which time he moved to the Institute of Statistics and Decision Sciences at Duke University, where he is currently the Arts and Sciences Professor of Statistics. He was also director of the Statistical and Applied Mathematical Sciences Institute from 2002 to 2010, and has been a visiting professor at the University of Chicago since 2011.
Jayanta Kumar Ghosh was an Indian statistician, an emeritus professor at Indian Statistical Institute and a professor of statistics at Purdue University.
Dirk Pieter Kroese is a Dutch-Australian mathematician and statistician, and Professor at the University of Queensland. He is known for several contributions to applied probability, kernel density estimation, Monte Carlo methods and rare-event simulation. He is, with Reuven Rubinstein, a pioneer of the Cross-Entropy (CE) method.
Christian P. Robert is a French statistician, specializing in Bayesian statistics and Monte Carlo methods.