Bani K. Mallick

Alma mater: University of Connecticut
Fields: Bayesian statistics
Institutions: Texas A&M University
Thesis: Bayesian Modeling Incorporating Unknown Monotone Functions
Doctoral advisor: Alan E. Gelfand

Bani K. Mallick is a Distinguished Professor [1] and Susan M. Arseven '75 Chair in Data Science and Computational Statistics in the Department of Statistics at Texas A&M University in College Station. [2] He is the director of the Center for Statistical Bioinformatics. Mallick is well known for his contributions to the theory and practice of Bayesian semiparametric methods and uncertainty quantification. He is an elected fellow of the American Association for the Advancement of Science, [3] the American Statistical Association, the Institute of Mathematical Statistics, the International Statistical Institute and the Royal Statistical Society. He received the Distinguished Research Award from Texas A&M University [4] and the Young Researcher Award from the International Indian Statistical Association. [5]

Mallick's areas of research include semiparametric classification and regression, hierarchical spatial modeling, inverse problems, uncertainty quantification and bioinformatics. He is known for collaborative research with scientists from fields beyond his own, including nuclear engineering, petroleum engineering, industrial engineering and traffic mapping.[citation needed] He has coauthored or co-edited six books and more than 200 research publications.

Mallick earned his undergraduate degree from Presidency University, Kolkata, his MS from the University of Calcutta, and his Ph.D. from the University of Connecticut.

Related Research Articles

Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation that views probability as the limit of the relative frequency of an event after many trials.
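The updating of a degree of belief can be sketched with Bayes' rule; the coin-bias hypothesis and the 0.5 prior below are illustrative assumptions, not drawn from this article.

```python
def posterior(prior, like_h, like_not_h):
    """Bayes' rule: P(H|D) = P(D|H) P(H) / P(D)."""
    evidence = like_h * prior + like_not_h * (1.0 - prior)
    return like_h * prior / evidence

# Hypothesis H: the coin lands heads 80% of the time.
# Data D: a single toss comes up heads; an unbiased coin gives heads with 0.5.
p = posterior(prior=0.5, like_h=0.8, like_not_h=0.5)
print(round(p, 4))  # belief in the biased-coin hypothesis rises above 0.5
```

The prior degree of belief (0.5) is revised upward once the observed data are more probable under the hypothesis than under its alternative.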

Kriging

In statistics, originally in geostatistics, kriging, also known as Gaussian process regression, is a method of interpolation based on a Gaussian process governed by prior covariances. Under suitable assumptions of the prior, kriging gives the best linear unbiased prediction (BLUP) at unsampled locations. Interpolating methods based on other criteria such as smoothness may not yield the BLUP. The method is widely used in the domain of spatial analysis and computer experiments. The technique is also known as Wiener–Kolmogorov prediction, after Norbert Wiener and Andrey Kolmogorov.
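A minimal kriging sketch: the prediction at a new location is a linear combination of observed values, weighted through the prior covariance. The squared-exponential kernel, its length scale, and the toy data are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential prior covariance between location arrays a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def krige(x_obs, y_obs, x_new, noise=1e-8):
    """Kriging predictor: k* K^{-1} y, the BLUP under the assumed prior."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    k_star = rbf(x_new, x_obs)
    return k_star @ np.linalg.solve(K, y_obs)

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
pred = krige(x, y, np.array([1.0]))
print(pred)  # with negligible noise, reproduces the observation at x = 1
```

With the noise term near zero the predictor interpolates the data exactly, which is the behaviour expected of kriging at sampled locations.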

Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
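The allocation of output change to individual inputs can be sketched with a one-at-a-time perturbation; the model f and the perturbation size are illustrative assumptions.

```python
def f(a, b):
    # A hypothetical model with two uncertain inputs.
    return 10.0 * a + b**2

def sensitivities(a, b, eps=1e-6):
    """Perturb each input in turn and measure how much the output moves."""
    base = f(a, b)
    return {
        "a": (f(a + eps, b) - base) / eps,
        "b": (f(a, b + eps) - base) / eps,
    }

s = sensitivities(a=1.0, b=3.0)
print(s)  # the output responds about 10x to a and about 6x to b at this point
```

This local, derivative-based approach is only one of several sensitivity-analysis strategies; variance-based (global) methods allocate output uncertainty across the whole input distribution instead.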

There are two main uses of the term calibration in statistics that denote special types of statistical inference problems. "Calibration" can mean a reverse process to regression, in which a known observation of the dependent variable is used to infer the corresponding explanatory variable, or procedures in statistical classification that determine how well predicted class-membership probabilities match the observed outcomes.

The Bayes factor is a ratio of the marginal likelihoods of two competing statistical models, and is used to quantify the support for one model over the other. The models in question can have a common set of parameters, such as a null hypothesis and an alternative, but this is not necessary; for instance, it could also be a non-linear model compared to its linear approximation. The Bayes factor can be thought of as a Bayesian analog to the likelihood-ratio test, but since it uses the (integrated) marginal likelihood instead of the maximized likelihood, the two tests only coincide under simple hypotheses. Also, in contrast with null hypothesis significance testing, Bayes factors support evaluation of evidence in favor of a null hypothesis, rather than only allowing the null to be rejected or not rejected.
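For two simple hypotheses the marginal likelihood is just the likelihood, so the Bayes factor coincides with the likelihood ratio, as noted above. The binomial coin-tossing setup below is an illustrative assumption.

```python
from math import comb

def binom_lik(theta, k, n):
    """Binomial likelihood of k heads in n tosses given heads-probability theta."""
    return comb(n, k) * theta**k * (1 - theta) ** (n - k)

# M1: biased coin (theta = 0.7); M2: fair coin (theta = 0.5). Data: 7 of 10 heads.
bf = binom_lik(0.7, 7, 10) / binom_lik(0.5, 7, 10)
print(round(bf, 3))  # greater than 1: the data favour the biased-coin model
```

For composite hypotheses the numerator and denominator would instead integrate the likelihood over each model's prior, which is where the Bayes factor departs from the maximized likelihood ratio.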

Ridge regression is a method of estimating the coefficients of multiple-regression models in scenarios where the independent variables are highly correlated. It has been used in many fields including econometrics, chemistry, and engineering. Also known as Tikhonov regularization, named for Andrey Tikhonov, it is a method of regularization of ill-posed problems. It is particularly useful for mitigating the problem of multicollinearity in linear regression, which commonly occurs in models with large numbers of parameters. In general, the method provides improved efficiency in parameter estimation problems in exchange for a tolerable amount of bias.
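The closed-form ridge estimator can be sketched directly: the penalty term makes the normal equations well conditioned even with near-duplicate predictors. The toy data and penalty value are illustrative assumptions.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
# Two almost identical columns: ordinary least squares would be unstable here.
X = np.column_stack([x1, x1 + 1e-6 * rng.normal(size=50)])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)

beta = ridge(X, y, lam=1.0)
print(beta)  # finite, slightly shrunken coefficient estimates
```

The shrinkage toward zero is the "tolerable amount of bias" traded for stability: without the penalty, X'X here is nearly singular.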

C. R. Rao

Calyampudi Radhakrishna Rao FRS, commonly known as C. R. Rao, is an Indian-American mathematician and statistician. He is currently professor emeritus at Pennsylvania State University and Research Professor at the University at Buffalo. Rao has been honoured by numerous colloquia, honorary degrees, and festschrifts and was awarded the US National Medal of Science in 2002. The American Statistical Association has described him as "a living legend whose work has influenced not just statistics, but has had far reaching implications for fields as varied as economics, genetics, anthropology, geology, national planning, demography, biometry, and medicine." The Times of India listed Rao as one of the top 10 Indian scientists of all time. Rao is also a Senior Policy and Statistics advisor for the Indian Heart Association non-profit focused on raising South Asian cardiovascular disease awareness.

In statistics, classification is the problem of identifying which of a set of categories (sub-populations) an observation belongs to. Examples are assigning a given email to the "spam" or "non-spam" class, and assigning a diagnosis to a given patient based on observed characteristics of the patient.

Multilevel models are statistical models of parameters that vary at more than one level. An example could be a model of student performance that contains measures for individual students as well as measures for classrooms within which the students are grouped. These models can be seen as generalizations of linear models, although they can also extend to non-linear models. These models became much more popular after sufficient computing power and software became available.

Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., will lead to different results that can only be predicted in a statistical sense.
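A minimal UQ sketch: propagate input uncertainty through a model by Monte Carlo sampling and summarise the spread of the outputs. The kinetic-energy model and the input distribution are illustrative assumptions echoing the crash example above.

```python
import random
import statistics

random.seed(0)

def kinetic_energy(mass, speed):
    """Simple physical model: E = 1/2 m v^2 (joules)."""
    return 0.5 * mass * speed**2

# The vehicle mass is not exactly known: sample it, keep the speed fixed.
samples = [kinetic_energy(random.gauss(1500.0, 50.0), 20.0)
           for _ in range(10_000)]
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
print(round(mean), round(sd))  # the output is a distribution, not one number
```

Even with the speed known exactly, the uncertain mass induces a spread in the predicted energy, which is the sense in which outcomes "can only be predicted in a statistical sense".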

In statistics, multivariate adaptive regression splines (MARS) is a form of regression analysis introduced by Jerome H. Friedman in 1991. It is a non-parametric regression technique and can be seen as an extension of linear models that automatically models nonlinearities and interactions between variables.
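MARS builds its fit from hinge functions of the form max(0, x - c) and max(0, c - x), which is what lets it model nonlinearities automatically. The knot c = 2.0 and the coefficients below are illustrative assumptions, not a fitted model.

```python
def hinge(x, c):
    """MARS basis function: zero below the knot c, linear above it."""
    return max(0.0, x - c)

def mars_like(x):
    # A tiny hand-written piecewise-linear model in MARS form:
    # flat to the left of the knot, rising with slope 3 to the right.
    return 1.0 + 3.0 * hinge(x, 2.0)

print(mars_like(1.0), mars_like(4.0))  # 1.0 below the knot, 7.0 above it
```

A real MARS fit would choose the knots, the basis functions (including products of hinges for interactions), and the coefficients from the data in a forward-backward search.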

Jayanta Kumar Ghosh

Jayanta Kumar Ghosh was an Indian statistician, an emeritus professor at Indian Statistical Institute and a professor of statistics at Purdue University.

Raymond James Carroll is an American statistician, and Distinguished Professor of statistics, nutrition and toxicology at Texas A&M University. He is a recipient of the 1988 COPSS Presidents' Award and the 2002 R. A. Fisher Lectureship. He has made fundamental contributions to measurement error models and nonparametric and semiparametric modeling.

David Brian Dunson is an American statistician who is Arts and Sciences Distinguished Professor of Statistical Science, Mathematics and Electrical & Computer Engineering at Duke University. His research focuses on developing statistical methods for complex and high-dimensional data. Particular themes of his work include the use of Bayesian hierarchical models, methods for learning latent structure in complex data, and the development of computationally efficient algorithms for uncertainty quantification. He is currently serving as joint Editor of the Journal of the Royal Statistical Society, Series B.

Outline of machine learning

The following outline is provided as an overview of and topical guide to machine learning. Machine learning is a subfield of soft computing within computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. In 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed". Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Such algorithms operate by building a model from an example training set of input observations in order to make data-driven predictions or decisions expressed as outputs, rather than following strictly static program instructions.

Dipak K. Dey

Dipak Kumar Dey is an Indian-American statistician best known for his work on Bayesian methodologies. He is currently the Board of Trustees Distinguished Professor in the Department of Statistics at the University of Connecticut. Dey has an international reputation as a statistician as well as a data scientist. Since he earned a Ph.D. degree in statistics from Purdue University in 1980, Dey has made tremendous contributions to the development of modern statistics, especially in Bayesian analysis, decision science and model selection. Dey has published more than 10 books and edited volumes, and over 260 research articles in peer-refereed national and international journals. In addition, the statistical methodologies that he has developed have found wide applications in a plethora of interdisciplinary and applied fields, such as biometry and bioinformatics, genetics, econometrics, environmental science, and social science. Dey has supervised 40 Ph.D. students, and presented more than 200 professional talks in colloquia, seminars and conferences all over the world. During his career, Dey has been a visiting professor or scholar at many institutions and research centers around the world, such as Macquarie University, Pontificia Universidad Católica de Chile, the University of São Paulo, the University of British Columbia, and the Statistical and Applied Mathematical Sciences Institute. Dey is an elected fellow of the American Association for the Advancement of Science, the American Statistical Association, the Institute of Mathematical Statistics, the International Society for Bayesian Analysis and the International Statistical Institute.

Nonlinear mixed-effects models constitute a class of statistical models generalizing linear mixed-effects models. Like linear mixed-effects models, they are particularly useful in settings where there are multiple measurements within the same statistical units or when there are dependencies between measurements on related statistical units. Nonlinear mixed-effects models are applied in many fields including medicine, public health, pharmacology, and ecology.

Ajit C. Tamhane is a professor in the Department of Industrial Engineering and Management Sciences (IEMS) at Northwestern University and also holds a courtesy appointment in the Department of Statistics.

Probabilistic numerics is a scientific field at the intersection of statistics, machine learning and applied mathematics, where tasks in numerical analysis including finding numerical solutions for integration, linear algebra, optimisation and differential equations are seen as problems of statistical, probabilistic, or Bayesian inference.

References

  1. "List of Distinguished Professors | Office of the Dean of Faculties". Archived from the original on 2013-12-20. Retrieved 2013-12-19.
  2. "Bani K. Mallick".
  3. "AAAS Council Elects 388 New AAAS Fellows | American Association for the Advancement of Science".
  4. "Archived copy" (PDF). Archived from the original (PDF) on 2013-12-20. Retrieved 2013-12-19.
  5. "Past IISA Award Recipients | International Indian Statistical Association".