| Yee-Whye Teh | |
| --- | --- |
| Alma mater | University of Waterloo (BMath), University of Toronto (PhD) |
| Known for | Hierarchical Dirichlet process, Deep belief networks |
| Scientific career | |
| Fields | Machine learning, Artificial intelligence, Statistics, Computer science [1] |
| Institutions | University of Oxford; DeepMind; University College London; University of California, Berkeley; National University of Singapore [2] |
| Thesis | Bethe free energy and contrastive divergence approximations for undirected graphical models (2003) |
| Doctoral advisor | Geoffrey Hinton [3] |
| Website | www |
Yee-Whye Teh is a professor of statistical machine learning in the Department of Statistics at the University of Oxford. [4] [5] Prior to 2012, he was a reader at the Gatsby Computational Neuroscience Unit at University College London. [6] His work is primarily in machine learning, artificial intelligence, statistics and computer science. [1] [7]
Teh was educated at the University of Waterloo and the University of Toronto, where he was awarded a PhD in 2003 for research supervised by Geoffrey Hinton. [3] [8]
Teh was a postdoctoral fellow at the University of California, Berkeley and the National University of Singapore before he joined University College London as a lecturer. [2]
Teh was one of the original developers of deep belief networks [9] and of hierarchical Dirichlet processes. [10]
Teh was a keynote speaker at Uncertainty in Artificial Intelligence (UAI) 2019, and was invited to give the Breiman lecture at the Conference on Neural Information Processing Systems (NeurIPS) 2017. [11] He served as program co-chair of the International Conference on Machine Learning (ICML) in 2017, one of the premier conferences in machine learning. [4]
Geoffrey Everest Hinton is a British-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. From 2013 to 2023, he divided his time between Google and the University of Toronto, before publicly announcing his departure from Google in May 2023, citing concerns about the risks of artificial intelligence (AI) technology. In 2017, he co-founded and became the chief scientific advisor of the Vector Institute in Toronto.
The School of Informatics is an academic unit of the University of Edinburgh, in Scotland, responsible for research, teaching, outreach and commercialisation in informatics. It was created in 1998 from the former department of artificial intelligence, the Centre for Cognitive Science and the department of computer science, along with the Artificial Intelligence Applications Institute (AIAI) and the Human Communication Research Centre.
Gary William Gibbons is a British theoretical physicist.
Jonathan A. Jones is a professor in atomic and laser physics at the University of Oxford, and a fellow and tutor in physics at Brasenose College, Oxford.
Donald Michie was a British researcher in artificial intelligence. During World War II, Michie worked for the Government Code and Cypher School at Bletchley Park, contributing to the effort to solve "Tunny", a German teleprinter cipher.
(John) Martin Elliott Hyland is professor of mathematical logic at the University of Cambridge and a fellow of King's College, Cambridge. His interests include mathematical logic, category theory, and theoretical computer science.
Peter Dayan is a British neuroscientist and computer scientist who is director at the Max Planck Institute for Biological Cybernetics in Tübingen, Germany. He is co-author of Theoretical Neuroscience, an influential textbook on computational neuroscience. He is known for applying Bayesian methods from machine learning and artificial intelligence to understand neural function and is particularly recognized for relating neurotransmitter levels to prediction errors and Bayesian uncertainties. He has pioneered the field of reinforcement learning (RL) where he helped develop the Q-learning algorithm, and made contributions to unsupervised learning, including the wake-sleep algorithm for neural networks and the Helmholtz machine.
Peter John Hore is a British chemist and academic. He is a Professor of Chemistry at the University of Oxford and fellow of Corpus Christi College, Oxford. He is the author of two Oxford Chemistry Primers on Nuclear Magnetic Resonance (NMR) and research articles primarily in the area of NMR, electron paramagnetic resonance (EPR), spin chemistry and magnetoreception during bird migration.
Hugh Christopher Longuet-Higgins was a British scholar and teacher. He was the Professor of Theoretical Chemistry at the University of Cambridge for 13 years until 1967 when he moved to the University of Edinburgh to work in the developing field of cognitive science. He made many significant contributions to our understanding of molecular science. He was also a gifted amateur musician, both as performer and composer, and was keen to advance the scientific understanding of this art. He was the founding editor of the journal Molecular Physics.
Christopher Michael Bishop is a British computer scientist. He is a Microsoft Technical Fellow and Director of Microsoft Research AI4Science. He is also Honorary Professor of Computer Science at the University of Edinburgh, and a Fellow of Darwin College, Cambridge. Bishop was a founding member of the UK AI Council, and in 2019 he was appointed to the Prime Minister's Council for Science and Technology.
In probability theory, a Pitman–Yor process, denoted PY(d, θ, G0), is a stochastic process whose sample path is a probability distribution. A random sample from this process is an infinite discrete probability distribution, consisting of an infinite set of atoms drawn from G0, with weights drawn from a two-parameter Poisson–Dirichlet distribution. The process is named after Jim Pitman and Marc Yor.
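As a rough illustration of how such a sample path can be generated, the sketch below uses the truncated stick-breaking construction of the Pitman–Yor process, in which the k-th stick proportion is drawn from Beta(1 − d, θ + kd). The base distribution G0 (a standard normal here), the truncation level and the parameter values are illustrative assumptions, not part of the definition above.

```python
import numpy as np

def sample_pitman_yor(d, theta, base_sampler, num_atoms=1000, rng=None):
    """Truncated stick-breaking draw from PY(d, theta, G0)."""
    rng = np.random.default_rng(rng)
    ks = np.arange(1, num_atoms + 1)
    # Stick proportions: v_k ~ Beta(1 - d, theta + k * d)
    v = rng.beta(1.0 - d, theta + ks * d)
    # Weights: w_k = v_k * prod_{j < k} (1 - v_j)
    weights = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    # Atom locations drawn i.i.d. from the base distribution G0
    atoms = base_sampler(num_atoms, rng)
    return atoms, weights

# Example: G0 = N(0, 1) is an assumed base distribution for illustration.
atoms, weights = sample_pitman_yor(
    d=0.5, theta=1.0,
    base_sampler=lambda n, rng: rng.standard_normal(n),
    num_atoms=1000, rng=0,
)
print(weights[:5], weights.sum())  # heavy-tailed weights; sum close to 1 under truncation
```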
James Julian Bennett Jack is a New Zealand physiologist.
In statistics and machine learning, the hierarchical Dirichlet process (HDP) is a nonparametric Bayesian approach to clustering grouped data. It uses a Dirichlet process for each group of data, with the Dirichlet processes for all groups sharing a base distribution which is itself drawn from a Dirichlet process. This method allows groups to share statistical strength via sharing of clusters across groups. The base distribution being drawn from a Dirichlet process is important, because draws from a Dirichlet process are atomic probability measures, and the atoms will appear in all group-level Dirichlet processes. Since each atom corresponds to a cluster, clusters are shared across all groups. It was developed by Yee Whye Teh, Michael I. Jordan, Matthew J. Beal and David Blei and published in the Journal of the American Statistical Association in 2006, as a formalization and generalization of the infinite hidden Markov model published in 2002.
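For concreteness, here is a minimal truncated stick-breaking sketch of the construction just described: a global Dirichlet process draws shared atoms and global weights, and each group re-weights those same atoms. The truncation level, the standard-normal base distribution H and the concentration parameters are illustrative assumptions rather than part of the published construction.

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated GEM(alpha) stick-breaking weights of length K."""
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # close the last stick so the truncated weights sum to 1
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

def sample_hdp(num_groups, gamma=1.0, alpha0=1.0, K=20, rng=None):
    rng = np.random.default_rng(rng)
    atoms = rng.standard_normal(K)         # shared atoms drawn once from the base H
    beta = stick_breaking(gamma, K, rng)   # global weights: the draw G0 ~ DP(gamma, H)
    # Group-level weights, a truncated analogue of G_j ~ DP(alpha0, G0).
    # Every group re-weights the *same* atoms, which is what lets clusters be shared.
    group_weights = rng.dirichlet(alpha0 * beta, size=num_groups)
    return atoms, beta, group_weights

atoms, beta, group_weights = sample_hdp(num_groups=3, rng=0)
print(group_weights.shape)  # (3, 20): three groups over the same 20 shared atoms
```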
Radford M. Neal is a professor emeritus at the Department of Statistics and Department of Computer Science at the University of Toronto, where he holds a research chair in statistics and machine learning.
E-Theses Online Service (EThOS) is a bibliographic database and union catalogue of electronic theses provided by the British Library, the National Library of the United Kingdom. As of February 2022 EThOS provides access to over 500,000 doctoral theses awarded by over 140 UK higher education institutions, with around 3000 new thesis records added every month.
Bubacarr Bah is a Gambian mathematician and chair of Data Science at the African Institute for Mathematical Sciences (AIMS). He is an assistant professor at Stellenbosch University and a member of the Google Advanced Technology External Advisory Council.
Jonathan M. Austyn is Professor of Immunobiology at the University of Oxford and a Fellow of Wolfson College, Oxford. He has taught immunology over many years, and designed the Master of Science course in Integrated Immunology at the University of Oxford, which he co-directs.
An energy-based model (EBM) is a form of generative model (GM) imported directly from statistical physics into machine learning. GMs learn an underlying data distribution by analyzing a sample dataset. Once trained, a GM can produce other datasets that also match the data distribution. EBMs provide a unified framework for many probabilistic and non-probabilistic approaches to such learning, particularly for training graphical and other structured models.
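As a toy illustration of the idea (not any specific published model), the sketch below defines a one-dimensional energy function and turns it into a density via the Boltzmann form p(x) ∝ exp(−E(x)), normalizing numerically on a grid; the double-well energy and the grid are assumptions chosen purely for illustration.

```python
import numpy as np

def energy(x):
    """Assumed double-well energy; low energy corresponds to high probability."""
    return (x**2 - 1.0)**2

xs = np.linspace(-3.0, 3.0, 1201)
unnormalized = np.exp(-energy(xs))                  # unnormalized density exp(-E(x))
dx = xs[1] - xs[0]
density = unnormalized / (unnormalized.sum() * dx)  # divide by a numerical estimate of Z
print(round(float(xs[np.argmax(density)]), 2))      # a mode near an energy minimum at x = -1 or +1
```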
Neil David Lawrence is the DeepMind Professor of Machine Learning at the University of Cambridge in the Department of Computer Science and Technology, senior AI fellow at the Alan Turing Institute and visiting professor at the University of Sheffield.
Mark A. Girolami is a British civil engineer, statistician and data engineer. He has held the Sir Kirby Laing Professorship of Civil Engineering in the Department of Engineering at the University of Cambridge since 2019. He has been the chief scientist of the Alan Turing Institute since 2021. He is a Fellow of Christ's College, Cambridge, and winner of a Royal Society Wolfson Research Merit Award. Girolami is a founding editor of the journal Data-Centric Engineering, and also served as the program director for data-centric engineering at Turing.