Yee Whye Teh

Yee-Whye Teh
Alma mater: University of Waterloo (BMath); University of Toronto (PhD)
Known for: Hierarchical Dirichlet process; Deep belief networks
Scientific career
Fields: Machine learning; Artificial intelligence; Statistics; Computer science [1]
Institutions: University of Oxford; DeepMind; University College London; University of California, Berkeley; National University of Singapore [2]
Thesis: Bethe free energy and contrastive divergence approximations for undirected graphical models (2003)
Doctoral advisor: Geoffrey Hinton [3]
Website: www.stats.ox.ac.uk/~teh/

Yee-Whye Teh is a professor of statistical machine learning in the Department of Statistics, University of Oxford. [4] [5] Prior to 2012, he was a reader at the Gatsby Computational Neuroscience Unit at University College London. [6] His work is primarily in machine learning, artificial intelligence, statistics and computer science. [1] [7]

Education

Teh was educated at the University of Waterloo and the University of Toronto, where he was awarded a PhD in 2003 for research supervised by Geoffrey Hinton. [3] [8]

Research and career

Teh was a postdoctoral fellow at the University of California, Berkeley, and at the National University of Singapore before joining University College London as a lecturer. [2]

Teh was one of the original developers of deep belief networks [9] and of hierarchical Dirichlet processes. [10]

Awards and honours

Teh was a keynote speaker at Uncertainty in Artificial Intelligence (UAI) 2019, and was invited to give the Breiman lecture at the Conference on Neural Information Processing Systems (NeurIPS) 2017. [11] He served as program co-chair of the International Conference on Machine Learning (ICML) in 2017, one of the premier conferences in machine learning. [4]

References

  1. Yee Whye Teh publications indexed by Google Scholar
  2. "Yee-Whye Teh, Professor of Statistical Machine Learning". stats.ox.ac.uk.
  3. Yee Whye Teh at the Mathematics Genealogy Project
  4. www.stats.ox.ac.uk/~teh/
  5. Gram-Hansen, Bradley (2021). Extending probabilistic programming systems and applying them to real-world simulators. ox.ac.uk (DPhil thesis). University of Oxford. OCLC 1263818188. EThOS uk.bl.ethos.833365.
  6. Gasthaus, Jan Alexander (2020). Hierarchical Bayesian nonparametric models for power-law sequences. ucl.ac.uk (PhD thesis). University College London. OCLC 1197757196. EThOS uk.bl.ethos.807804.
  7. Yee Whye Teh at DBLP Bibliography Server
  8. Whye Teh, Yee (2003). Bethe free energy and contrastive divergence approximations for undirected graphical models. utoronto.ca (PhD thesis). University of Toronto. hdl:1807/122253. OCLC 56683361. ProQuest 305242430.
  9. Hinton, Geoffrey E.; Osindero, Simon; Teh, Yee-Whye (1 July 2006). "A fast learning algorithm for deep belief nets". Neural Computation. 18 (7): 1527–1554. doi:10.1162/NECO.2006.18.7.1527. ISSN 0899-7667. PMID 16764513. Zbl 1106.68094. Wikidata Q33996665.
  10. Teh, Yee W.; Jordan, Michael I.; Beal, Matthew J.; Blei, David M. (2005). "Sharing Clusters among Related Groups: Hierarchical Dirichlet Processes" (PDF). Advances in Neural Information Processing Systems 17. Wikidata Q77688418.
  11. "On Bayesian Deep Learning and Deep Bayesian Learning". nips.cc.