Michael Kearns (computer scientist)

Michael Justin Kearns
Born: California
Alma mater: University of California, Berkeley (BS, 1985); Harvard University (PhD, 1989)
Awards: Member of the U.S. National Academy of Sciences (2021); Fellow of the American Academy of Arts and Sciences (2012); ACM Fellow (2014) [1]; Fellow of the Association for the Advancement of Artificial Intelligence (2003)
Scientific career
Institutions: University of Pennsylvania (2002–); AT&T Bell Labs (1991–2001)
Thesis: The Computational Complexity of Machine Learning (1989)
Doctoral advisor: Leslie Valiant
Other academic advisors: Ronald Rivest (postdoctoral, MIT); Richard M. Karp (postdoctoral, UC Berkeley)
Doctoral students: Jennifer Wortman Vaughan
Other notable students: John Langford (postdoctoral visitor)
Website: www.cis.upenn.edu/~mkearns/

Michael Justin Kearns is an American computer scientist, a professor and National Center Chair at the University of Pennsylvania, the founding director of Penn's Singh Program in Networked & Social Systems Engineering (NETS), and the founding director of the Warren Center for Network and Data Sciences; he also holds secondary appointments in Penn's Wharton School and Department of Economics. He is a leading researcher in computational learning theory and algorithmic game theory, with interests in machine learning, artificial intelligence, computational finance, algorithmic trading, computational social science, and social networks. [1] He previously led the Advisory and Research function in Morgan Stanley's Artificial Intelligence Center of Excellence team, [2] and is currently an Amazon Scholar within Amazon Web Services. [3]

Biography

Kearns was born into an academic family. His father, David R. Kearns, is Professor Emeritus of Chemistry at the University of California, San Diego, and won a Guggenheim Fellowship in 1969, [4] and his uncle, Thomas R. Kearns, is Professor Emeritus at Amherst College in Philosophy and in Law, Jurisprudence, and Social Thought. His paternal grandfather, Clyde W. Kearns, was a pioneer in insecticide toxicology and a professor of Entomology at the University of Illinois at Urbana–Champaign, [5] and his maternal grandfather, Chen Shou-Yi (1899–1978), was a professor of history and literature at Pomona College, born in Canton (Guangzhou, China) into a family noted for its scholarship and educational leadership. [6] [7]

Kearns received his B.S. in math and computer science from the University of California, Berkeley in 1985, and his Ph.D. in computer science from Harvard University in 1989 under the supervision of Turing Award winner Leslie Valiant. His doctoral dissertation, The Computational Complexity of Machine Learning, was later published by MIT Press as part of the ACM Doctoral Dissertation Award Series in 1990. Before joining AT&T Bell Labs in 1991, he held postdoctoral positions at the Laboratory for Computer Science at MIT, hosted by Ronald Rivest, and at the International Computer Science Institute (ICSI) at UC Berkeley, hosted by Richard M. Karp, both of whom are Turing Award winners.

Kearns is currently a full professor and National Center Chair at the University of Pennsylvania, where his appointment is split across the Department of Computer and Information Science and the Wharton School's departments of Statistics and Operations and Information Management. Prior to joining the Penn faculty in 2002, he spent a decade (1991–2001) at AT&T Labs and Bell Labs, where he headed the AI department (with colleagues including Michael L. Littman, David A. McAllester, and Richard S. Sutton), the Secure Systems Research department, and the Machine Learning department (with members such as Michael Collins and department leader Fernando Pereira). Other AT&T Labs colleagues in Algorithms and Theoretical Computer Science included Yoav Freund, Ronald Graham, Mehryar Mohri, Robert Schapire, and Peter Shor, as well as Sebastian Seung, Yann LeCun, Corinna Cortes, and Vladimir Vapnik (the V in VC dimension).

Kearns was named a Fellow of the Association for Computing Machinery (2014) for contributions to machine learning, [1] and a Fellow of the American Academy of Arts and Sciences (2012).

His former graduate students and postdoctoral visitors include Ryan W. Porter, John Langford, and Jennifer Wortman Vaughan.

Kearns' work has been covered in the media, including MIT Technology Review's "Can a Website Help You Decide to Have a Kid?" (2014), Bloomberg News's "Schneiderman (and Einstein) Pressure High-Speed Trading" (2014), and the NPR audio piece "Online Education Grows Up, And For Now, It's Free" (2012).

Academic life

Computational learning theory

Kearns and Umesh Vazirani published An Introduction to Computational Learning Theory (MIT Press, 1994), which has been a standard text on computational learning theory since its publication.

Weak learnability and the origin of boosting algorithms

The question "is weakly learnability equivalent to strong learnability?" posed by Kearns and Valiant (Unpublished manuscript 1988, ACM Symposium on Theory of Computing 1989) [8] [9] is the origin of boosting machine learning algorithms, which got a positive answer by Robert Schapire (1990, proof by construction, not practical) and Yoav Freund (1993, by voting, not practical) and then they developed the practical AdaBoost (European Conference on Computational Learning Theory 1995, Journal of Computer and System Sciences 1997), an adaptive boosting algorithm that won the prestigious Gödel Prize (2003).

Honors and awards

Member of the U.S. National Academy of Sciences (2021) [10]
Fellow of the Association for Computing Machinery (2014), for contributions to machine learning, artificial intelligence, and algorithmic game theory and computational social science [1]
Fellow of the American Academy of Arts and Sciences (2012)
Fellow of the Association for the Advancement of Artificial Intelligence (2003)

Selected works

Kearns, Michael; Vazirani, Umesh (1994). An Introduction to Computational Learning Theory. MIT Press. Widely used as a textbook in computational learning theory courses. [11]
Kearns, Michael (1990). The Computational Complexity of Machine Learning. MIT Press. Based on his 1989 doctoral dissertation; published in the ACM Doctoral Dissertation Award Series.
Kearns, Michael; Valiant, Leslie (1989). "Cryptographic limitations on learning Boolean formulae and finite automata". ACM Symposium on Theory of Computing (STOC '89). [9] Posed the open question "Is weak learnability equivalent to strong learnability?", the origin of boosting algorithms; an important publication in machine learning.

References

  1. "ACM Fellows 2014: Michael Kearns". acm.org. ACM. Retrieved January 10, 2015.
  2. "Morgan Stanley Hires Ex-SAC Capital Artificial Intelligence Expert". Bloomberg News. 26 June 2018.
  3. "Amazon Scholar: Michael Kearns". 26 June 2020.
  4. "David R. Kearns". 1969 Guggenheim Fellowship in Chemistry.
  5. "Symposium honoring Clyde W. Kearns, pioneer in insecticide toxicology". Pesticide Biochemistry and Physiology. 22 (2): ii–iii. 1984. doi:10.1016/0048-3575(84)90081-6.
  6. Eber, Irene. "Chen Shou Yi". School of Education Studies. Claremont Graduate University. Archived from the original on 31 August 2014. Retrieved 13 February 2021.
  7. Eber, Irene. "Chen Shou-yi, 1899–1978". acmcgu.edu. Archived from the original on August 31, 2014. Retrieved January 10, 2015. "In the growth and development of Asian Studies on the West Coast, the Claremont Colleges and Professor Chen occupy a leading place."
  8. Kearns, Michael (1988). "Thoughts on Hypothesis Boosting" (PDF). Unpublished manuscript (Machine Learning class project, December 1988). Retrieved January 10, 2015.
  9. Kearns, Michael; Valiant, Leslie (1989). "Cryptographic limitations on learning Boolean formulae and finite automata". Proceedings of the Twenty-First Annual ACM Symposium on Theory of Computing (STOC '89). ACM. pp. 433–444. doi:10.1145/73007.73049. ISBN 0897913078. S2CID 536357. Retrieved January 10, 2015.
  10. "News from the National Academy of Sciences". National Academy of Sciences. April 26, 2021. Retrieved July 4, 2021. "Newly elected members and their affiliations at the time of election are: … Kearns, Michael; professor, department of computer and information science, University of Pennsylvania, Philadelphia." Entry in member directory: "Member Directory". National Academy of Sciences. Retrieved July 4, 2021.
  11. Columbia University. "Introduction to Computational Learning Theory". cs.columbia.edu. Retrieved January 9, 2015.