Leslie Valiant

Leslie Gabriel Valiant FRS
Valiant in 2012

Born: Leslie Gabriel Valiant, 28 March 1949
Nationality: British
Alma mater: King's College, Cambridge; Imperial College London; University of Warwick
Known for: #P-completeness, [2] probably approximately correct (PAC) learning, holographic algorithms, the bulk synchronous parallel model, and showing that NP is as easy as detecting unique solutions [1]
Awards: Nevanlinna Prize (1986), Knuth Prize (1997), EATCS Award (2008), Turing Award (2010)
Fields: Mathematics, theoretical computer science, computational learning theory, theoretical neuroscience
Institutions: Harvard University
Thesis: Decision Procedures for Families of Deterministic Pushdown Automata (1974)
Doctoral advisor: Mike Paterson [3]
Website: people.seas.harvard.edu/~valiant
Leslie Gabriel Valiant FRS [4] [5] (born 28 March 1949) is a British American [6] computer scientist and computational theorist. [7] [8] He was born to a chemical-engineer father and a translator mother. [9] He is currently the T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics at Harvard University. [10] [11] [12] [13] Valiant was awarded the Turing Award in 2010; the ACM described him as a heroic figure in theoretical computer science and a role model for his courage and creativity in addressing some of the deepest unsolved problems in science, citing in particular his "striking combination of depth and breadth". [6]

Education

Valiant was educated at King's College, Cambridge, [14] [6] Imperial College London, [14] [6] and the University of Warwick where he received a PhD in computer science in 1974. [15] [3]

Research and career

Valiant is world-renowned for his work in theoretical computer science. Among his many contributions to complexity theory, he introduced the notion of #P-completeness ("sharp-P completeness") to explain why enumeration and reliability problems are intractable. He created the "probably approximately correct" (PAC) model of learning, which introduced the field of computational learning theory and became a theoretical basis for the development of machine learning. He also introduced the concept of holographic algorithms, inspired by the quantum computation model.

In computer systems, he is best known for introducing the bulk synchronous parallel (BSP) processing model. Analogous to the von Neumann model for a single computer architecture, BSP has been an influential model for parallel and distributed computing architectures. Google, for example, adopted it for computation at large scale via MapReduce, MillWheel, [16] Pregel [17] and Dataflow, and Facebook built a graph analytics system on it capable of processing over one trillion edges. [18] [19] There are also active open-source projects offering explicit BSP programming as well as other high-performance parallel programming models derived from BSP; popular examples include Hadoop, Spark, Giraph, Hama, Beam and Dask. His earlier work in automata theory includes an algorithm for context-free parsing, which is still the asymptotically fastest known. He also works in computational neuroscience, focusing on understanding memory and learning.
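To make the PAC model concrete: the standard sample-complexity bound for a finite hypothesis class H states that a learner returning any hypothesis consistent with m ≥ (1/ε)(ln |H| + ln(1/δ)) random examples is, with probability at least 1 − δ, within error ε of the target concept. The following Python sketch evaluates this textbook bound (it is an illustration of the general framework, not code from Valiant's work, and the example parameters are hypothetical):

    import math

    def pac_sample_bound(hypothesis_count: int, epsilon: float, delta: float) -> int:
        """For a finite hypothesis class H, a consistent learner is "probably
        approximately correct" (error <= epsilon with probability >= 1 - delta)
        after ceil((ln|H| + ln(1/delta)) / epsilon) examples."""
        return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

    # Example: monotone conjunctions over n = 10 boolean variables (|H| = 2^n),
    # tolerating 5% error with 95% confidence.
    print(pac_sample_bound(2 ** 10, epsilon=0.05, delta=0.05))  # -> 199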

Valiant's 2013 book is Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. [20] In it he argues, among other things, that evolutionary biology does not explain the rate at which evolution occurs, writing, for example, "The evidence for Darwin's general schema for evolution being essentially correct is convincing to the great majority of biologists. This author has been to enough natural history museums to be convinced himself. All this, however, does not mean the current theory of evolution is adequately explanatory. At present the theory of evolution can offer no account of the rate at which evolution progresses to develop complex mechanisms or to maintain them in changing environments."

Valiant started teaching at Harvard University in 1982 and is currently the T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the Harvard School of Engineering and Applied Sciences. Prior to 1982 he taught at Carnegie Mellon University, the University of Leeds, and the University of Edinburgh.

Awards and honors

Valiant received the Nevanlinna Prize in 1986, the Knuth Prize in 1997, the EATCS Award in 2008, [21] and the Turing Award in 2010. [22] [23] He was elected a Fellow of the Royal Society (FRS) in 1991, [4] a Fellow of the Association for the Advancement of Artificial Intelligence (AAAI) in 1992, [24] and a member of the United States National Academy of Sciences in 2001. [25] Valiant's nomination for the Royal Society reads:

Leslie Valiant has contributed in a decisive way to the growth of theoretical computer science. His work is concerned mainly with quantifying mathematically the resource costs of solving problems on a computer. In early work (1975), he found the asymptotically fastest algorithm known for recognising context-free languages. At the same time, he pioneered the use of communication properties of graphs for analysing computations. In 1977, he defined the notion of ‘sharp-P’ (#P)-completeness and established its utility in classifying counting or enumeration problems according to computational tractability. The first application was to counting matchings (the matrix permanent function). In 1984, Leslie introduced a definition of inductive learning that, for the first time, reconciles computational feasibility with the applicability to nontrivial classes of logical rules to be learned. This notion, later called ‘probably approximately correct learning’, became a theoretical basis for the development of machine learning. In 1989, he formulated the concept of bulk synchronous computation as a unifying principle for parallel computation. Leslie received the Nevanlinna Prize in 1986, and the Turing Award in 2010. [26]
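The permanent mentioned in the nomination is the canonical #P-complete problem: for a 0/1 matrix it counts the perfect matchings of the associated bipartite graph, and Valiant (1979) showed that computing it is #P-complete even though deciding whether a single matching exists is easy. A small illustrative Python sketch (not code from Valiant's work) computes the permanent with Ryser's inclusion-exclusion formula, still essentially the best known exponential-time approach:

    from itertools import combinations

    def permanent(a):
        """Matrix permanent via Ryser's formula, O(2^n * n^2) arithmetic operations.
        For a 0/1 biadjacency matrix this counts perfect matchings, the problem
        Valiant proved #P-complete."""
        n = len(a)
        total = 0
        for r in range(1, n + 1):
            for cols in combinations(range(n), r):
                prod = 1
                for row in a:
                    prod *= sum(row[j] for j in cols)
                total += (-1) ** r * prod
        return (-1) ** n * total

    # K_{3,3} (all-ones 3x3 biadjacency matrix) has 3! = 6 perfect matchings.
    print(permanent([[1, 1, 1]] * 3))  # -> 6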

The citation for his A.M. Turing Award reads:

For transformative contributions to the theory of computation, including the theory of probably approximately correct (PAC) learning, the complexity of enumeration and of algebraic computation, and the theory of parallel and distributed computing. [6]

Personal life

His two sons, Gregory Valiant [27] and Paul Valiant, [28] are both theoretical computer scientists. [8]

Related Research Articles

Computer science

Computer science is the study of computation, information, and automation. It spans disciplines ranging from the theoretical, such as algorithms and the theory of computation, to the applied, such as hardware and software design. Though more often considered an academic discipline, computer science is closely related to computer programming.

In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by the mechanical application of mathematical steps, such as an algorithm.

Discrete mathematics

Discrete mathematics is the study of mathematical structures that can be considered "discrete" rather than "continuous". Objects studied in discrete mathematics include integers, graphs, and statements in logic. By contrast, discrete mathematics excludes topics in "continuous mathematics" such as real numbers, calculus or Euclidean geometry. Discrete objects can often be enumerated by integers; more formally, discrete mathematics has been characterized as the branch of mathematics dealing with countable sets. However, there is no exact definition of the term "discrete mathematics".

The #P-complete problems form a complexity class in computational complexity theory. The problems in this complexity class are defined by having the following two properties: the problem is in #P, the class of counting problems associated with polynomial-time nondeterministic Turing machines; and every other problem in #P can be reduced to it by a polynomial-time counting reduction.

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well-known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

Richard M. Karp

Richard Manning Karp is an American computer scientist and computational theorist at the University of California, Berkeley. He is most notable for his research in the theory of algorithms, for which he received the Turing Award in 1985, the Benjamin Franklin Medal in Computer and Cognitive Science in 2004, and the Kyoto Prize in 2008.

Theoretical computer science

Theoretical computer science (TCS) is a subset of general computer science and mathematics that focuses on mathematical aspects of computer science such as the theory of computation, lambda calculus, and type theory.

Algorithmic learning theory is a mathematical framework for analyzing machine learning problems and algorithms. Synonyms include formal learning theory and algorithmic inductive inference. Algorithmic learning theory is different from statistical learning theory in that it does not make use of statistical assumptions and analysis. Both algorithmic and statistical learning theory are concerned with machine learning and can thus be viewed as branches of computational learning theory.

Computational learning theory

In computer science, computational learning theory is a subfield of artificial intelligence devoted to studying the design and analysis of machine learning algorithms.

Solomonoff's theory of inductive inference is a mathematical theory of induction introduced by Ray Solomonoff, based on probability theory and theoretical computer science. In essence, Solomonoff's induction derives the posterior probability of any computable theory, given a sequence of observed data. This posterior probability is derived from Bayes' rule and some universal prior, that is, a prior that assigns a positive probability to any computable theory.

Concurrency (computer science)

In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order without affecting the outcome. This allows for parallel execution of the concurrent units, which can significantly improve overall execution speed in multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially ordered components or units of computation.

Vijay Vazirani

Vijay Virkumar Vazirani is an Indian American distinguished professor of computer science in the Donald Bren School of Information and Computer Sciences at the University of California, Irvine.

Knuth Prize

The Donald E. Knuth Prize is a prize for outstanding contributions to the foundations of computer science, named after the American computer scientist Donald E. Knuth.

In theoretical computer science, a pointer machine is an atomistic abstract computational machine model akin to the random-access machine. A pointer algorithm could also be an algorithm restricted to the pointer machine model.

The bulk synchronous parallel (BSP) abstract computer is a bridging model for designing parallel algorithms. It is similar to the parallel random access machine (PRAM) model, but unlike PRAM, BSP does not take communication and synchronization for granted. In fact, quantifying the requisite synchronization and communication is an important part of analyzing a BSP algorithm.
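In the BSP model, a computation proceeds in supersteps, and a superstep's cost is conventionally w + g·h + l, where w is the maximum local work on any processor, h the maximum number of messages any processor sends or receives (an "h-relation"), g the network's cost per message, and l the barrier-synchronization latency. A minimal Python sketch of this standard cost model (the machine parameters and superstep values below are hypothetical):

    def bsp_cost(supersteps, g, l):
        """Total cost of a BSP computation: the sum over supersteps of
        (max local work w) + g * (max messages per processor h) + (barrier latency l).
        Each superstep is given as a pair (w, h)."""
        return sum(w + g * h + l for (w, h) in supersteps)

    # Three hypothetical supersteps on a machine with g = 4 and l = 50.
    print(bsp_cost([(1000, 20), (500, 80), (2000, 10)], g=4, l=50))  # -> 4090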

In computability theory, super-recursive algorithms are a generalization of ordinary algorithms that are more powerful, that is, compute more than Turing machines. The term was introduced by Mark Burgin, whose book "Super-recursive algorithms" develops their theory and presents several mathematical models. Turing machines and other mathematical models of conventional algorithms allow researchers to find properties of recursive algorithms and their computations. In a similar way, mathematical models of super-recursive algorithms, such as inductive Turing machines, allow researchers to find properties of super-recursive algorithms and their computations.

The Annual ACM Symposium on Theory of Computing (STOC) is an academic conference in the field of theoretical computer science. STOC has been organized annually since 1969, typically in May or June; the conference is sponsored by the Association for Computing Machinery special interest group SIGACT. The acceptance rate of STOC, averaged from 1970 to 2012, is 31%; in 2012 it was 29%.

Information Processing Letters

Information Processing Letters is a peer-reviewed scientific journal in the field of computer science, published by Elsevier. The aim of the journal is to enable fast dissemination of results in the field of information processing in the form of short papers. Submissions are limited to nine double-spaced pages.

Michael Justin Kearns is an American computer scientist, professor and National Center Chair at the University of Pennsylvania, founding director of Penn's Singh Program in Networked & Social Systems Engineering (NETS), and founding director of the Warren Center for Network and Data Sciences; he also holds secondary appointments in Penn's Wharton School and department of Economics. He is a leading researcher in computational learning theory and algorithmic game theory, with interests in machine learning, artificial intelligence, computational finance, algorithmic trading, computational social science, and social networks. He previously led the Advisory and Research function in Morgan Stanley's Artificial Intelligence Center of Excellence team and is currently an Amazon Scholar within Amazon Web Services.

References

  1. Valiant, L.; Vazirani, V. (1986). "NP is as easy as detecting unique solutions" (PDF). Theoretical Computer Science. 47: 85–93. doi:10.1016/0304-3975(86)90135-0.
  2. Valiant, L. G. (1979). "The Complexity of Enumeration and Reliability Problems". SIAM Journal on Computing. 8 (3): 410–421. doi:10.1137/0208032.
  3. Leslie Valiant at the Mathematics Genealogy Project.
  4. "Leslie Valiant FRS". London: Royal Society. 1991.
  5. DServe Archive Catalog Show.
  6. "Leslie G. Valiant - A.M. Turing Award Laureate". A.M. Turing Award. Retrieved 9 January 2019.
  7. Hoffmann, L. (2011). "Q&A: Leslie Valiant discusses machine learning, parallel computing, and computational neuroscience". Communications of the ACM. 54 (6): 128. doi:10.1145/1953122.1953152.
  8. Anon (2017). "Valiant, Prof. Leslie Gabriel". Who's Who (online Oxford University Press ed.). Oxford: A & C Black. doi:10.1093/ww/9780199540884.013.U40928. (Subscription or UK public library membership required.)
  9. "A. M. Turing Award Oral History Interview with Leslie Gabriel Valiant" (PDF).
  10. Leslie Valiant author profile page at the ACM Digital Library.
  11. Wigderson, A. (2009). "The work of Leslie Valiant". Proceedings of the 41st Annual ACM Symposium on Theory of Computing - STOC '09. pp. 1–2. doi:10.1145/1536414.1536415. ISBN 9781605585062. S2CID 15370663.
  12. Leslie G. Valiant at the DBLP Bibliography Server.
  13. Valiant, Leslie (1984). "A theory of the learnable" (PDF). Communications of the ACM. 27 (11): 1134–1142. doi:10.1145/1968.1972. S2CID 12837541.
  14. "CV of Leslie G. Valiant" (PDF). Harvard University. Retrieved 9 January 2019.
  15. Valiant, Leslie (1973). Decision Procedures for Families of Deterministic Pushdown Automata. warwick.ac.uk (PhD thesis). University of Warwick. OCLC 726087468. EThOS uk.bl.ethos.475930.
  16. "MillWheel: Fault-Tolerant Stream Processing at Internet Scale".
  17. "Pregel: A System for Large-Scale Graph Processing".
  18. "A Comparison of State-of-the-Art Graph Processing Systems".
  19. "One Trillion Edges: Graph Processing at Facebook-Scale".
  20. Valiant, Leslie (2013). Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. Basic Books. ISBN 9780465032716. https://www.hachettebookgroup.com/titles/leslie-valiant/probably-approximately-correct/9780465037902/?lens=basic-books
  21. David Peleg. "The EATCS Award 2008 – Laudatio for Professor Leslie Valiant". European Association for Theoretical Computer Science.
  22. Josh Fishman. "'Probably Approximately Correct' Inventor, From Harvard U., Wins Turing Award". Chronicle of Higher Education, 9 March 2011.
  23. "ACM Turing Award Goes to Innovator in Machine Learning". ACM Computing News.
  24. "Elected AAAI Fellows". Association for the Advancement of Artificial Intelligence.
  25. "Member Directory: Leslie G. Valiant". National Academy of Sciences.
  26. "Leslie Valiant". Royal Society. https://royalsociety.org/people/leslie-valiant-12451/
  27. Gregory Valiant homepage.
  28. Paul Valiant's homepage.

This article incorporates text available under the CC BY 4.0 license.