Alfred Aho | |
---|---|
Born | Alfred Vaino Aho, August 9, 1941 |
Nationality | Canadian American |
Alma mater | University of Toronto; Princeton University |
Known for | Compilers, AWK, Aho–Corasick algorithm |
Awards | Turing Award (2020); IEEE John von Neumann Medal |
Scientific career | |
Fields | Computer science |
Institutions | Columbia University |
Thesis | Indexed Grammars: An Extension of Context Free Grammars (1968) |
Doctoral advisor | John Hopcroft [1] |
Alfred Vaino Aho (born August 9, 1941) is a Canadian computer scientist best known for his work on programming languages, compilers, and related algorithms, and his textbooks on the art and science of computer programming. [2] [3] [4]
Aho was elected into the National Academy of Engineering in 1999 for his contributions to the fields of algorithms and programming tools.
He and his long-time collaborator Jeffrey Ullman are the recipients of the 2020 Turing Award, generally recognized as the highest distinction in computer science. [5]
Aho received a B.A.Sc. (1963) in Engineering Physics from the University of Toronto, then an M.A. (1965) and Ph.D. (1967) in Electrical Engineering/Computer Science from Princeton University. [6] He conducted research at Bell Labs from 1967 to 1991, and again from 1997 to 2002 as Vice President of the Computing Sciences Research Center. [7] Since 1995, he has held the Lawrence Gussman Professorship in Computer Science at Columbia University. He served as chair of the department from 1995 to 1997, and again in the spring of 2003. [8]
In his PhD thesis Aho created indexed grammars [9] and the nested-stack automaton [10] as vehicles for extending the power of context-free languages while retaining many of their decidability and closure properties. One application of indexed grammars is modelling parallel rewriting systems, [11] particularly in biological applications. [12]
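For illustration, the classic non-context-free language { aⁿbⁿcⁿ : n ≥ 1 } is generated by a small indexed grammar. The sketch below uses the standard notation in which each nonterminal carries a stack of index symbols; the index names f and g are chosen here for the example and are not taken from Aho's thesis.

```latex
% Indexed grammar for { a^n b^n c^n : n >= 1 }; f and g are index symbols.
\begin{align*}
S   &\to T\,g            &  T   &\to T\,f \mid A\,B\,C \\
A\,f &\to a\,A           &  A\,g &\to a \\
B\,f &\to b\,B           &  B\,g &\to b \\
C\,f &\to c\,C           &  C\,g &\to c
\end{align*}
```

Each nonterminal introduced by a production inherits the index stack of the nonterminal it replaces, so A, B and C all receive the same stack f…fg and are forced to emit the same number of terminal symbols.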
After graduating from Princeton, Aho joined the Computing Sciences Research Center at Bell Labs, where he devised efficient regular-expression and string-pattern-matching algorithms that he implemented in the first versions of the Unix tools egrep and fgrep. The fgrep algorithm has become known as the Aho–Corasick algorithm; it is used by several bibliographic search systems, including the one developed by Margaret J. Corasick, and by other string-searching applications. [13]
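The construction behind fgrep can be sketched briefly: build a trie of the keywords, add failure links with a breadth-first pass, and scan the text once. The following Python sketch is illustrative only; the function and variable names are invented here, and this is not the Unix fgrep source.

```python
from collections import deque

def aho_corasick_matches(text, patterns):
    """Return (start_position, pattern) pairs for every occurrence of any pattern.

    A sketch of the classic construction: build the keyword trie, add failure
    links with a breadth-first pass, then scan the text once.
    """
    goto = [{}]   # goto[node]: character -> child node
    out = [[]]    # out[node]: patterns that end at this node
    for pat in patterns:                          # 1. build the keyword trie
        node = 0
        for ch in pat:
            if ch not in goto[node]:
                goto[node][ch] = len(goto)
                goto.append({})
                out.append([])
            node = goto[node][ch]
        out[node].append(pat)

    fail = [0] * len(goto)                        # 2. breadth-first failure links
    queue = deque(goto[0].values())
    while queue:
        node = queue.popleft()
        for ch, child in goto[node].items():
            queue.append(child)
            f = fail[node]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[child] = goto[f].get(ch, 0)
            out[child] += out[fail[child]]        # inherit matches via the link

    matches, node = [], 0                         # 3. single left-to-right scan
    for i, ch in enumerate(text):
        while node and ch not in goto[node]:
            node = fail[node]
        node = goto[node].get(ch, 0)
        for pat in out[node]:
            matches.append((i - len(pat) + 1, pat))
    return matches

print(aho_corasick_matches("ushers", ["he", "she", "his", "hers"]))
# [(1, 'she'), (2, 'he'), (2, 'hers')]
```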
At Bell Labs, Aho worked closely with Steve Johnson and Jeffrey Ullman to develop efficient algorithms for analyzing and translating programming languages. [14] Johnson used bottom-up LALR parsing algorithms to create the syntax-analyzer generator yacc, [15] and Michael E. Lesk and Eric Schmidt used Aho's regular-expression pattern-matching algorithms to create the lexical-analyzer generator lex. [16] The lex and yacc tools and their derivatives have been used to develop the front ends of many of today's programming-language compilers. [17]
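The lex idea, turning a list of regular expressions into a scanner, can be approximated in a few lines. The Python sketch below uses the standard library's re module; the token names and input are made up for the example and are not drawn from any of the tools above.

```python
import re

# A lex-style specification: (token name, regular expression) pairs.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

# Combine the rules into one alternation with named groups, as a lexer
# generator would, and scan the input left to right.
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(source):
    """Yield (token name, lexeme) pairs; whitespace is skipped."""
    for match in MASTER_RE.finditer(source):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()

print(list(tokenize("rate = rate + 60")))
# [('IDENT', 'rate'), ('OP', '='), ('IDENT', 'rate'), ('OP', '+'), ('NUMBER', '60')]
```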
Aho and Ullman wrote a series of textbooks on compiling techniques that codified the theory relevant to compiler design. Their 1977 textbook Principles of Compiler Design had a green dragon on the front cover and became known as "the green dragon book". In 1986 Aho and Ullman were joined by Ravi Sethi to create a new edition, "the red dragon book" (which was briefly shown in the 1995 movie Hackers), and in 2006 they were further joined by Monica Lam to create "the purple dragon book". The dragon books are used in university courses as well as for industry reference. [18]
In 1974, Aho, John Hopcroft, and Ullman wrote The Design and Analysis of Computer Algorithms, [19] codifying some of their early research on algorithms. This book became one of the most highly cited books in computer science for several decades and helped to stimulate the creation of algorithms and data structures as a central course in the computer science curriculum. [20]
Aho is also widely known for his co-authorship of the AWK programming language with Peter J. Weinberger and Brian Kernighan (the "A" stands for "Aho"). [21] As of 2010, Aho's research interests include programming languages, compilers, algorithms, and quantum computing. He is part of the Language and Compilers research group at Columbia University. [22]
Overall, his works have been cited 81,040 times and he has an h-index of 66, as of May 8, 2019. [23]
Aho has received many prestigious honors, including the IEEE's John von Neumann Medal and membership in the National Academy of Engineering and the National Academy of Sciences. He was elected a Fellow of the American Academy of Arts and Sciences in 2003. [24] He holds honorary doctorates from the University of Waterloo, [25] from the University of Helsinki, [25] and from the University of Toronto. [26] He is a Fellow of the American Association for the Advancement of Science, ACM, Bell Labs, and IEEE. [20]
Aho has twice served as chair of the Advisory Committee for the Computer and Information Science and Engineering Directorate of the National Science Foundation. He is a past president of the ACM Special Interest Group on Algorithms and Computability Theory. [27] Aho, Hopcroft, and Ullman were co-recipients of the 2017 C&C Prize awarded by NEC Corporation. [28] He and Ullman were named recipients of the 2020 Turing Award on March 31, 2021. [5]
Aho has taught at Columbia University in New York City since 1995. He won the Great Teacher Award from the Society of Columbia Graduates in 2003. [29] [30]
A finite-state machine (FSM) or finite-state automaton, finite automaton, or simply a state machine, is a mathematical model of computation. It is an abstract machine that can be in exactly one of a finite number of states at any given time. The FSM can change from one state to another in response to some inputs; the change from one state to another is called a transition. An FSM is defined by a list of its states, its initial state, and the inputs that trigger each transition. Finite-state machines are of two types—deterministic finite-state machines and non-deterministic finite-state machines. For any non-deterministic finite-state machine, an equivalent deterministic one can be constructed.
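The equivalence claim is usually proved by the subset construction, which treats each set of reachable NFA states as a single DFA state. A minimal Python sketch, assuming an NFA without epsilon moves and using an invented toy example:

```python
def nfa_to_dfa(alphabet, delta, start, accept):
    """Subset construction for an NFA without epsilon moves.

    delta maps (state, symbol) to a set of successor states; the returned DFA
    uses frozensets of NFA states as its states.
    """
    dfa_start = frozenset([start])
    dfa_delta, dfa_accept = {}, set()
    worklist, seen = [dfa_start], {dfa_start}
    while worklist:
        current = worklist.pop()
        if current & accept:
            dfa_accept.add(current)
        for symbol in alphabet:
            target = frozenset(s for q in current for s in delta.get((q, symbol), ()))
            dfa_delta[(current, symbol)] = target
            if target not in seen:
                seen.add(target)
                worklist.append(target)
    return dfa_delta, dfa_start, dfa_accept

# Toy NFA over {0, 1} that accepts strings ending in "01".
nfa_delta = {("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"}, ("q1", "1"): {"q2"}}
dfa_delta, dfa_start, dfa_accept = nfa_to_dfa({"0", "1"}, nfa_delta, "q0", {"q2"})
print(len({state for state, _ in dfa_delta}))   # 3 reachable deterministic states
```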
Niklaus Emil Wirth was a Swiss computer scientist. He designed several programming languages, including Pascal, and pioneered several classic topics in software engineering. In 1984, he won the Turing Award, generally recognized as the highest distinction in computer science, "for developing a sequence of innovative computer languages".
In theoretical computer science and formal language theory, a regular language is a formal language that can be defined by a regular expression, in the strict sense in which that term is used in theoretical computer science.
The ACM A. M. Turing Award is an annual prize given by the Association for Computing Machinery (ACM) for contributions of lasting and major technical importance to computer science. It is generally recognized as the highest distinction in the field of computer science and is often referred to as the "Nobel Prize of Computing".
Compilers: Principles, Techniques, and Tools is a computer science textbook by Alfred V. Aho, Monica S. Lam, Ravi Sethi, and Jeffrey D. Ullman about compiler construction for programming languages. First published in 1986, it is widely regarded as the classic definitive compiler technology text.
Principles of Compiler Design, by Alfred Aho and Jeffrey Ullman, is a classic textbook on compilers for computer programming languages. Both authors won the 2020 Turing Award for their work on compilers.
John Edward Hopcroft is an American theoretical computer scientist. His textbooks on theory of computation and data structures are regarded as standards in their fields. He is a professor emeritus at Cornell University, co-director of the Center on Frontiers of Computing Studies at Peking University, and the director of the John Hopcroft Center for Computer Science at Shanghai Jiao Tong University.
In computer science, top-down parsing is a parsing strategy in which one first looks at the highest level of the parse tree and works down the tree by applying the rewriting rules of a formal grammar. LL parsers are a type of parser that uses a top-down parsing strategy.
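A common way to realize a top-down parser is recursive descent, with one function per nonterminal. The Python sketch below recognizes the toy grammar E -> T ('+' T)*, T -> 'a' | '(' E ')'; the grammar and names are invented for illustration.

```python
def parse_expression(tokens):
    """Recursive-descent recognizer for  E -> T ('+' T)*,  T -> 'a' | '(' E ')'.

    Each nonterminal becomes a function; the parser works top-down from the
    start symbol, consuming tokens left to right with one token of lookahead.
    """
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expect(token):
        nonlocal pos
        if peek() != token:
            raise SyntaxError(f"expected {token!r} at position {pos}")
        pos += 1

    def parse_E():                 # E -> T ('+' T)*
        parse_T()
        while peek() == "+":
            expect("+")
            parse_T()

    def parse_T():                 # T -> 'a' | '(' E ')'
        if peek() == "a":
            expect("a")
        elif peek() == "(":
            expect("(")
            parse_E()
            expect(")")
        else:
            raise SyntaxError(f"unexpected token {peek()!r}")

    parse_E()
    if pos != len(tokens):
        raise SyntaxError("trailing input")
    return True

print(parse_expression(list("(a+a)+a")))   # True: the string is in the language
```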
In computer science, an ambiguous grammar is a context-free grammar for which there exists a string that has more than one leftmost derivation or parse tree. Every non-empty context-free language admits an ambiguous grammar, obtained for example by adding a duplicate rule to any grammar for it. A language that only admits ambiguous grammars is called an inherently ambiguous language. Deterministic context-free grammars are always unambiguous and form an important subclass of unambiguous grammars; there are, however, non-deterministic unambiguous grammars.
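A standard example is the grammar E -> E + E | E * E | a, which gives the string a + a * a two distinct leftmost derivations (and hence two parse trees):

```latex
% Two leftmost derivations of  a + a * a  in the grammar  E -> E + E | E * E | a
\begin{align*}
E &\Rightarrow E + E \Rightarrow a + E \Rightarrow a + E * E
   \Rightarrow a + a * E \Rightarrow a + a * a \\
E &\Rightarrow E * E \Rightarrow E + E * E \Rightarrow a + E * E
   \Rightarrow a + a * E \Rightarrow a + a * a
\end{align*}
```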
Jeffrey David Ullman is an American computer scientist and the Stanford W. Ascherman Professor of Engineering, Emeritus, at Stanford University. His textbooks on compilers, theory of computation, data structures, and databases are regarded as standards in their fields. He and his long-time collaborator Alfred Aho are the recipients of the 2020 Turing Award, generally recognized as the highest distinction in computer science.
The graph isomorphism problem is the computational problem of determining whether two finite graphs are isomorphic.
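To make the problem statement concrete, a brute-force Python sketch that tries every vertex bijection is shown below; it runs in factorial time and is only feasible for very small graphs, and the names are illustrative.

```python
from itertools import permutations

def are_isomorphic(edges_g, edges_h, n):
    """Check whether two simple graphs on vertices 0..n-1 are isomorphic.

    Tries every permutation of the vertices (n! candidates); edge lists
    contain unordered pairs of vertices.
    """
    g = {frozenset(e) for e in edges_g}
    h = {frozenset(e) for e in edges_h}
    if len(g) != len(h):
        return False
    for perm in permutations(range(n)):
        if all(frozenset((perm[u], perm[v])) in h for u, v in g):
            return True
    return False

# A 4-cycle is isomorphic to another 4-cycle with relabelled vertices.
print(are_isomorphic([(0, 1), (1, 2), (2, 3), (3, 0)],
                     [(0, 2), (2, 1), (1, 3), (3, 0)], 4))   # True
```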
Seymour Ginsburg was an American pioneer of automata theory, formal language theory, and database theory in particular, and of computer science in general. His work was influential in distinguishing theoretical computer science from the disciplines of mathematics and electrical engineering.
The computational complexity of various algorithms for common mathematical operations is often summarized in reference tables.
Indexed languages are a class of formal languages discovered by Alfred Aho; they are described by indexed grammars and can be recognized by nested stack automata.
In formal language theory, deterministic context-free languages (DCFL) are a proper subset of context-free languages. They are the context-free languages that can be accepted by a deterministic pushdown automaton. DCFLs are always unambiguous, meaning that they admit an unambiguous grammar. There are non-deterministic unambiguous CFLs, so DCFLs form a proper subset of unambiguous CFLs.
David Gries is an American computer scientist at Cornell University, mainly known for his books The Science of Programming (1981) and A Logical Approach to Discrete Math.
In computer science, the Hunt–Szymanski algorithm, also known as the Hunt–McIlroy algorithm, is a solution to the longest common subsequence problem. It was one of the first non-heuristic algorithms used in diff, which compares a pair of files each represented as a sequence of lines. To this day, variations of this algorithm are found in incremental version control systems, wiki engines, and molecular phylogenetics research software.
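The core idea can be sketched compactly: list, for each element of the first sequence, its matching positions in the second, and reduce the longest common subsequence to a longest increasing subsequence over those positions. The Python sketch below follows that Hunt–Szymanski-style reduction but is not a faithful reproduction of the published algorithm; names are illustrative.

```python
from bisect import bisect_left
from collections import defaultdict

def lcs_length(a, b):
    """Length of the longest common subsequence of sequences a and b.

    For each element of a, its match positions in b are processed in
    decreasing order, and the longest strictly increasing subsequence of
    those positions is tracked with patience sorting.
    """
    positions = defaultdict(list)          # element -> positions in b
    for j, y in enumerate(b):
        positions[y].append(j)

    tails = []                             # tails[k] = smallest end position of an
    for x in a:                            # increasing subsequence of length k + 1
        for j in reversed(positions.get(x, [])):
            k = bisect_left(tails, j)
            if k == len(tails):
                tails.append(j)
            else:
                tails[k] = j
    return len(tails)

# Works on any sequences, e.g. the lines of two files as diff uses it.
print(lcs_length("XMJYAUZ", "MZJAWXU"))    # 4, e.g. the subsequence "MJAU"
```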
In computational complexity theory, a problem is NP-complete when it is both in NP, meaning a proposed solution can be verified in polynomial time, and NP-hard, meaning every problem in NP can be reduced to it in polynomial time.
In computer science, the Method of Four Russians is a technique for speeding up algorithms involving Boolean matrices, or more generally algorithms involving matrices in which each cell may take on only a bounded number of possible values.
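For Boolean matrix multiplication, the technique amounts to precomputing the OR of every subset of a small block of rows once, then reusing those tables for every row of the other matrix. A hedged Python sketch of that table-lookup idea, with rows stored as integer bit sets and sizes and names chosen for illustration:

```python
def boolean_matmul_four_russians(a_rows, b_rows, n, block=4):
    """Boolean product of two n x n matrices given as lists of row bitmasks.

    Four-Russians-style sketch: split the inner dimension into blocks of
    `block` rows of B, precompute the OR of every subset of each block once,
    then each row of A just looks up its bit pattern in the table.
    """
    c_rows = [0] * n
    for start in range(0, n, block):
        width = min(block, n - start)
        # table[s] = OR of the rows of B selected by the bits of s.
        table = [0] * (1 << width)
        for s in range(1, 1 << width):
            low = s & -s                       # lowest set bit of s
            table[s] = table[s ^ low] | b_rows[start + low.bit_length() - 1]
        for i in range(n):
            pattern = (a_rows[i] >> start) & ((1 << width) - 1)
            c_rows[i] |= table[pattern]
    return c_rows

# 3x3 example: rows as bitmasks with bit j meaning column j is 1.
a = [0b011, 0b100, 0b001]
b = [0b110, 0b001, 0b010]
print([bin(row) for row in boolean_matmul_four_russians(a, b, 3)])
# ['0b111', '0b10', '0b110']
```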