Tim Teitelbaum

Born: Ray Teitelbaum, April 12, 1943 (age 81)
Alma mater: Massachusetts Institute of Technology (SB); Carnegie Mellon University (PhD)
Scientific career
Institutions: Cornell University (1973–present); GrammaTech (1988–2019); Institut National de Recherche en Informatique et en Automatique (INRIA), Rocquencourt, France (1982–83)
Thesis: Minimum Distance Analysis of Syntax Errors in Computer Programs (1975)
Doctoral advisor: Nico Habermann
Doctoral students: Thomas W. Reps, [1] [2] Susan B. Horwitz, [3] Bill Pugh, [4] Yanhong Annie Liu [5]
Website: www.cs.cornell.edu/info/people/tt/tim_teitelbaum.html

(Ray) Tim Teitelbaum (born April 12, 1943, United States) is an American computer scientist known for his early work on integrated development environments (IDEs), syntax-directed editing, and incremental computation. He is Professor Emeritus at Cornell University. A faculty member of the Cornell University Computer Science Department since 1973, he was recognized as an educator for his large-scale teaching of introductory programming and for his mentoring of highly successful graduate students. As a businessman, he is known for co-founding GrammaTech, Inc. and serving as its sole CEO from 1988 to 2019.

Education

Teitelbaum was educated at the Massachusetts Institute of Technology (SB) and Carnegie Mellon University, where he received his PhD in 1975 under Nico Habermann. [1]

Career and research

In 1978, Teitelbaum created the Cornell Program Synthesizer, one of the seminal systems that demonstrated the power of tightly integrating a collection of program development tools, all deeply knowledgeable about a programming language and its semantics, into one unified framework. The Cornell Program Synthesizer used PL/CS, a variant of the PL/C language dialect developed at Cornell. [6] In more than 45 lectures and demonstrations of this early IDE during 1979–82, and in the credo of his 1981 paper [6] co-authored with graduate student Thomas Reps, Teitelbaum asserted:

Programs are not text; they are hierarchical compositions of computational structures and should be edited, executed, and debugged in an environment that consistently acknowledges and reinforces this viewpoint.

This was followed in 1984 by the Synthesizer Generator, also developed in collaboration with Reps, which generated a language-specific Program Synthesizer from an attribute-grammar specification of the language. [7]
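
As a rough, purely illustrative sketch of the idea (not the Synthesizer Generator's actual specification language or implementation), the following Python fragment drives a tiny structure-editing kernel from a table of attribute equations: each production carries an equation for a synthesized attribute, and the attribute is recomputed over the syntax tree whenever a subtree is replaced. All names (Node, replace_subtree, ATTRIBUTE_EQUATIONS) are hypothetical.

```python
# Minimal sketch: a syntax tree whose nodes carry a synthesized attribute
# ("val") defined by per-production attribute equations, in the spirit of
# attribute-grammar-based editor generators.  Hypothetical names throughout.

ATTRIBUTE_EQUATIONS = {
    # production -> equation computing the node's "val" from its children
    "Num":   lambda node: node.token,
    "Plus":  lambda node: node.children[0].val + node.children[1].val,
    "Times": lambda node: node.children[0].val * node.children[1].val,
}

class Node:
    def __init__(self, production, children=(), token=None):
        self.production = production
        self.children = list(children)
        self.token = token
        self.val = None

    def evaluate(self):
        """Recompute the synthesized attribute bottom-up over the whole tree."""
        for child in self.children:
            child.evaluate()
        self.val = ATTRIBUTE_EQUATIONS[self.production](self)
        return self.val

def replace_subtree(root, parent, index, new_subtree):
    """One structure-editing step: splice in a new subtree, then re-evaluate."""
    parent.children[index] = new_subtree
    return root.evaluate()

# Usage: build (1 + 2) * 3, then edit the "2" into a "5".
one, two, three = Node("Num", token=1), Node("Num", token=2), Node("Num", token=3)
plus = Node("Plus", [one, two])
root = Node("Times", [plus, three])
print(root.evaluate())                                        # 9
print(replace_subtree(root, plus, 1, Node("Num", token=5)))   # 18
```

A generator in this spirit would accept the equation table as input and emit the editor; a production system would also re-evaluate only the attributes affected by an edit rather than the whole tree.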

Motivated by the importance of immediate feedback in interactive systems such as IDEs, Teitelbaum focused his research in the 1980s and 1990s on the problem of incremental computation:

Given a program P written in language L, and the result of executing P on input x, how can one efficiently determine the result of running P on input x′, where the difference between x and x′ is some small increment x′ − x?

In a body of work with his graduate students, Teitelbaum investigated this problem for a range of languages L that included attribute grammars, SQL, first-order functional languages, and the lambda calculus. In addition to incremental evaluation methods, the work also included program transformation methods, i.e., the automatic derivation from P of an incremental program P′ that, given the previous result P(x), the increment x′ − x, and auxiliary information retained from earlier executions, efficiently computes the same result as executing P on input x′.
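
As a toy example of the flavor of such derived incremental programs (hypothetical, and not taken from Teitelbaum's papers), let P compute the sum of squares of a list; an incremental P′ can reuse the previous result and adjust it for a single changed element instead of recomputing from scratch.

```python
# P: the original (batch) program.
def sum_of_squares(xs):
    return sum(x * x for x in xs)

# P': a hand-derived incremental version.  Given the previous result P(x)
# and a small change (one element goes from old_value to new_value), it
# produces the new result in O(1) rather than re-running P in O(n).
def sum_of_squares_inc(prev_result, old_value, new_value):
    return prev_result - old_value * old_value + new_value * new_value

# Usage: change xs[2] from 3 to 7.
xs = [1, 2, 3, 4]
r = sum_of_squares(xs)                    # 30
r_new = sum_of_squares_inc(r, xs[2], 7)   # 30 - 9 + 49 = 70
xs[2] = 7
assert r_new == sum_of_squares(xs)
```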

Teitelbaum's work at GrammaTech focused on the design and implementation of tools that help make software safer and more secure. [8] Techniques include static program analysis and dynamic program analysis of both source code and machine code.
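
GrammaTech's actual analyses are not described here, but as a generic, hedged illustration of what a source-level static analysis looks like, the Python sketch below walks a program's abstract syntax tree and flags calls to eval, a common code-injection hazard. It is purely illustrative and is not based on any GrammaTech product.

```python
# A toy source-level static analysis: walk a program's AST and report calls
# to eval(), a common code-injection hazard.  Illustrative only.
import ast

SOURCE = """
user_input = input()
result = eval(user_input)   # should be flagged
print(len(user_input))      # should not be flagged
"""

class EvalCallFinder(ast.NodeVisitor):
    def __init__(self):
        self.findings = []

    def visit_Call(self, node):
        # Direct calls such as eval(...) have an ast.Name as their func.
        if isinstance(node.func, ast.Name) and node.func.id == "eval":
            self.findings.append(node.lineno)
        self.generic_visit(node)

finder = EvalCallFinder()
finder.visit(ast.parse(SOURCE))
for lineno in finder.findings:
    print(f"possible unsafe eval() call at line {lineno}")
```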

Awards and honors

Teitelbaum was co-recipient of the Association for Computing Machinery SIGSOFT Retrospective Impact Paper Award (2010) for his 1984 paper [7] co-authored with Thomas Reps on the Synthesizer Generator. [9] [10]

References

  1. Tim Teitelbaum at the Mathematics Genealogy Project.
  2. "Home Page of Prof. Thomas W. Reps".
  3. "Susan B. Horwitz".
  4. "Bill Pugh".
  5. "Yanhong Annie Liu".
  6. Teitelbaum, T.; Reps, T. (September 1981). "The Cornell Program Synthesizer: A syntax-directed programming environment". Communications of the ACM. 24 (9): 563–573. doi:10.1145/358746.358755. S2CID 14317073.
  7. Reps, Thomas; Teitelbaum, Tim (1984). "The synthesizer generator". Proceedings of the First ACM SIGSOFT/SIGPLAN Software Engineering Symposium on Practical Software Development Environments (SDE 1). pp. 42–48. doi:10.1145/800020.808247. ISBN 0897911318. S2CID 18641509.
  8. "GrammaTech".
  9. Reps, Thomas W.; Teitelbaum, Tim (1989). The Synthesizer Generator. doi:10.1007/978-1-4613-9623-9. ISBN 978-1-4613-9625-3. S2CID 28068694.
  10. Reps, Thomas W.; Teitelbaum, Tim (1989). The Synthesizer Generator Reference Manual. doi:10.1007/978-1-4613-9633-8. ISBN 978-0-387-96910-7. S2CID 19706490.