John Cocke

Born: May 30, 1925
Died: July 16, 2002 (aged 77)
Alma mater: Duke University
Known for: RISC, CYK algorithm
Awards: ACM Turing Award (1987), Computer Pioneer Award (1989), National Medal of Technology (1991), National Medal of Science (1994), IEEE John von Neumann Medal (1994), Computer History Museum Fellow (2002)
Fields: Computer science
Institutions: IBM

John Cocke (May 30, 1925 – July 16, 2002) was an American computer scientist recognized for his major contributions to computer architecture and optimizing compiler design. He is considered by many to be "the father of RISC architecture." [1]

He attended Duke University, where he received his bachelor's degree in mechanical engineering in 1946 and his Ph.D. in mathematics in 1956. Cocke spent his entire career, from 1956 to 1992, as an industrial researcher for IBM.

His innovations were perhaps most evident in the IBM 801 minicomputer project, where he realized that matching the design of the architecture's instruction set to the relatively simple instructions actually emitted by compilers could deliver high performance at low cost.

He is one of the inventors of the CYK algorithm (C for Cocke). He was also involved in the pioneering speech recognition and machine translation work at IBM in the 1970s and 1980s, and is credited by Frederick Jelinek with originating the idea of using a trigram language model for speech recognition. [2]
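The CYK algorithm decides whether a string belongs to the language of a context-free grammar by dynamic programming: it fills a triangular table recording, for every substring of the input, the set of nonterminals that can derive it, assuming the grammar is in Chomsky normal form. A minimal sketch in Python (the toy grammar and the function name are illustrative, not taken from the source):

```python
def cyk(words, grammar, start="S"):
    """Membership test via CYK for a grammar in Chomsky normal form.

    `grammar` maps each nonterminal to a list of productions, where a
    production is either a terminal string or a pair (B, C) of nonterminals.
    Returns True if `start` derives the whole sequence `words`.
    """
    n = len(words)
    # table[(i, length)] = nonterminals deriving words[i : i + length]
    table = {}
    for i, w in enumerate(words):
        table[(i, 1)] = {A for A, rhss in grammar.items()
                         for rhs in rhss if rhs == w}
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            cell = set()
            # Try every split point of the substring into two halves.
            for split in range(1, length):
                left = table[(i, split)]
                right = table[(i + split, length - split)]
                for A, rhss in grammar.items():
                    for rhs in rhss:
                        if (isinstance(rhs, tuple)
                                and rhs[0] in left and rhs[1] in right):
                            cell.add(A)
            table[(i, length)] = cell
    return start in table[(0, n)]

# Toy grammar in Chomsky normal form (hypothetical example).
grammar = {
    "S": [("NP", "VP")],
    "NP": ["she", ("Det", "N")],
    "VP": [("V", "NP"), "runs"],
    "Det": ["the"],
    "N": ["dog"],
    "V": ["sees"],
}
```

The table is cubic in the input length, which made CYK one of the first polynomial-time parsing algorithms for general context-free grammars; practical parsers also store backpointers in each cell to recover parse trees.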

Cocke was appointed an IBM Fellow in 1972. He won the Eckert–Mauchly Award in 1985, the ACM Turing Award in 1987, [3] the National Medal of Technology in 1991, the National Medal of Science in 1994, [4] [5] the IEEE John von Neumann Medal in 1994, The Franklin Institute's Certificate of Merit in 1996, the Seymour Cray Computer Engineering Award in 1999, and the Benjamin Franklin Medal in 2000.

In 2002, he was made a Fellow of the Computer History Museum "for his development and implementation of reduced instruction set computer architecture and program optimization technology." [6]

He was born in Charlotte, North Carolina and died in Valhalla, New York.



  1. Schofield, Jack (2002-07-27). "John Cocke". The Guardian. Guardian Media Group. Retrieved 2011-05-10. Cocke's idea was to use fewer instructions, but design chips that performed simple instructions very quickly. [...] Later, this approach became known as reduced instruction set computing (Risc) [...]
  2. Jelinek, Frederick, "The Dawn of Statistical ASR and MT", Computational Linguistics, 35(4), 2009, pp. 483–494, doi:10.1162/coli.2009.35.4.35401
  3. John Cocke, The search for performance in scientific processors: the Turing Award lecture. Communications of the ACM, Volume 31 Issue 3, March 1988, Pages 250-253. doi:10.1145/42392.42394
  4. "National Science Foundation - The President's National Medal of Science". Retrieved 2014-06-19.
  5. "John Cocke". Retrieved 21 December 2015.
  6. "John Cocke". Computer History Museum. Archived from the original on 2013-05-09. Retrieved 2013-05-23.