| John Darlington | |
| --- | --- |
| Nationality | British |
| Occupation(s) | Academic, researcher and author |
| Academic background | |
| Education | B.Sc. (Econ) (1969); Ph.D., Artificial Intelligence (1973) |
| Alma mater | London School of Economics; University of Edinburgh |
| Academic work | |
| Institutions | University of Edinburgh; Imperial College London |
John Darlington is a British academic, researcher and author. He is an Emeritus Professor at Imperial College London. He was Director of the London e-Science Centre and was head of the Functional Programming and Social Computing Sections at Imperial. [1]
Darlington is known for his early work on program transformation and functional programming. In his 1972 thesis, Darlington introduced the idea of program transformation, i.e. manipulating programs into alternative forms, preserving their semantics while altering their operational characteristics. [2]
Darlington completed his B.Sc. (Econ) at the London School of Economics in 1969 and his Ph.D. in Artificial Intelligence at the Department of Machine Intelligence, University of Edinburgh, in 1973. He was a Research Fellow at the University of Edinburgh from 1973 to 1977, and later a Visiting Research Fellow at IBM Yorktown Heights and the Stanford Research Institute. [1]
In 1977, Darlington joined Imperial College as a lecturer in the Department of Computing, becoming a Reader in 1982 and a full Professor in 1985. At Imperial College he directed several centres aimed at developing applications of parallel and novel computer architectures: the Imperial College/Fujitsu Parallel Computing Research Centre (1994-2000), the Imperial College Parallel Computing Centre (1996-2002), the London e-Science Centre (2002-2005) and the Imperial College Internet Centre (2005-2008). [1]
In 2015, Darlington became an Emeritus Professor at Imperial College. He retired in 2016, aged 69. [1]
Darlington is known for his early work on program transformation and functional programming. In his 1972 thesis, Darlington introduced the idea of program transformation, i.e. manipulating programs into alternative forms, preserving their semantics while altering their operational characteristics. [3] In subsequent work with his supervisor, Rod Burstall, Darlington developed the unfold/fold calculus for program transformation. [4] This system of six rewrite rules has become classic and forms the basis of a great deal of work in many areas that continues to this day. [5] From this work, Burstall and Darlington introduced a novel functional language, NPL, based on Kleene recursion equations, which made an early contribution to the development of the multi-equational, pattern-matching style of pure functional programming. [6]
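The flavour of the unfold/fold calculus can be conveyed with the classic Fibonacci example. The following Haskell sketch is illustrative only (it is not taken from the cited papers, and NPL rather than Haskell was the original setting): a new "eureka" definition is introduced, unfolded against the original equations, and the recursive calls folded back, turning an exponential-time program into a linear-time one with the same semantics.

```haskell
-- Original definition: exponential time, since fib (n - 1) and
-- fib (n - 2) recompute shared subproblems.
fib :: Integer -> Integer
fib 0 = 1
fib 1 = 1
fib n = fib (n - 1) + fib (n - 2)

-- Definition rule ("eureka" step): introduce a tupled auxiliary
--   fibPair n = (fib (n + 1), fib n)
-- then derive its equations by instantiation, unfolding and folding.
fibPair :: Integer -> (Integer, Integer)
fibPair 0 = (1, 1)               -- instantiate at 0 and unfold fib
fibPair n = (a + b, a)           -- unfold fib (n + 1), then fold the
  where (a, b) = fibPair (n - 1) -- recursive calls back into fibPair

-- The transformed program: linear time, semantically equal to fib.
fib' :: Integer -> Integer
fib' = snd . fibPair
```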
Darlington was an early proponent of functional programming languages and the declarative approach in general. He founded and led the Functional Programming Section in the Department of Computing at Imperial College in 1977, served on IFIP Working Group 2.8 and led the development of Hope+, an extension of Hope, which itself was the successor language to NPL. [7] This early work helped pave the way for later developments such as Haskell.
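The multi-equational, pattern-matching style that NPL and Hope pioneered survives directly in Haskell. A small, hypothetical example (not drawn from the Hope+ sources):

```haskell
-- An algebraic data type plus one defining equation per constructor:
-- the function is given by cases on the shape of its argument rather
-- than by explicit conditionals or selector functions.
data Tree a = Leaf | Node (Tree a) a (Tree a)

depth :: Tree a -> Int
depth Leaf         = 0
depth (Node l _ r) = 1 + max (depth l) (depth r)
```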
Darlington’s early unifying insight was to show that, with the right notation, computer programs could be treated as mathematical, formally manipulable objects. The advantages of this approach were realised in subsequent research, resulting in innovations including: parallel machine design, notably the ALICE functional graph reduction machine (1985), [8] [9] a forerunner of the commercial ICL Goldrush parallel database machine (1992); co-ordination forms (1996), [10] cf. map/reduce; and market-based service computing, [11] [12] [13] developed in collaboration with Sun Microsystems, cf. cloud computing.
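Co-ordination forms anticipate the skeleton style of parallel programming: a higher-order function fixes the parallel structure, while ordinary functions supply the application-specific computation. A loose Haskell sketch of the idea, using the Strategies library rather than the co-ordination-form notation of [10]:

```haskell
import Control.DeepSeq (NFData)
import Control.Parallel.Strategies (parMap, rdeepseq)

-- A map/reduce-shaped coordination form: the skeleton fixes the
-- parallel structure; op, z and f plug in the application logic.
mapReduce :: NFData b => (b -> b -> b) -> b -> (a -> b) -> [a] -> b
mapReduce op z f = foldr op z . parMap rdeepseq f

-- Example instantiation: a parallel sum of squares.
sumSquares :: [Int] -> Int
sumSquares = mapReduce (+) 0 (^ 2)
```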
Darlington’s research in parallel computing led Fujitsu to found the Imperial College/Fujitsu Parallel Computing Research Centre, opened in 1994 by HRH the Princess Royal. [14] Fujitsu donated a novel 128-processor AP1000 machine, valued at over £1M, and the Centre, with Darlington as Director, operated an open multi-disciplinary parallel application development programme. This ongoing activity was continued under the auspices of the Imperial College Parallel Computing Centre (1996-2002), the London e-Science Centre [15] (2002-2005) and the Imperial College Internet Centre (2005-2008).
The UK e-Science programme was launched in 2001, with the London e-Science Centre (LeSC) as a regional centre. LeSC mounted an extensive collaborative programme with applications in materials modelling, protein folding, whole-earth climate modelling, pollution monitoring, distributed workflow and data management, particle physics, health data informatics and brain imaging. [16]
The Internet Centre, founded in 2005 with seed-corn funding from Imperial College, emphasised the importance of economic and social factors when studying the Internet. It developed collaborations with a range of commercial and public organisations including Vodafone, the BBC, Transport for London, the Royal Bank of Scotland, the RCA and the Science Museum.
Darlington has collaborated with industry in a number of UK Technology Strategy Board, Innovate UK and European projects, applying ideas from functional software and cloud computing to develop a range of innovative applications in media processing, internet cloud services and public health. In this work Darlington developed methodologies combining functional languages with conventional software systems. [17]