History of programming languages

The history of programming languages spans from documentation of early mechanical computers to modern tools for software development. Early programming languages were highly specialized, relying on mathematical notation and similarly obscure syntax. [1] Throughout the 20th century, research in compiler theory led to the creation of high-level programming languages, which use a more accessible syntax to communicate instructions.

The first high-level programming language was Plankalkül, created by Konrad Zuse between 1942 and 1945. [2] The first high-level language to have an associated compiler was created by Corrado Böhm in 1951, for his PhD thesis. [3] The first commercially available language was FORTRAN (FORmula TRANslation), developed beginning in 1954 (its first manual appeared in 1956) by a team led by John Backus at IBM.

Early history

During 1842–1843, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea about Charles Babbage's newest proposed machine, the Analytical Engine; she supplemented the memoir with notes that specified in detail a method for calculating Bernoulli numbers with the engine, recognized by most historians as the world's first published computer program. [4]

Jacquard looms and Charles Babbage's Analytical Engine were both designed to utilize punched cards, [5] [6] which would describe the sequence of operations that these programmable machines should perform.

The first computer codes were specialized for their applications: for example, Alonzo Church was able to express the lambda calculus in a formulaic way, and the Turing machine was an abstraction of the operation of a tape-marking machine.

First programming languages

In the 1940s, the first recognizably modern electrically powered computers were created. The limited speed and memory capacity forced programmers to write hand-tuned assembly language programs. It was eventually realized that programming in assembly language required a great deal of intellectual effort.[citation needed]

An early proposal for a high-level programming language was Plankalkül, developed by Konrad Zuse for his Z1 computer between 1942 and 1945 but not implemented at the time. [7]

The first functioning programming languages designed to communicate instructions to a computer were written in the early 1950s. John Mauchly's Short Code, proposed in 1949, was one of the first high-level languages ever developed for an electronic computer. [8] Unlike machine code, Short Code statements represented mathematical expressions in understandable form. However, the program had to be interpreted into machine code every time it ran, making the process much slower than running the equivalent machine code.

In the early 1950s, Alick Glennie developed Autocode, possibly the first compiled programming language, at the University of Manchester. In 1954, a second iteration of the language, known as the "Mark 1 Autocode", was developed for the Mark 1 by R. A. Brooker. Brooker, with the University of Manchester, also developed an autocode for the Ferranti Mercury in the 1950s. The version for the EDSAC 2 was devised by David Hartley of the University of Cambridge Mathematical Laboratory in 1961. Known as EDSAC 2 Autocode, it was a straight development from Mercury Autocode adapted for local circumstances and was noted for its object code optimization and source-language diagnostics, which were advanced for the time. In a contemporary but separate thread of development, Atlas Autocode was developed for the University of Manchester Atlas 1 machine.

In 1954, FORTRAN was invented at IBM by a team led by John Backus; it was the first widely used high-level general-purpose language to have a functional implementation, in contrast to only a design on paper. [9] [10] When FORTRAN was first introduced, it was viewed with skepticism due to bugs, delays in development, and the comparative efficiency of "hand-coded" programs written in assembly. [11] However, in a hardware market that was rapidly evolving, the language eventually became known for its efficiency. It is still a popular language for high-performance computing [12] and is used for programs that benchmark and rank the world's TOP500 fastest supercomputers. [13]

Another early programming language, FLOW-MATIC, was devised by Grace Hopper in the US. It was developed for the UNIVAC I at Remington Rand during the period from 1955 until 1959. Hopper found that business data processing customers were uncomfortable with mathematical notation, and in early 1955, she and her team wrote a specification for an English-language programming language and implemented a prototype. [14] The FLOW-MATIC compiler became publicly available in early 1958 and was substantially complete in 1959. [15] FLOW-MATIC was a major influence on the design of COBOL, since only it and its direct descendant AIMACO were in use at the time. [16]

Other languages still in use today include LISP (1958), invented by John McCarthy, and COBOL (1959), created by the Short Range Committee. Another milestone in the late 1950s was the publication, by a committee of American and European computer scientists, of "a new language for algorithms": the ALGOL 60 Report (ALGOL standing for "ALGOrithmic Language"). This report consolidated many ideas circulating at the time and featured three key language innovations, among them nested block structure and lexical scoping.

Another innovation, related to this, was in how the language was described: a mathematically exact notation, Backus–Naur form (BNF), was used to describe the language's syntax, and many subsequent programming languages have used a variant of BNF to describe their syntax.

ALGOL 60 was particularly influential in the design of later languages, some of which soon became more popular. The Burroughs large systems were designed to be programmed in an extended subset of ALGOL.

ALGOL's key ideas were continued and extended in its successor, ALGOL 68.

ALGOL 68's many little-used language features (for example, concurrent and parallel blocks) and its complex system of syntactic shortcuts and automatic type coercions made it unpopular with implementers and gained it a reputation for being difficult. Niklaus Wirth actually walked out of the design committee to create the simpler Pascal language.

Logos: Fortran, Lisp, Simula

Some notable languages that were developed in this period include:

Establishing fundamental paradigms

Logos: Scheme, C, Smalltalk

The period from the late 1960s to the late 1970s brought a major flowering of programming languages. Most of the major language paradigms now in use were invented in this period:[original research?]

The 1960s and 1970s also saw considerable debate over the merits of "structured programming", which essentially meant programming without the use of goto. A significant fraction of programmers believed that, even in languages that provide goto, it is bad programming style to use it except in rare circumstances. This debate was closely related to language design: some languages had no goto, which forced the use of structured programming.
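
As a minimal illustrative sketch of the difference (written here in C++, which retains a goto statement; the example is not drawn from any particular historical language), the same counting loop can be expressed with explicit jumps or as a structured loop:

    #include <iostream>

    // Unstructured version: control flow is expressed with goto and labels.
    void count_with_goto() {
        int i = 0;
    loop:
        if (i >= 5) goto done;
        std::cout << i << '\n';
        ++i;
        goto loop;
    done:
        return;
    }

    // Structured version: the same behavior with a while loop and no jumps.
    void count_structured() {
        int i = 0;
        while (i < 5) {
            std::cout << i << '\n';
            ++i;
        }
    }

    int main() {
        count_with_goto();
        count_structured();
        return 0;
    }

Both functions print the numbers 0 through 4; advocates of structured programming argued that the second form makes the flow of control easier to read and verify.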

To provide even faster compile times, some languages were structured for "one-pass compilers" which expect subordinate routines to be defined first, as with Pascal, where the main routine, or driver function, is the final section of the program listing.
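
The ordering requirement can be sketched outside Pascal as well; in the hypothetical C++ fragment below, the helper routine is defined before the code that calls it, so a compiler reading the file in a single top-to-bottom pass has already seen every name it needs, and the driver routine comes last, mirroring the Pascal convention:

    #include <iostream>

    // Defined first, so a single pass over the file has seen it before it is called.
    int square(int x) {
        return x * x;
    }

    // The driver routine appears last, as in a Pascal program listing where the
    // main routine is the final section.
    int main() {
        std::cout << square(7) << '\n';  // prints 49
        return 0;
    }

In C and C++ a separate forward declaration can also satisfy the compiler, but defining routines before their callers keeps the whole listing checkable in one pass.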

Some notable languages that were developed in this period include:

1980s: consolidation, modules, performance

Logos: MATLAB, Erlang, Tcl, C++

The 1980s were years of relative consolidation in imperative languages. Rather than inventing new paradigms, all of these movements elaborated upon the ideas invented in the prior decade. C++ combined object-oriented and systems programming. The United States government standardized Ada, a systems programming language intended for use by defense contractors. In Japan and elsewhere, vast sums were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community moved to standardize ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began to take hold in this decade.

One important new trend in language design was an increased focus on programming for large-scale systems through the use of modules, or large-scale organizational units of code. Modula, Ada, and ML all developed notable module systems in the 1980s. Module systems were often wedded to generic programming constructs: generics being, in essence, parametrized modules[citation needed] (see also Polymorphism (computer science)).
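
As a rough sketch of the idea that generics resemble parametrized modules (illustrated here with a C++ template rather than with the module systems of Modula, Ada, or ML themselves), a single parametrized definition can be instantiated for many element types:

    #include <iostream>
    #include <string>
    #include <vector>

    // A parametrized "module": one definition of a stack, usable for any element type T.
    template <typename T>
    class Stack {
    public:
        void push(const T& value) { items_.push_back(value); }
        T pop() {
            T top = items_.back();
            items_.pop_back();
            return top;
        }
        bool empty() const { return items_.empty(); }
    private:
        std::vector<T> items_;
    };

    int main() {
        Stack<int> numbers;        // instantiation for int
        numbers.push(42);

        Stack<std::string> words;  // instantiation for std::string
        words.push("modular");

        std::cout << numbers.pop() << ' ' << words.pop() << '\n';
        return 0;
    }

Each instantiation behaves like a separate module specialized to one element type, which is the sense in which generics and parametrized modules overlap.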

Although major new paradigms for imperative programming languages did not appear, many researchers expanded on the ideas of prior languages and adapted them to new contexts. For example, the languages of the Argus and Emerald systems adapted object-oriented programming to distributed computing systems.

The 1980s also brought advances in programming language implementation. The reduced instruction set computer (RISC) movement in computer architecture postulated that hardware should be designed for compilers rather than for human assembly programmers. Aided by central processing unit (CPU) speed improvements that enabled increasingly aggressive compiling methods, the RISC movement sparked greater interest in compiler technology for high-level languages.

Language technology continued along these lines well into the 1990s.

Some notable languages that were developed in this period include:

1990s: the Internet age

Logos: Haskell, Lua, PHP, Rebol, Python, Ruby, OCaml

The rapid growth of the Internet in the mid-1990s was the next major historic event in programming languages. By opening up a radically new platform for computer systems, the Internet created an opportunity for new languages to be adopted. In particular, the JavaScript programming language rose to popularity because of its early integration with the Netscape Navigator web browser. Other scripting languages, such as PHP, achieved widespread use in developing customized applications for web servers. The 1990s saw no fundamental novelty in imperative languages, but much recombination and maturation of old ideas. This era also began the spread of functional languages. A big driving philosophy was programmer productivity. Many rapid application development (RAD) languages emerged; they usually came with an integrated development environment (IDE) and garbage collection, and were descendants of older languages. All such languages were object-oriented. These included Object Pascal, Objective Caml (later renamed OCaml), Visual Basic, and Java. Java in particular received much attention.

More radical and innovative than the RAD languages were the new scripting languages. These did not directly descend from other languages and featured new syntaxes and more liberal incorporation of features. Many consider these scripting languages to be more productive than even the RAD languages, but often because of choices that make small programs simpler but large programs more difficult to write and maintain.[citation needed] Nevertheless, scripting languages came to be the most prominent ones used in connection with the Web.

Some programming languages bundled other languages with their distributions to save development time; for example, both Python and Ruby included Tcl/Tk to support GUI programming through bindings such as Python's Tkinter.

Some notable languages that were developed in this period include:

2000s: programming paradigms

Logos: D, Groovy, PowerShell, Scratch, Go, Clojure, Haxe

Programming language evolution continues, and more programming paradigms are used in production.

Some of the trends have included:

Big Tech companies introduced multiple new programming languages designed to serve their needs; for example:

Some notable languages developed during this period include:

2010s: the Mobile age

Logos: Rust, Dart, Swift, Kotlin, TypeScript, C#, Ring, Julia, Zig

Programming language evolution continues with the rise of new programming domains.

Many Big Tech companies continued introducing new programming languages designed to serve their needs and provide first-class support for their platforms; for example:

Some notable languages developed during this period include: [20] [21]

Other new programming languages include Elm, Ballerina, Red, Crystal, V (Vlang), and Reason.

Logos: Power Fx, Carbon

The development of new programming languages continues, and some new languages appear with a focus on providing a replacement for current languages. These new languages try to provide the advantages of a known language like C++ (versatile and fast) while adding safety or reducing complexity. Other new languages try to offer the ease of use provided by Python while making performance a priority. The growth of machine learning and AI tools also plays a big role in these languages' development: some visual languages focus on integrating AI tools, while other textual languages focus on providing more suitable support for developing them. [22] [23] [24]

Some notable new programming languages include:

Key figures

Some innovators: Dennis Ritchie, Niklaus Wirth, Grace M. Hopper, Bjarne Stroustrup, Anders Hejlsberg, Guido van Rossum, Yukihiro Matsumoto, James Gosling, Larry Wall

Some key people who helped develop programming languages:

See also

References

  1. Hopper (1978) p. 16.
  2. Knuth, Donald E.; Pardo, Luis Trabb. "Early development of programming languages". Encyclopedia of Computer Science and Technology. 7. Marcel Dekker: 419–493.
  3. Corrado Böhm's PhD thesis
  4. Fuegi, J.; Francis, J. (October–December 2003), "Lovelace & Babbage and the creation of the 1843 'notes'", Annals of the History of Computing, 25 (4): 16–26, doi:10.1109/MAHC.2003.1253887
  5. Bales, Rebecca (24 July 2023). "Charles Babbage Analytical Engine Explained". history-computer.com.
  6. Swade, Doron. "The Engines". computerhistory.org. Retrieved 23 February 2024.
  7. In 1998 and 2000, compilers were created for the language as a historical exercise. Rojas, Raúl, et al. (2000). "Plankalkül: The First High-Level Programming Language and its Implementation". Institut für Informatik, Freie Universität Berlin, Technical Report B-3/2000. (full text)
  8. Sebesta, W.S. (2006). Concepts of Programming Languages. p. 44. ISBN   978-0-321-33025-3.
  9. "Fortran creator John Backus dies – Tech and gadgets". NBC News. 2007-03-20. Retrieved 2010-04-25.
  10. "CSC-302 99S : Class 02: A Brief History of Programming Languages". Math.grin.edu. Archived from the original on 2010-07-15. Retrieved 2010-04-25.
  11. Padua, David (Feb 2000). "The FORTRAN I Compiler" (PDF). Computing in Science and Engineering. 2 (1): 70–75. Bibcode:2000CSE.....2a..70P. doi:10.1109/5992.814661 . Retrieved 7 November 2019.
  12. Eugene Loh (18 June 2010). "The Ideal HPC Programming Language". Queue. 8 (6). Association for Computing Machinery.
  13. "HPL – A Portable Implementation of the High-Performance Linpack Benchmark for Distributed-Memory Computers" . Retrieved 2015-02-21.
  14. Hopper (1978) p. 16.
  15. Sammet (1969) p. 316
  16. Sammet (1978) p. 204.
  17. Gordon, Michael J. C. (1996). "From LCF to HOL: a short history" (PDF). p. 3. Retrieved 2015-05-04. Edinburgh LCF, including the ML interpreter, was implemented in Lisp.
  18. Manjoo, Farhad (July 29, 2020). "How Do You Know a Human Wrote This?". The New York Times . ISSN   0362-4331 . Retrieved August 4, 2020.
  19. Milmo, Dan (2023-12-06). "Google says new AI model Gemini outperforms ChatGPT in most tests". The Guardian. ISSN   0261-3077 . Retrieved 2024-02-26.
  20. "TIOBE Index, Top 100 programming languages according to TIOBE Index". www.tiobe.com. TIOBE index. 22 February 2024.
  21. "GitHub's Octoverse 2018". Archived from the original on 2019-03-22.
  22. "Introducing the new Copilot features for Power Fx". Microsoft Power Platform Blog. https://www.microsoft.com/en-us/power-platform/blog/power-apps/introducing-the-new-copilot-features-for-power-fx/
  23. "Carbon language aims to be a better C++". InfoWorld. https://www.infoworld.com/article/2336275/carbon-language-aims-to-be-a-better-c-plus-plus.html
  24. "Modular makes a case for Mojo programming language based on Python". DevOps.com. https://devops.com/modular-makes-a-case-for-mojo-programming-language-based-on-python/
  25. Rojas, Raúl; Hashagen, Ulf (2002). The First Computers: History and Architectures. MIT Press. p. 292. ISBN   978-0262681377 . Retrieved October 25, 2013.

Further reading