List of concurrent and parallel programming languages


This article lists concurrent and parallel programming languages, categorizing them by a defining paradigm. Concurrent and parallel programming languages involve multiple, possibly overlapping, timelines of execution, and they provide synchronization constructs whose behavior is defined by a parallel execution model. A concurrent programming language is one that uses the concept of simultaneously executing processes or threads of execution as a means of structuring a program. A parallel language is able to express programs that are executable on more than one processor. Both types are listed here, since concurrency is a useful tool for expressing parallelism, but it is not necessary for it. In both cases, the features must be part of the language syntax and not an extension such as a library: libraries such as the POSIX threads (pthreads) library implement a parallel execution model but lack the syntax and grammar required to be a programming language.
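
For illustration, the following sketch (an assumed example, not taken from this list) shows concurrency expressed in language syntax, namely Go's `go` statement and built-in channel types, rather than through a library such as POSIX threads:

```go
// Minimal sketch (assumed example): in Go, concurrency is part of the
// language grammar (the `go` statement and channel types) rather than
// a library call.
package main

import "fmt"

func main() {
	results := make(chan string, 2) // channel type is built into the language

	go func() { results <- "first worker done" }()  // `go` is a keyword, not a library function
	go func() { results <- "second worker done" }() // a second concurrent unit

	fmt.Println(<-results)
	fmt.Println(<-results)
}
```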

Contents

The following categories aim to capture the main, defining feature of the languages contained, but they are not necessarily orthogonal.

Coordination languages

Dataflow programming

Distributed computing

Event-driven and hardware description

Functional programming

Logic programming

Monitor-based

Multi-threaded

Object-oriented programming

Partitioned global address space (PGAS)

Message passing

Actor model

CSP-based

APIs/frameworks

These application programming interfaces support parallelism in host languages.

See also

Related Research Articles

This is a "genealogy" of programming languages. Languages are categorized under the ancestor language with the strongest influence. Those ancestor languages are listed in alphabetic order. Any such categorization has a large arbitrary element, since programming languages often incorporate major ideas from multiple sources.

A compiled language is a programming language whose implementations are typically compilers rather than interpreters.

Programming languages can be grouped by the number and types of paradigms supported.

History of programming languages

The history of programming languages spans from documentation of early mechanical computers to modern tools for software development. Early programming languages were highly specialized, relying on mathematical notation and similarly obscure syntax. Throughout the 20th century, research in compiler theory led to the creation of high-level programming languages, which use a more accessible syntax to communicate instructions.

Concurrency (computer science)

In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the outcome. This allows for parallel execution of the concurrent units, which can significantly improve the overall speed of execution on multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially-ordered components or units of computation.
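
As a minimal sketch of this idea (an assumed Go example, not part of the original list), the two partial sums below are order-independent units: whichever goroutine finishes first, the combined result is the same.

```go
// Sketch (assumed example): two order-independent units of computation.
// The outcome does not depend on which one completes first.
package main

import "fmt"

func sum(xs []int, out chan<- int) {
	total := 0
	for _, x := range xs {
		total += x
	}
	out <- total
}

func main() {
	xs := []int{1, 2, 3, 4, 5, 6}
	out := make(chan int, 2)

	go sum(xs[:3], out) // first half
	go sum(xs[3:], out) // second half

	fmt.Println(<-out + <-out) // always 21, regardless of completion order
}
```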

In computer science, message passing is a technique for invoking behavior on a computer. The invoking program sends a message to a process and relies on that process and its supporting infrastructure to then select and run some appropriate code. Message passing differs from conventional programming where a process, subroutine, or function is directly invoked by name. Message passing is key to some models of concurrency and object-oriented programming.
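
A minimal sketch of message passing in Go (an assumed example; the `request` type and `server` goroutine are illustrative): the sender posts a message, and the receiving process selects which code to run.

```go
// Sketch (assumed example): instead of calling a function by name, the
// sender posts a message and the receiving goroutine decides what to run.
package main

import "fmt"

type request struct {
	op   string
	a, b int
}

func server(in <-chan request, out chan<- int) {
	for req := range in {
		switch req.op { // the receiver selects the code to run
		case "add":
			out <- req.a + req.b
		case "mul":
			out <- req.a * req.b
		}
	}
}

func main() {
	in, out := make(chan request), make(chan int)
	go server(in, out)

	in <- request{op: "add", a: 2, b: 3}
	fmt.Println(<-out) // 5
	in <- request{op: "mul", a: 2, b: 3}
	fmt.Println(<-out) // 6
}
```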

The actor model in computer science is a mathematical model of concurrent computation that treats an actor as the basic building block of concurrent computation. In response to a message it receives, an actor can: make local decisions, create more actors, send more messages, and determine how to respond to the next message received. Actors may modify their own private state, but can only affect each other indirectly through messaging.
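
A minimal sketch of an actor in Go (an assumed example; the `counterActor` and `msg` names are illustrative): a goroutine owns private state and can be affected only through messages placed in its mailbox channel.

```go
// Sketch (assumed example): an "actor" as a goroutine owning private state,
// reachable only through its mailbox channel.
package main

import "fmt"

type msg struct {
	delta int
	reply chan int // the actor answers by sending a message back
}

func counterActor(mailbox <-chan msg) {
	count := 0 // private state, never shared directly
	for m := range mailbox {
		count += m.delta
		m.reply <- count
	}
}

func main() {
	mailbox := make(chan msg)
	go counterActor(mailbox)

	reply := make(chan int)
	mailbox <- msg{delta: 5, reply: reply}
	fmt.Println(<-reply) // 5
	mailbox <- msg{delta: -2, reply: reply}
	fmt.Println(<-reply) // 3
}
```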

In computer science, future, promise, delay, and deferred refer to constructs used for synchronizing program execution in some concurrent programming languages. They describe an object that acts as a proxy for a result that is initially unknown, usually because the computation of its value is not yet complete.
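
A minimal sketch of a future in Go (an assumed example; Go has no built-in future type, and `slowSquare` is a hypothetical helper): the returned channel acts as a proxy for a result that is still being computed.

```go
// Sketch (assumed example): a one-element channel acting as a future,
// a proxy for a result whose computation may not have finished yet.
package main

import (
	"fmt"
	"time"
)

func slowSquare(n int) <-chan int {
	future := make(chan int, 1)
	go func() {
		time.Sleep(100 * time.Millisecond) // stand-in for a long computation
		future <- n * n
	}()
	return future
}

func main() {
	f := slowSquare(7) // returns immediately
	fmt.Println("doing other work...")
	fmt.Println("result:", <-f) // blocks only when the value is needed
}
```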

In computing, a parallel programming model is an abstraction of parallel computer architecture, with which it is convenient to express algorithms and their composition in programs. The value of a programming model can be judged on its generality (how well a range of different problems can be expressed for a variety of different architectures) and on its performance (how efficiently the compiled programs can execute). The implementation of a parallel programming model can take the form of a library invoked from a sequential language, an extension to an existing language, or an entirely new language.

Concurrent computing is a form of computing in which several computations are executed concurrently—during overlapping time periods—instead of sequentially—with one completing before the next starts.

Programming languages are used for controlling the behavior of a machine. Like natural languages, programming languages follow rules for syntax and semantics.

In computer programming, a green thread is a thread that is scheduled by a runtime library or virtual machine (VM) instead of natively by the underlying operating system (OS). Green threads emulate multithreaded environments without relying on any native OS abilities, and they are managed in user space instead of kernel space, enabling them to work in environments that do not have native thread support.
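
Goroutines, which the Go runtime schedules onto a small pool of OS threads, are a commonly cited example of runtime-scheduled concurrency in the spirit of green threads; the sketch below (an assumed example) launches far more goroutines than one would normally create as native threads.

```go
// Sketch (assumed example): goroutines are scheduled by the Go runtime
// rather than one-to-one by the OS, so launching many of them is cheap.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	var mu sync.Mutex
	total := 0

	for i := 1; i <= 100000; i++ { // far more tasks than typical OS threads
		wg.Add(1)
		go func(n int) {
			defer wg.Done()
			mu.Lock()
			total += n
			mu.Unlock()
		}(i)
	}

	wg.Wait()
	fmt.Println(total) // 5000050000
}
```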

This comparison of programming languages compares the features of language syntax (format) for over 50 computer programming languages.

Task parallelism is a form of parallelization of computer code across multiple processors in parallel computing environments. Task parallelism focuses on distributing tasks—concurrently performed by processes or threads—across different processors. In contrast to data parallelism which involves running the same task on different components of data, task parallelism is distinguished by running many different tasks at the same time on the same data. A common type of task parallelism is pipelining, which consists of moving a single set of data through a series of separate tasks where each task can execute independently of the others.
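
A minimal sketch of pipelining in Go (an assumed example with illustrative `generate` and `square` stages): each stage is a distinct task, and the stages run concurrently on the stream of data.

```go
// Sketch (assumed example): a two-stage pipeline in which each stage is a
// separate task connected to the next by a channel.
package main

import "fmt"

func generate(n int) <-chan int { // stage 1: produce values
	out := make(chan int)
	go func() {
		for i := 1; i <= n; i++ {
			out <- i
		}
		close(out)
	}()
	return out
}

func square(in <-chan int) <-chan int { // stage 2: transform values
	out := make(chan int)
	go func() {
		for v := range in {
			out <- v * v
		}
		close(out)
	}()
	return out
}

func main() {
	for v := range square(generate(5)) {
		fmt.Println(v) // 1 4 9 16 25
	}
}
```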

Parallel Extensions

Parallel Extensions was the development name for a managed concurrency library developed by a collaboration between Microsoft Research and the CLR team at Microsoft. The library was released in version 4.0 of the .NET Framework. It is composed of two parts: Parallel LINQ (PLINQ) and the Task Parallel Library (TPL). It also includes a set of coordination data structures (CDS), data structures used to synchronize and coordinate the execution of concurrent tasks.
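
PLINQ and the TPL are specific to .NET; as a rough, language-neutral sketch of the underlying idea (an assumed Go example, not the .NET API), the hypothetical `parallelMap` helper below coordinates a data-parallel operation over a collection and waits for all tasks to complete.

```go
// Rough sketch (assumed example, not the .NET API): the core idea of a
// data-parallel operation, expressed with plain goroutines and a WaitGroup.
package main

import (
	"fmt"
	"sync"
)

func parallelMap(xs []int, f func(int) int) []int {
	out := make([]int, len(xs))
	var wg sync.WaitGroup
	for i, x := range xs {
		wg.Add(1)
		go func(i, x int) {
			defer wg.Done()
			out[i] = f(x) // each task writes only its own slot
		}(i, x)
	}
	wg.Wait()
	return out
}

func main() {
	fmt.Println(parallelMap([]int{1, 2, 3, 4}, func(x int) int { return x * 10 }))
	// [10 20 30 40]
}
```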

Join-patterns provide a way to write concurrent, parallel and distributed computer programs by message passing. Compared with the use of threads and locks, this is a high-level programming model that uses communication constructs to abstract away the complexity of a concurrent environment and to allow scalability. Its focus is on the execution of a chord between messages atomically consumed from a group of channels.
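
A rough sketch of the idea in Go (an assumed example; a real join-pattern implementation consumes the messages from all channels atomically, which this approximation does not): the chord body runs only once one message has been taken from each of the two channels.

```go
// Rough sketch (assumed example): a chord that fires only after it has taken
// one message from each of its two channels.
package main

import "fmt"

func join(a <-chan int, b <-chan string, body func(int, string)) {
	go func() {
		for {
			x := <-a // wait for one message on each channel
			y := <-b
			body(x, y) // run the chord body with both messages
		}
	}()
}

func main() {
	nums, labels := make(chan int), make(chan string)
	done := make(chan struct{})

	join(nums, labels, func(n int, s string) {
		fmt.Println(s, n)
		done <- struct{}{}
	})

	nums <- 42
	labels <- "answer:"
	<-done // prints "answer: 42"
}
```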

Akka (toolkit)

Akka is a source-available toolkit and runtime simplifying the construction of concurrent and distributed applications on the JVM. Akka supports multiple programming models for concurrency, but it emphasizes actor-based concurrency, with inspiration drawn from Erlang.
