Synchronous programming language

A synchronous programming language is a computer programming language optimized for programming reactive systems.

Computer systems can be sorted in three main classes:

  1. Transformational systems take some inputs, process them, deliver their outputs, and terminate their execution. A typical example is a compiler.
  2. Interactive systems interact continuously with their environment, at their own speed. A typical example is the web.
  3. Reactive systems interact continuously with their environment, at a speed imposed by the environment. A typical example is the automatic flight control system of modern airplanes. Reactive systems must therefore react to stimuli from the environment within strict time bounds. For this reason they are often also called real-time systems, and are often found in embedded systems.

Synchronous programming, also called synchronous reactive programming (SRP), is a computer programming paradigm supported by synchronous programming languages. The principle of SRP is to make the same abstraction for programming languages as the synchronous abstraction in digital circuits. Synchronous circuits are indeed designed at a high level of abstraction where the timing characteristics of the electronic transistors are neglected. Each gate of the circuit (or, and, ...) is therefore assumed to compute its result instantaneously, and each wire is assumed to transmit its signal instantaneously. A synchronous circuit is clocked: at each tick of its clock, it instantaneously computes its output values and the new values of its memory cells (latches) from its input values and the current values of its memory cells. In other words, the circuit behaves as if the electrons were flowing infinitely fast. The first synchronous programming languages were invented in France in the 1980s: Esterel, Lustre, and SIGNAL. Since then, many other synchronous languages have emerged.
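
To make the clocked-circuit abstraction concrete, the following is a minimal sketch in ordinary Python rather than in any synchronous language; the step function, the rising-edge detector, and all names are illustrative assumptions, not part of any particular system. At each tick, the outputs and the next values of the memory cells are computed as a pure function of the inputs and the current memory values.

```python
# Minimal sketch (plain Python, not a synchronous language) of the clocked-circuit
# abstraction: at each tick, outputs and next memory (latch) values are computed
# "instantaneously" from the inputs and the current memory values.

def step(inputs, latches):
    """One logical tick of a toy circuit: a rising-edge detector.

    inputs  : {'signal': bool}    current input values
    latches : {'previous': bool}  memory cells (their values at this tick)
    returns : (outputs, next_latches)
    """
    rising = inputs['signal'] and not latches['previous']   # combinational logic
    outputs = {'rising_edge': rising}
    next_latches = {'previous': inputs['signal']}           # value latched for next tick
    return outputs, next_latches

# Driving the circuit: the clock is the iteration over the input trace.
trace = [False, True, True, False, True]
latches = {'previous': False}
for tick, value in enumerate(trace):
    outputs, latches = step({'signal': value}, latches)
    print(tick, outputs)
```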

The synchronous abstraction makes reasoning about time in a synchronous program much easier, thanks to the notion of logical ticks: a synchronous program reacts to its environment in a sequence of ticks, and computations within a tick are assumed to be instantaneous, i.e., as if the processor executing them were infinitely fast. The statement "a||b" is therefore abstracted as the package "ab", where "a" and "b" are simultaneous. To take a concrete example, the Esterel statement "every 60 second emit minute" specifies that the signal "minute" is exactly synchronous with the 60th occurrence of the signal "second". At a more fundamental level, the synchronous abstraction eliminates the non-determinism resulting from the interleaving of concurrent behaviors. This allows deterministic semantics, making synchronous programs amenable to formal analysis, verification, and certified code generation, and usable as formal specification formalisms.
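
As a rough illustration of the logical-tick view, the following Python sketch mimics the behavior described by the Esterel statement above: "minute" is present in exactly the tick that carries the 60th occurrence of "second". The helper make_minute_counter and its structure are illustrative assumptions; only the intended behavior comes from the text.

```python
# Illustrative sketch of "every 60 second emit minute" in terms of logical ticks:
# the program reacts once per tick; "minute" is emitted in the same tick as the
# 60th occurrence of "second", i.e. the two signals are simultaneous in that tick.

def make_minute_counter():
    count = 0
    def react(second_present: bool) -> bool:
        nonlocal count
        minute_present = False
        if second_present:
            count += 1
            if count == 60:
                minute_present = True   # simultaneous with the 60th "second"
                count = 0
        return minute_present
    return react

react = make_minute_counter()
for tick in range(1, 121):
    if react(second_present=True):
        print(f"tick {tick}: minute emitted")   # fires at ticks 60 and 120
```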

In contrast, in the asynchronous model of computation, on a sequential processor, the statement "a||b" can be implemented either as "a;b" or as "b;a". This is known as interleaving-based non-determinism. The drawback of the asynchronous model is that it intrinsically forbids deterministic semantics (e.g., it admits race conditions), which makes formal reasoning such as analysis and verification more complex. Nonetheless, asynchronous formalisms are very useful for modeling, designing, and verifying distributed systems, because such systems are intrinsically asynchronous.
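
The following Python sketch illustrates interleaving-based non-determinism under the stated assumption that "a" increments a shared variable and "b" doubles it: the two sequential orders of the same pair of statements produce different final states.

```python
# Sketch of interleaving-based non-determinism: running the same two statements
# "a" and "b" in the two possible sequential orders yields different final states,
# so an asynchronous "a || b" has no single deterministic outcome.

def a(state): state['x'] = state['x'] + 1      # statement a: increment
def b(state): state['x'] = state['x'] * 2      # statement b: double

for order, schedule in (("a;b", (a, b)), ("b;a", (b, a))):
    state = {'x': 1}
    for stmt in schedule:
        stmt(state)
    print(order, "->", state['x'])   # a;b -> 4, b;a -> 3
```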

Also in contrast are systems whose processes interact synchronously by rendezvous. An example would be systems based on the communicating sequential processes (CSP) model, which allows both deterministic (external) and nondeterministic (internal) choice.

Synchronous languages

Synchronous programming languages include Esterel, Lustre, and SIGNAL, the earliest such languages, as well as later languages such as Averest.

Related Research Articles

In digital logic and computing, a counter is a device which stores the number of times a particular event or process has occurred, often in relationship to a clock. The most common type is a sequential digital logic circuit with an input line called the clock and multiple output lines. The values on the output lines represent a number in the binary or BCD number system. Each pulse applied to the clock input increments or decrements the number in the counter.
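
As a sketch of this behavior, the following Python function models one clock pulse of a binary up-counter; the 4-bit width and the function name are illustrative choices.

```python
# Illustrative sketch of a binary up-counter as a clocked step function:
# each clock pulse increments the stored count, wrapping at 2**width.

def counter_step(count: int, width: int = 4) -> int:
    """One clock pulse: return the next counter value (wraps at 2**width)."""
    return (count + 1) % (2 ** width)

count = 0
for pulse in range(20):
    count = counter_step(count)
print(count, format(count, '04b'))   # 20 pulses of a 4-bit counter: 20 mod 16 = 4, '0100'
```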

In computer science, denotational semantics is an approach of formalizing the meanings of programming languages by constructing mathematical objects that describe the meanings of expressions from the languages. Other approaches providing formal semantics of programming languages include axiomatic semantics and operational semantics.

In computer engineering, a hardware description language (HDL) is a specialized computer language used to describe the structure and behavior of electronic circuits, most commonly to design ASICs and program FPGAs.

In computer science, communicating sequential processes (CSP) is a formal language for describing patterns of interaction in concurrent systems. It is a member of the family of mathematical theories of concurrency known as process algebras, or process calculi, based on message passing via channels. CSP was highly influential in the design of the occam programming language and also influenced the design of programming languages such as Limbo, RaftLib, Erlang, Go, Crystal, and Clojure's core.async.

In computer science, message queues and mailboxes are software-engineering components typically used for inter-process communication (IPC), or for inter-thread communication within the same process. They use a queue for messaging – the passing of control or of content. Group communication systems provide similar kinds of functionality.
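
A minimal sketch of inter-thread communication through a message queue, using only the Python standard library; the producer/consumer roles and the sentinel value are illustrative assumptions rather than a fixed API.

```python
# Sketch of inter-thread communication through a message queue.
import queue
import threading

mailbox: "queue.Queue[object]" = queue.Queue()
DONE = object()   # sentinel marking the end of the message stream

def producer():
    for i in range(3):
        mailbox.put(f"message {i}")
    mailbox.put(DONE)

def consumer():
    while True:
        msg = mailbox.get()
        if msg is DONE:
            break
        print("received:", msg)

t1, t2 = threading.Thread(target=producer), threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```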

In computer science, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order, without affecting the outcome. This allows for parallel execution of the concurrent units, which can significantly improve overall speed of the execution in multi-processor and multi-core systems. In more technical terms, concurrency refers to the decomposability of a program, algorithm, or problem into order-independent or partially-ordered components or units of computation.

Esterel is a synchronous programming language for the development of complex reactive systems. The imperative programming style of Esterel allows the simple expression of parallelism and preemption. As a consequence, it is well suited for control-dominated model designs.

In computer science, message passing is a technique for invoking behavior on a computer. The invoking program sends a message to a process and relies on that process and its supporting infrastructure to then select and run some appropriate code. Message passing differs from conventional programming where a process, subroutine, or function is directly invoked by name. Message passing is key to some models of concurrency and object-oriented programming.

An asynchronous circuit is a sequential digital logic circuit that does not use a global clock circuit or signal generator to synchronize its components. Instead, the components are driven by a handshaking circuit which indicates the completion of a set of instructions. Handshaking works by simple data transfer protocols. Many synchronous circuits were developed in the early 1950s as part of bigger asynchronous systems. Asynchronous circuits and the theory surrounding them are part of several steps in integrated circuit design, a field of digital electronics engineering.

Lustre is a formally defined, declarative, and synchronous dataflow programming language for programming reactive systems. It began as a research project in the early 1980s. A formal presentation of the language can be found in the 1991 Proceedings of the IEEE. In 1993 it progressed to practical, industrial use in a commercial product as the core language of the industrial environment SCADE, developed by Esterel Technologies. It is now used for critical control software in aircraft, helicopters, and nuclear power plants.

Averest is a synchronous programming language and set of tools to specify, verify, and implement reactive systems. It includes a compiler for synchronous programs, a symbolic model checker, and a tool for hardware/software synthesis.

Concurrent computing is a form of computing in which several computations are executed concurrently—during overlapping time periods—instead of sequentially—with one completing before the next starts.

The actor model and process calculi share an interesting history and co-evolution.

Functional reactive programming (FRP) is a programming paradigm for reactive programming using the building blocks of functional programming. FRP has been used for programming graphical user interfaces (GUIs), robotics, games, and music, aiming to simplify these problems by explicitly modeling time.
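
As a toy illustration of modeling time explicitly, the following Python sketch represents a behavior as a function from time to value and builds new behaviors pointwise; the names time_b and lift are invented for this example and do not come from any FRP library.

```python
# Toy sketch of the FRP idea of a behavior as a time-varying value: a "behavior"
# is modeled as a function from time to value, and combinators build new
# behaviors from old ones.
import math

def time_b(t):               # the identity behavior: the current time itself
    return t

def lift(f, *behaviors):     # apply an ordinary function pointwise over time
    return lambda t: f(*(b(t) for b in behaviors))

wave = lift(math.sin, time_b)            # a time-varying sine value
doubled = lift(lambda x: 2 * x, wave)    # behavior derived from another behavior

for t in (0.0, 0.5, 1.0):
    print(t, round(doubled(t), 3))
```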

PROMELA is a verification modeling language introduced by Gerard J. Holzmann. The language allows for the dynamic creation of concurrent processes to model, for example, distributed systems. In PROMELA models, communication via message channels can be defined to be synchronous, or asynchronous. PROMELA models can be analyzed with the SPIN model checker, to verify that the modeled system produces the desired behavior. An implementation verified with Isabelle/HOL is also available, as part of the Computer Aided Verification of Automata (CAVA) project. Files written in Promela traditionally have a .pml file extension.

In computing, reactive programming is a declarative programming paradigm concerned with data streams and the propagation of change. With this paradigm, it is possible to express static or dynamic data streams with ease, and also to communicate that an inferred dependency exists within the associated execution model, which facilitates the automatic propagation of the changed data flow.
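
A minimal sketch of this propagation of change, assuming a toy spreadsheet-like Cell class (not the API of any reactive library): when a source cell changes, cells declared as depending on it are recomputed automatically.

```python
# Toy sketch of change propagation: updating a source cell automatically
# recomputes the cells that were declared as depending on it.

class Cell:
    def __init__(self, value=None):
        self._value = value
        self._dependents = []        # (cell, recompute-function) pairs

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value
        for cell, recompute in self._dependents:
            cell.value = recompute() # propagate the change downstream

    def derive(self, func):
        derived = Cell(func())
        self._dependents.append((derived, func))
        return derived

a = Cell(2)
b = a.derive(lambda: a.value + 10)   # b is declared as a + 10
print(b.value)   # 12
a.value = 5
print(b.value)   # 15, updated automatically
```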

Reo is a domain-specific language for programming and analyzing coordination protocols that compose individual processes into full systems, broadly construed. Examples of classes of systems that can be composed with Reo include component-based systems, service-oriented systems, multithreading systems, biological systems, and cryptographic protocols. Reo has a graphical syntax in which every Reo program, called a connector or circuit, is a labeled directed hypergraph. Such a graph represents the data-flow among the processes in the system. Reo has formal semantics, which stand at the basis of its various formal verification techniques and compilation tools.

Globally asynchronous locally synchronous (GALS), in electronics, is an architecture for designing electronic circuits that addresses the problem of safe and reliable data transfer between independent clock domains. GALS is a model of computation that emerged in the 1980s. It allows the design of computer systems consisting of several synchronous islands that interact with other islands using asynchronous communication, e.g., with FIFOs.
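
The following Python sketch illustrates the GALS idea under simplifying assumptions: two synchronous islands, each advanced by its own tick loop, exchange data only through an asynchronous FIFO; the island functions and clock rates are invented for the example.

```python
# Sketch of GALS: two synchronous "islands", each driven by its own tick loop,
# communicate only through an asynchronous FIFO.
from collections import deque

fifo = deque()                      # asynchronous channel between the islands

def producer_island_tick(t):
    fifo.append(t * t)              # island A emits one value per tick of its clock

def consumer_island_tick():
    if fifo:                        # island B consumes only when data is available
        print("consumed:", fifo.popleft())

# Simulate interleaved clocks: island A ticks twice as fast as island B.
for t in range(6):
    producer_island_tick(t)
    if t % 2 == 0:
        consumer_island_tick()
```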

SIGNAL is a programming language based on synchronized data-flow: a process is a set of equations on elementary flows describing both data and control.

Join-patterns provide a way to write concurrent, parallel, and distributed computer programs by message passing. Compared to the use of threads and locks, this is a high-level programming model that uses communication constructs to abstract away the complexity of the concurrent environment and to allow scalability. Its focus is on the execution of a chord between messages atomically consumed from a group of channels.
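
A toy Python sketch of a chord, assuming a hypothetical Join class: a handler fires only when a message is available on every channel in its group, and one message from each channel is consumed together.

```python
# Toy model of a join pattern (a "chord"): the handler fires only when every
# channel in the group holds a message, and those messages are consumed atomically.
from collections import deque

class Join:
    def __init__(self, channel_names, handler):
        self.channels = {name: deque() for name in channel_names}
        self.handler = handler

    def send(self, channel, message):
        self.channels[channel].append(message)
        self._try_fire()

    def _try_fire(self):
        if all(self.channels[name] for name in self.channels):
            # Consume one message from each channel of the chord together.
            args = {name: self.channels[name].popleft() for name in self.channels}
            self.handler(**args)

join = Join(["credit", "debit"], lambda credit, debit: print("transfer:", credit, debit))
join.send("credit", 100)       # nothing fires yet: "debit" is empty
join.send("debit", 100)        # chord complete, handler fires
```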
