Delay-insensitive circuit


A delay-insensitive circuit is a type of asynchronous circuit that performs a digital logic operation, often within a computing processor chip. Instead of using clock signals or other global control signals, the sequencing of computation in a delay-insensitive circuit is determined by the data flow.

An asynchronous circuit, or self-timed circuit, is a sequential digital logic circuit which is not governed by a clock circuit or global clock signal. Instead it often uses signals that indicate completion of instructions and operations, specified by simple data transfer protocols. This type of circuit is contrasted with synchronous circuits, in which changes to the signal values in the circuit are triggered by repetitive pulses called a clock signal. Most digital devices today use synchronous circuits. However, asynchronous circuits have the potential to be faster, and may also have advantages in lower power consumption, lower electromagnetic interference, and better modularity in large systems. Asynchronous circuits are an active area of research in digital logic design.

In electronics and especially synchronous digital circuits, a clock signal is a particular type of signal that oscillates between a high and a low state and is used like a metronome to coordinate actions of digital circuits.


Data flows from one circuit element to another using "handshakes", or sequences of voltage transitions to indicate readiness to receive data, or readiness to offer data. Typically, inputs of a circuit module will indicate their readiness to receive, which will be "acknowledged" by the connected output by sending data (encoded in such a way that the receiver can detect the validity directly [1] ), and once that data has been safely received, the receiver will explicitly acknowledge it, allowing the sender to remove the data, thus completing the handshake, and allowing another datum to be transmitted.
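The four-phase handshake described above can be sketched in software. The following Python model is purely illustrative; the Channel class and the function names are invented for this sketch, and data validity itself plays the role of the request signal, as in a dual-rail encoding.

# Illustrative model of a four-phase (return-to-zero) handshake for a single
# dual-rail bit. All names here are invented for this sketch; a real
# delay-insensitive circuit implements the same sequence in hardware.

class Channel:
    def __init__(self):
        self.rail0 = 0   # asserted when the transmitted value is 0
        self.rail1 = 0   # asserted when the transmitted value is 1
        self.ack = 0     # driven by the receiver

def send(ch, bit):
    # Phase 1: the sender asserts exactly one data rail;
    # data validity itself acts as the "request".
    if bit:
        ch.rail1 = 1
    else:
        ch.rail0 = 1

def receive(ch):
    # Phase 2: the receiver detects valid data and raises the acknowledge.
    assert ch.rail0 != ch.rail1, "exactly one rail must be asserted"
    ch.ack = 1
    return 1 if ch.rail1 else 0

def withdraw_data(ch):
    # Phase 3: having seen ack high, the sender returns both rails to zero
    # (the "spacer", or empty word).
    ch.rail0 = ch.rail1 = 0

def withdraw_ack(ch):
    # Phase 4: having seen the spacer, the receiver lowers ack;
    # the next datum may now be sent.
    ch.ack = 0

ch = Channel()
send(ch, 1)
print(receive(ch))   # -> 1
withdraw_data(ch)
withdraw_ack(ch)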

In a delay-insensitive circuit, there is therefore no need to provide a clock signal to determine a starting time for a computation. Instead, the arrival of data at the input of a sub-circuit triggers the computation to start. Consequently, the next computation can be initiated as soon as the result of the first computation is available.

The main advantage of such circuits is their ability to optimize processing of activities that can take arbitrary periods of time depending on the data or the requested function. Examples of operations with variable completion times include mathematical division and retrieval of data that may or may not be present in a cache.

Division (mathematics)

Division is one of the four basic operations of arithmetic, the others being addition, subtraction, and multiplication. The division of two natural numbers is the process of calculating the number of times one number is contained within the other. The mathematical symbols used for the division operator are the obelus (÷) and the slash (/).

Cache (computing)

In computing, a cache is a hardware or software component that stores data so that future requests for that data can be served faster; the data stored in a cache might be the result of an earlier computation or a copy of data stored elsewhere. A cache hit occurs when the requested data can be found in a cache, while a cache miss occurs when it cannot. Cache hits are served by reading data from the cache, which is faster than recomputing a result or reading from a slower data store; thus, the more requests that can be served from the cache, the faster the system performs.

The delay-insensitive (DI) class is the most robust of all asynchronous circuit delay models. It makes no assumptions about the delay of wires or gates. In this model, every transition on a gate or wire must be acknowledged before it may transition again; this condition prevents unobserved transitions. In DI circuits, any transition on an input to a gate must be seen on the output of the gate before a subsequent transition on that input is allowed to happen. This forces some input states or sequences to become illegal. For example, an OR gate must never enter the state in which both inputs are one, as the entry into and exit from this state will not be visible on the output of the gate. Although this model is very robust, no practical circuits are possible due to the lack of expressible conditionals in DI circuits. [2] Instead, the quasi-delay-insensitive model makes the smallest compromise while still being capable of generating useful computing circuits. For this reason, circuits are often incorrectly referred to as delay-insensitive when they are in fact quasi-delay-insensitive.
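The illegal OR-gate state can be illustrated with a small hypothetical snippet (not taken from the cited sources): when both inputs are one, a transition on either input produces no transition on the output, so it can never be acknowledged.

# Hypothetical illustration of an unacknowledged transition on an OR gate.
def or_gate(a, b):
    return a | b

print(or_gate(1, 1))   # 1 -> both inputs high
print(or_gate(0, 1))   # 1 -> input 'a' fell, but the output did not change,
                       #      so that transition can never be acknowledged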

See also

Related Research Articles

In digital logic and computing, a counter is a device which stores the number of times a particular event or process has occurred, often in relationship to a clock signal. The most common type is a sequential digital logic circuit with an input line called the clock and multiple output lines. The values on the output lines represent a number in the binary or BCD number system. Each pulse applied to the clock input increments or decrements the number in the counter.

Digital electronics

Digital electronics or digital (electronic) circuits are electronics that operate on digital signals. In contrast, analog circuits manipulate analog signals whose performance is more subject to manufacturing tolerance, signal attenuation and noise. Digital techniques are helpful because it is a lot easier to get an electronic device to switch into one of a number of known states than to accurately reproduce a continuous range of values.

In digital circuit theory, sequential logic is a type of logic circuit whose output depends not only on the present value of its input signals but on the sequence of past inputs, the input history as well. This is in contrast to combinational logic, whose output is a function of only the present input. That is, sequential logic has state (memory) while combinational logic does not.

In digital electronics, the fan-out of a logic gate output is the number of gate inputs it can drive.

In telecommunications and electronics, a self-clocking signal is one that can be decoded without the need for a separate clock signal or other source of synchronization. This is usually done by including embedded synchronization information within the signal, and adding constraints on the coding of the data payload such that false synchronization can easily be detected.

Automatic test pattern generation (ATPG) is an electronic design automation method used to find an input sequence that, when applied to a digital circuit, enables automatic test equipment to distinguish between the correct circuit behavior and the faulty circuit behavior caused by defects. The generated patterns are used to test semiconductor devices after manufacture, or to assist with determining the cause of failure. The effectiveness of ATPG is measured by the number of modeled defects, or fault models, detectable and by the number of generated patterns. These metrics generally indicate test quality and test application time. ATPG efficiency is another important consideration that is influenced by the fault model under consideration, the type of circuit under test, the level of abstraction used to represent the circuit under test, and the required test quality.

Synchronous and asynchronous transmissions are two different methods of transmission synchronization. Synchronous transmissions are synchronized by an external clock, while asynchronous transmissions are synchronized by special signals along the transmission medium.

Propagation delay is the length of time taken for the quantity of interest to reach its destination. It can relate to networking, electronics or physics.

In data communications, flow control is the process of managing the rate of data transmission between two nodes to prevent a fast sender from overwhelming a slow receiver. It provides a mechanism for the receiver to control the transmission speed, so that the receiving node is not overwhelmed with data from the transmitting node. Flow control should be distinguished from congestion control, which is used for controlling the flow of data when congestion has actually occurred. Flow control mechanisms can be classified by whether or not the receiving node sends feedback to the sending node.

A synchronous circuit is a digital circuit in which the changes in the state of memory elements are synchronized by a clock signal. In a sequential digital logic circuit, data is stored in memory devices called flip-flops or latches. The output of a flip-flop is constant until a pulse is applied to its "clock" input, upon which the input of the flip-flop is latched into its output. In a synchronous logic circuit, an electronic oscillator called the clock generates a string of pulses, the "clock signal". This clock signal is applied to every storage element, so in an ideal synchronous circuit, every change in the logical levels of its storage components is simultaneous. Ideally, the input to each storage element has reached its final value before the next clock pulse occurs, so the behaviour of the whole circuit can be predicted exactly. Practically, some delay is required for each logical operation, resulting in a maximum speed at which each synchronous system can run.

Delay-insensitive minterm synthesis

The DIMS system is an asynchronous design methodology making the fewest possible timing assumptions. Assuming only the quasi-delay-insensitive delay model, the generated designs need little, if any, timing-hazard testing. The basis for DIMS is the use of two wires to represent each bit of data, known as a dual-rail data encoding. Parts of the system communicate using the early four-phase asynchronous protocol.
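As a rough sketch of the dual-rail convention that DIMS relies on (the encoding itself is the standard one; the function names here are illustrative): each bit travels on two wires, where (0, 0) is the empty "spacer" between data items, (1, 0) encodes 0, (0, 1) encodes 1, and (1, 1) is illegal. Completion detection then amounts to checking that every bit of a word has become valid.

# Illustrative dual-rail encoding: one bit -> two wires (rail0, rail1).
SPACER = (0, 0)            # empty word separating successive data items

def encode(bit):
    return (0, 1) if bit else (1, 0)

def decode(rails):
    assert rails != (1, 1), "(1, 1) is an illegal codeword"
    assert rails != SPACER, "spacer carries no data yet"
    return 1 if rails[1] else 0

def word_complete(word):
    # Completion detection: every bit of the word holds valid (non-spacer) data.
    return all(rails != SPACER and rails != (1, 1) for rails in word)

word = [encode(b) for b in (1, 0, 1)]
print(word_complete(word), [decode(rails) for rails in word])   # True [1, 0, 1]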

A synchronous programming language is a computer programming language optimized for programming reactive systems. Computer systems can be sorted in three main classes: (1) transformational systems that take some inputs, process them, deliver their outputs, and terminate their execution; a typical example is a compiler; (2) interactive systems that interact continuously with their environment, at their own speed; a typical example is a web browser; and (3) reactive systems that interact continuously with their environment, at a speed imposed by the environment; a typical example is the automatic flight control system of modern airplanes. Reactive systems must therefore react to stimuli from the environment within strict time bounds. For this reason they are often also called real-time systems, and are found often in embedded systems.

C-element

The Muller C-element is a small digital block widely used in the design of asynchronous circuits and systems. It was formally specified in 1955 by David E. Muller and first used in the ILLIAC II computer. In terms of the theory of lattices, the C-element is a semimodular distributive circuit, whose operation in time is described by a Hasse diagram. The C-element is closely related to the rendezvous and join elements, where an input is not allowed to change twice in succession. In some cases, when relations between delays are known, the C-element can be realized as a sum-of-products (SOP) circuit. Earlier techniques for implementing the C-element include the Schmitt trigger, the Eccles-Jordan flip-flop and the last moving point flip-flop.
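A behavioral model of the two-input C-element may make this concrete. The following is a software sketch, not a gate-level implementation: the output follows the inputs when they agree and holds its previous value when they differ.

# Behavioral model of a two-input Muller C-element.
class CElement:
    def __init__(self, initial=0):
        self.output = initial

    def update(self, a, b):
        if a == b:              # both inputs agree: output follows them
            self.output = a
        return self.output      # inputs disagree: previous value is held

c = CElement()
print(c.update(1, 0))  # 0  (inputs disagree, output holds its initial value)
print(c.update(1, 1))  # 1  (both high, output rises)
print(c.update(0, 1))  # 1  (inputs disagree, output holds)
print(c.update(0, 0))  # 0  (both low, output falls)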

Metastability (electronics)

Metastability in electronics is the ability of a digital electronics system to persist for an unbounded time in an unstable equilibrium or metastable state. In digital logic circuits, a digital signal is required to be within certain voltage or current limits to represent a '0' or '1' logic level for correct circuit operation; if the signal is within a forbidden intermediate range it may cause faulty behavior in logic gates the signal is applied to. In metastable states, the circuit may be unable to settle into a stable '0' or '1' logic level within the time required for proper circuit operation. As a result, the circuit can act in unpredictable ways, and may lead to a system failure, sometimes referred to as a "glitch". Metastability is an instance of Buridan's paradox.

Static timing analysis (STA) is a simulation method of computing the expected timing of a digital circuit without requiring a simulation of the full circuit.

In digital logic design, an asynchronous circuit is quasi delay-insensitive (QDI) when it operates correctly independent of gate and wire delays, with the weakest exception necessary to be Turing-complete.

The primary focus of this article is asynchronous control in digital electronic systems. In a synchronous system, operations are coordinated by one, or more, centralized clock signals. An asynchronous digital system, in contrast, has no global clock. Asynchronous systems do not depend on strict arrival times of signals or messages for reliable operation. Coordination is achieved via events such as: packet arrival, changes (transitions) of signals, handshake protocols, and other methods.

Quantum dot cellular automata are a proposed improvement on conventional computer design (CMOS), which have been devised in analogy to conventional models of cellular automata introduced by John von Neumann.

Flip-flop (electronics)

In electronics, a flip-flop or latch is a circuit that has two stable states and can be used to store state information. A flip-flop is a bistable multivibrator. The circuit can be made to change state by signals applied to one or more control inputs and will have one or two outputs. It is the basic storage element in sequential logic. Flip-flops and latches are fundamental building blocks of digital electronics systems used in computers, communications, and many other types of systems.

References