Reversible computing is any model of computation where the computational process, to some extent, is time-reversible. In a model of computation that uses deterministic transitions from one state of the abstract machine to another, a necessary condition for reversibility is that the relation of the mapping from states to their successors must be one-to-one. Reversible computing is a form of unconventional computing.
Due to the unitarity of quantum mechanics, quantum circuits are reversible, as long as they do not "collapse" the quantum states on which they operate. [1]
There are two major, closely related types of reversibility that are of particular interest for this purpose: physical reversibility and logical reversibility. [2]
A process is said to be physically reversible if it results in no increase in physical entropy; it is isentropic. There is a style of circuit design ideally exhibiting this property that is referred to as charge recovery logic, adiabatic circuits, or adiabatic computing (see Adiabatic process). Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.
A motivation for the study of technologies aimed at implementing reversible computing is that they offer what is predicted to be the only potential way to improve the computational energy efficiency (i.e., useful operations performed per unit energy dissipated) of computers beyond the fundamental von Neumann–Landauer limit [3] [4] of kT ln(2) energy dissipated per irreversible bit operation. Although the Landauer limit was millions of times below the energy consumption of computers in the 2000s and thousands of times less in the 2010s, [5] proponents of reversible computing argue that this can be attributed largely to architectural overheads which effectively magnify the impact of Landauer's limit in practical circuit designs, so that it may prove difficult for practical technology to progress very far beyond current levels of energy efficiency if reversible computing principles are not used. [6]
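For a sense of scale, the limit can be evaluated numerically. The following Python sketch assumes room temperature (about 300 K) and the standard value of the Boltzmann constant; the function name is illustrative only.

```python
import math

# Boltzmann constant in joules per kelvin (CODATA exact value)
K_BOLTZMANN = 1.380649e-23

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy (in joules) dissipated per irreversible bit operation: kT ln(2)."""
    return K_BOLTZMANN * temperature_kelvin * math.log(2)

if __name__ == "__main__":
    # At roughly room temperature (300 K) the limit works out to about 2.9e-21 J per bit.
    print(f"Landauer limit at 300 K: {landauer_limit(300.0):.3e} J per bit erased")
```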
As was first argued by Rolf Landauer while working at IBM, [7] in order for a computational process to be physically reversible, it must also be logically reversible. Landauer's principle is the observation that the oblivious erasure of n bits of known information must always incur a cost of nk ln(2) in thermodynamic entropy, corresponding to the dissipation of nkT ln(2) of energy as heat. A discrete, deterministic computational process is said to be logically reversible if the transition function that maps old computational states to new ones is a one-to-one function; i.e., the output logical states uniquely determine the input logical states of the computational operation.
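The one-to-one condition can be stated operationally: over a finite state space, a transition function is logically reversible exactly when no two old states share a successor. The following Python sketch illustrates the check on a hypothetical toy transition table of our own.

```python
def is_logically_reversible(transition: dict) -> bool:
    """A deterministic transition function is logically reversible
    iff no two distinct old states map to the same new state."""
    successors = list(transition.values())
    return len(successors) == len(set(successors))

# Toy example: NOT on a single bit is reversible ...
print(is_logically_reversible({0: 1, 1: 0}))   # True
# ... but erasing a bit (resetting it to 0) is not.
print(is_logically_reversible({0: 0, 1: 0}))   # False
```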
For computational processes that are nondeterministic (in the sense of being probabilistic or random), the relation between old and new states is not a single-valued function, and the requirement needed to obtain physical reversibility becomes a slightly weaker condition, namely that the size of a given ensemble of possible initial computational states does not decrease, on average, as the computation proceeds forwards.
Landauer's principle (and indeed, the second law of thermodynamics) can also be understood to be a direct logical consequence of the underlying reversibility of physics, as is reflected in the general Hamiltonian formulation of mechanics, and in the unitary time-evolution operator of quantum mechanics more specifically. [8]
The implementation of reversible computing thus amounts to learning how to characterize and control the physical dynamics of mechanisms that carry out desired computational operations so precisely that a negligible amount of uncertainty about the complete physical state of the mechanism accumulates for each logic operation performed. In other words, it requires precisely tracking the state of the active energy involved in carrying out computational operations within the machine, and designing the machine so that the majority of this energy is recovered in an organized form that can be reused for subsequent operations, rather than being permitted to dissipate as heat.
Although achieving this goal presents a significant challenge for the design, manufacturing, and characterization of ultra-precise new physical mechanisms for computing, there is at present no fundamental reason to think that it cannot eventually be accomplished, allowing us someday to build computers that generate much less than 1 bit's worth of physical entropy (and dissipate much less than kT ln(2) energy as heat) for each useful logical operation that they carry out internally.
Today, the field has a substantial body of academic literature. A wide variety of reversible device concepts, logic gates, electronic circuits, processor architectures, programming languages, and application algorithms have been designed and analyzed by physicists, electrical engineers, and computer scientists.
This field of research awaits the detailed development of a high-quality, cost-effective, nearly reversible logic device technology, one that includes highly energy-efficient clocking and synchronization mechanisms, or avoids the need for these through asynchronous design. This sort of solid engineering progress will be needed before the large body of theoretical research on reversible computing can find practical application in enabling real computer technology to circumvent the various near-term barriers to its energy efficiency, including the von Neumann–Landauer bound. According to the second law of thermodynamics, this bound can only be circumvented by the use of logically reversible computing. [9]
A computational operation is logically reversible if the output (or final state) of the operation can be computed from the input (or initial state), and vice versa. Reversible functions are bijective. This means that reversible gates (and circuits, i.e., compositions of multiple gates) generally have the same number of input bits as output bits (assuming that all input bits are consumed by the operation, and that all input/output states are possible).
An inverter (NOT) gate is logically reversible because it can be undone by applying a second NOT. Depending on its implementation, however, the NOT gate may not be physically reversible.
The exclusive or (XOR) gate is irreversible because its two inputs cannot be unambiguously reconstructed from its single output, or alternatively, because information erasure is not reversible. However, a reversible version of the XOR gate—the controlled NOT gate (CNOT)—can be defined by preserving one of the inputs as a second output. The three-input variant of the CNOT gate is called the Toffoli gate. It preserves two of its inputs, a and b, and replaces the third, c, by c ⊕ (a AND b). With c = 0, this gives the AND function, and with a = b = 1, it gives the NOT function of c. Because AND and NOT together form a functionally complete set, the Toffoli gate is universal and can implement any Boolean function (if given enough initialized ancilla bits).
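To make the behaviour of these gates concrete, the following Python sketch models CNOT and Toffoli on classical bits and verifies the AND and NOT special cases noted above; the function names are our own and not tied to any particular circuit library.

```python
def cnot(a: int, b: int) -> tuple[int, int]:
    """Controlled NOT: preserves the control a and returns (a, a XOR b)."""
    return a, a ^ b

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Toffoli (CCNOT): preserves a and b, replaces c with c XOR (a AND b)."""
    return a, b, c ^ (a & b)

# AND: set the target bit c to 0 (an initialized ancilla); the third output is a AND b.
assert all(toffoli(a, b, 0)[2] == (a & b) for a in (0, 1) for b in (0, 1))

# NOT: fix both controls to 1; the third output is NOT c.
assert all(toffoli(1, 1, c)[2] == 1 - c for c in (0, 1))

# Reversibility: applying the Toffoli gate twice restores the original inputs.
assert toffoli(*toffoli(1, 1, 1)) == (1, 1, 1)
```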
Similarly, in the Turing machine model of computation, a reversible Turing machine is one whose transition function is invertible, so that each machine state has at most one predecessor.
Yves Lecerf proposed a reversible Turing machine in a 1963 paper, [10] but, apparently unaware of Landauer's principle, he did not pursue the subject further, devoting most of the rest of his career to ethnolinguistics. In 1973 Charles H. Bennett, at IBM Research, showed that a universal Turing machine could be made both logically and thermodynamically reversible, [11] and therefore able in principle to perform an arbitrarily large number of computation steps per unit of physical energy dissipated, if operated sufficiently slowly. Thermodynamically reversible computers could perform useful computations at useful speed while dissipating considerably less than kT of energy per logical step. In 1982 Edward Fredkin and Tommaso Toffoli proposed the Billiard ball computer, a mechanism using classical hard spheres to do reversible computations at finite speed with zero dissipation, but requiring perfect initial alignment of the balls' trajectories, and Bennett's review [12] compared these "Brownian" and "ballistic" paradigms for reversible computation. Aside from the motivation of energy-efficient computation, reversible logic gates offer practical improvements to bit-manipulation transforms in cryptography and computer graphics. Since the 1980s, reversible circuits have attracted interest as components of quantum algorithms, and more recently in photonic and nano-computing technologies where some switching devices offer no signal gain.
Surveys of reversible circuits, their construction and optimization, as well as recent research challenges, are available. [13] [14] [15] [16] [17]
A logic gate is a device that performs a Boolean function, a logical operation performed on one or more binary inputs that produces a single binary output. Depending on the context, the term may refer to an ideal logic gate, one that has, for instance, zero rise time and unlimited fan-out, or it may refer to a non-ideal physical device.
A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. In particular, a large-scale quantum computer could break widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.
Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers both to the technical definition in terms of von Neumann entropy and to the general computational term.
In logic circuits, the Toffoli gate, also known as the CCNOT gate (“controlled-controlled-not”), invented by Tommaso Toffoli, is a CNOT gate with two control qubits and one target qubit. That is, the target qubit will be inverted if the first and second qubits are both 1. It is a universal reversible logic gate, which means that any classical reversible circuit can be constructed from Toffoli gates.
An adder, or summer, is a digital circuit that performs addition of numbers. In many computers and other kinds of processors, adders are used in the arithmetic logic units (ALUs). They are also used in other parts of the processor, where they are used to calculate addresses, table indices, increment and decrement operators and similar operations.
In quantum information theory, a quantum circuit is a model for quantum computation, similar to classical circuits, in which a computation is a sequence of quantum gates, measurements, initializations of qubits to known values, and possibly other actions. The minimum set of actions that a circuit needs to be able to perform on the qubits to enable quantum computation is known as DiVincenzo's criteria.
In quantum computing and specifically the quantum circuit model of computation, a quantum logic gate is a basic quantum circuit operating on a small number of qubits. Quantum logic gates are the building blocks of quantum circuits, like classical logic gates are for conventional digital circuits.
Rolf William Landauer was a German-American physicist who made important contributions in diverse areas of the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media. Born in Germany, he emigrated to the U.S. in 1938, obtained a Ph.D. in physics from Harvard in 1950, and then spent most of his career at IBM.
Charles Henry Bennett is a physicist, information theorist and IBM Fellow at IBM Research. Bennett's recent work at IBM has concentrated on a re-examination of the physical basis of information, applying quantum physics to the problems surrounding information exchange. He has played a major role in elucidating the interconnections between physics and information, particularly in the realm of quantum computation, but also in cellular automata and reversible computing. He discovered, with Gilles Brassard, the concept of quantum cryptography and is one of the founding fathers of modern quantum information theory.
The Fredkin gate is a computational circuit suitable for reversible computing, invented by Edward Fredkin. It is universal, which means that any logical or arithmetic operation can be constructed entirely of Fredkin gates. The Fredkin gate is a circuit or device with three inputs and three outputs that transmits the first bit unchanged and swaps the last two bits if, and only if, the first bit is 1.
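As with the Toffoli gate, the controlled-swap behaviour can be sketched on classical bits; the following Python snippet, with illustrative function names of our own, checks that constant inputs yield AND and NOT and that the gate is its own inverse.

```python
def fredkin(c: int, a: int, b: int) -> tuple[int, int, int]:
    """Controlled SWAP: passes the control c through unchanged and
    swaps a and b if and only if c == 1."""
    return (c, b, a) if c else (c, a, b)

# AND: with the third input fixed to 0, the last output is c AND a.
assert all(fredkin(c, a, 0)[2] == (c & a) for c in (0, 1) for a in (0, 1))

# NOT: with inputs (c, 0, 1), the last output is NOT c.
assert all(fredkin(c, 0, 1)[2] == 1 - c for c in (0, 1))

# The gate is its own inverse, so applying it twice restores the inputs.
assert fredkin(*fredkin(1, 0, 1)) == (1, 0, 1)
```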
Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings.
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the expressions for information entropy developed by Claude Shannon and Ralph Hartley in the 1940s.
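For comparison, the standard forms of the two quantities can be written side by side; aside from the Boltzmann constant and the base of the logarithm, they share the same structure.

```latex
% Gibbs entropy (statistical thermodynamics), with k_B the Boltzmann constant
S = -k_B \sum_i p_i \ln p_i
% Shannon entropy (information theory), measured in bits
H = -\sum_i p_i \log_2 p_i
```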
Unconventional computing is computing by any of a wide range of new or unusual methods.
A chemical computer, also called a reaction-diffusion computer, Belousov–Zhabotinsky (BZ) computer, or gooware computer, is an unconventional computer based on a semi-solid chemical "soup" where data are represented by varying concentrations of chemicals. The computations are performed by naturally occurring chemical reactions.
In computational complexity theory and circuit complexity, a Boolean circuit is a mathematical model for combinational digital logic circuits. A formal language can be decided by a family of Boolean circuits, one circuit for each possible input length.
Quantum dot cellular automata are a proposed improvement on conventional (CMOS) computer design, devised in analogy to conventional models of cellular automata introduced by John von Neumann.
Adiabatic circuits are low-power electronic circuits which use "reversible logic" to conserve energy. The term "adiabatic" refers to an ideal thermodynamic process in which no heat or mass is exchanged with the surrounding environment, alluding to the ability of the circuits to reduce energy loss as heat.
In reversible computing, ancilla bits are extra bits used to implement logical operations that would otherwise be irreversible. In classical computation, any memory bit can be turned on or off at will, requiring no prior knowledge or extra complexity. However, this is not the case in quantum computing or classical reversible computing. In these models of computing, all operations on computer memory must be reversible, and toggling a bit on or off would lose the information about the initial value of that bit. For this reason, there is no way in a quantum algorithm to deterministically put bits into a specific prescribed state unless one is given access to bits whose original state is known in advance. Such bits, whose values are known a priori, are known as ancilla bits in a quantum or reversible computing task.
In quantum computing, a qubit is a unit of information analogous to a bit in classical computing, but it is affected by quantum mechanical properties such as superposition and entanglement which allow qubits to be in some ways more powerful than classical bits for some tasks. Qubits are used in quantum circuits and quantum algorithms composed of quantum logic gates to solve computational problems, where they are used for input/output and intermediate computations.
Dmitri Maslov is a Canadian-American computer scientist known for his work on quantum circuit synthesis and optimization, quantum advantage, and benchmarking quantum computers. Currently, he is the Chief Software Architect at IBM Quantum. Maslov was formerly a program director for Quantum Information Science at the National Science Foundation. He was named a Fellow of the Institute of Electrical and Electronics Engineers in 2021 "for contributions to quantum circuit synthesis and optimization, and compiling for quantum computers."
The entropy of a closed system, e.g., a computer with its own batteries, cannot decrease; hence the entropy shed when a bit is restored to a standard state must appear elsewhere as a heating effect, supplying kT ln(2) ≈ 0.693 kT of heat per restored bit to the surroundings.