State (computer science)

In information technology and computer science, a system is described as stateful if it is designed to remember preceding events or user interactions; [1] the remembered information is called the state of the system.

The set of states a system can occupy is known as its state space. In a discrete system, the state space is countable and often finite. The system's internal behaviour or interaction with its environment consists of separately occurring individual actions or events, such as accepting input or producing output, that may or may not cause the system to change its state. Examples of such systems are digital logic circuits and components, automata and formal languages, computer programs, and computers.

The output of a digital circuit or deterministic computer program at any time is completely determined by its current inputs and its state. [2]

Digital logic circuit state

Digital logic circuits can be divided into two types: combinational logic, whose output signals are dependent only on its present input signals, and sequential logic, whose outputs are a function of both the current inputs and the past history of inputs. [3] In sequential logic, information from past inputs is stored in electronic memory elements, such as flip-flops. The stored contents of these memory elements, at a given point in time, are collectively referred to as the circuit's state and contain all the information about the past to which the circuit has access. [4]

Since each binary memory element, such as a flip-flop, has only two possible states, one or zero, and there is a finite number of memory elements, a digital circuit has only a finite number of possible states. If N is the number of binary memory elements in the circuit, the maximum number of states a circuit can have is 2^N.
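
As a rough illustration (a hypothetical Python sketch, not a hardware description language), a bank of N binary memory elements can be modeled as a list of bits; the circuit's state is then the tuple of stored values, of which there are at most 2^N:

    # Hypothetical sketch: a bank of N binary memory elements (flip-flops).
    # The circuit's state is the tuple of stored bits; with N elements
    # there are at most 2**N distinct states.
    class RegisterBank:
        def __init__(self, n):
            self.bits = [0] * n          # each memory element holds 0 or 1

        def clock(self, inputs):
            # On a clock edge, latch the input bits into the memory elements.
            self.bits = [b & 1 for b in inputs]

        def state(self):
            return tuple(self.bits)      # the circuit's state at this instant

    reg = RegisterBank(3)
    reg.clock([1, 0, 1])
    print(reg.state())                   # (1, 0, 1) -- one of 2**3 = 8 states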

Program state

Similarly, a computer program stores data in variables, which represent storage locations in the computer's memory. The contents of these memory locations, at any given point in the program's execution, are called the program's state. [5] [6] [7]
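
A minimal sketch of this idea (the variable names are arbitrary): after each statement executes, the program's state is the snapshot of all variable contents at that point.

    # The program's state is the contents of its variables at a given
    # point in execution, shown here as a snapshot after each statement.
    x = 1            # state: {x: 1}
    y = x + 2        # state: {x: 1, y: 3}
    x = y * 2        # state: {x: 6, y: 3}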

A more specialized definition of state is used for computer programs that operate serially or sequentially on streams of data, such as parsers, firewalls, communication protocols and encryption. Serial programs operate on the incoming data characters or packets sequentially, one at a time. In some of these programs, information about previous data characters or packets received is stored in variables and used to affect the processing of the current character or packet. This is called a stateful protocol and the data carried over from the previous processing cycle is called the state. In others, the program has no information about the previous data stream and starts fresh with each data input; this is called a stateless protocol.
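
The contrast can be sketched in Python (a hypothetical illustration; the class and function names are invented for this example). The stateless handler's output depends only on the current input, while the stateful one carries information from previous cycles in a variable:

    # Hypothetical sketch contrasting stateless and stateful stream handling.
    def stateless_upper(ch):
        # Output depends only on the current character.
        return ch.upper()

    class StatefulDeduplicator:
        # Output depends on the current character AND remembered state.
        def __init__(self):
            self.prev = None              # state carried between cycles

        def process(self, ch):
            duplicate = (ch == self.prev)
            self.prev = ch                # update state for the next character
            return None if duplicate else ch

    dedup = StatefulDeduplicator()
    out = [dedup.process(c) for c in "aabbc"]
    print([c for c in out if c])          # ['a', 'b', 'c']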

Imperative programming is a programming paradigm (a way of designing a programming language) that describes computation in terms of the program state and of the statements that change that state. Changes of state are implicit, managed by the program runtime, so that a subroutine can observe changes of state made by other parts of the program; such observable changes are known as side effects.
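
For example (an illustrative Python sketch with invented names), one subroutine mutates shared state and another observes that change as a side effect:

    # In the imperative style, statements mutate shared program state, and
    # one subroutine observes changes made by another (a side effect).
    balance = 100                 # shared program state

    def deposit(amount):
        global balance
        balance += amount         # statement that changes the state

    def report():
        print(balance)            # sees the side effect of deposit()

    deposit(50)
    report()                      # prints 150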

In declarative programming languages, the program describes the desired results and does not specify changes to the state directly.

In functional programming, state is usually represented with temporal logic as explicit variables that represent the program state at each step of a program execution: a state variable is passed as an input parameter of a state-transforming function, which returns the updated state as part of its return value. A pure functional subroutine only has visibility of changes of state represented by the state variables in its scope.
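
This style can be sketched in Python (a hypothetical example; the function and dictionary keys are invented). The state is an explicit value passed in and returned, rather than a location that is mutated:

    # A pure function takes the current state as an argument and returns
    # the updated state, instead of mutating anything in place.
    def deposit(state, amount):
        # The old state is untouched; a new state value is constructed.
        return {**state, "balance": state["balance"] + amount}

    s0 = {"balance": 100}
    s1 = deposit(s0, 50)
    print(s0["balance"], s1["balance"])   # 100 150 -- each step's state is explicit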

Finite state machines

The output of a sequential circuit or computer program at any time is completely determined by its current inputs and current state. Since each binary memory element has only two possible states, 0 or 1, the total number of different states a circuit can assume is finite, and fixed by the number of memory elements. If there are N binary memory elements, a digital circuit can have at most 2^N distinct states. The concept of state is formalized in an abstract mathematical model of computation called a finite state machine, used to design both sequential digital circuits and computer programs.
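
A minimal Python sketch of a finite state machine (illustrative only): a two-state machine that tracks the parity of 1-bits seen so far, so its single bit of memory yields 2^1 = 2 states.

    # A finite state machine with two states, tracking the parity of the
    # 1-bits in the input (one binary memory element, so 2**1 = 2 states).
    TRANSITIONS = {
        ("even", 0): "even",
        ("even", 1): "odd",
        ("odd", 0): "odd",
        ("odd", 1): "even",
    }

    def run(bits, state="even"):
        for b in bits:
            state = TRANSITIONS[(state, b)]   # next state is determined by
        return state                          # current state and current input

    print(run([1, 0, 1, 1]))   # 'odd' -- three 1-bits seen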

Examples

An example of an everyday device that has a state is a television set. To change the channel of a TV, the user usually presses a "channel up" or "channel down" button on the remote control, which sends a coded message to the set. In order to calculate the new channel that the user desires, the digital tuner in the television must have stored in it the number of the current channel it is on. It then adds one or subtracts one from this number to get the number for the new channel, and adjusts the TV to receive that channel. This new number is then stored as the current channel. Similarly, the television also stores a number that controls the level of volume produced by the speaker. Pressing the "volume up" or "volume down" buttons increments or decrements this number, setting a new level of volume. Both the current channel and current volume numbers are part of the TV's state. They are stored in non-volatile memory, which preserves the information when the TV is turned off, so when it is turned on again the TV will return to its previous station and volume level.
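
The television example can be sketched as code (a hypothetical Python model, not an actual tuner implementation): the stored channel and volume numbers are the state, and each button press reads, updates, and stores them.

    # Hypothetical model of the television example: the current channel and
    # volume numbers are the set's state, updated by button presses.
    class Television:
        def __init__(self, channel=1, volume=10):
            self.channel = channel    # stored state
            self.volume = volume      # stored state

        def channel_up(self):
            self.channel += 1         # read old state, compute, store new state

        def volume_down(self):
            self.volume = max(0, self.volume - 1)

    tv = Television()
    tv.channel_up()
    print(tv.channel, tv.volume)      # 2 10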

As another example, the state of a microprocessor is the contents of all the memory elements in it: the accumulators, storage registers, data caches, and flags. When computers such as laptops go into hibernation mode to save energy by shutting down the processor, the state of the processor is stored on the computer's hard disk, so it can be restored when the computer comes out of hibernation, and the processor can take up operations where it left off.

References

  1. "What is stateless? - Definition from WhatIs.com". techtarget.com.
  2. Harris, David Money; Harris, Sarah L. (2007). Digital Design and Computer Architecture. USA: Morgan Kaufmann. p. 103. ISBN   978-0123704979.
  3. Kaeslin, Hubert (2008). Digital Integrated Circuit Design: From VLSI Architectures to CMOS Fabrication. UK: Cambridge University Press. p. 735. ISBN   978-0521882675.
  4. Srinath, N. K. (August 2005). 8085 Microprocessor: Programming and Interfacing. Prentice-Hall of India Pvt. Ltd. p. 326. ISBN   978-8120327856 . Retrieved 7 December 2012. page 46
  5. Laplante, Philip A. (2000). Dictionary of Computer Science, Engineering and Technology. USA: CRC Press. p. 466. ISBN   978-0849326912.
  6. Misra, Jayadev (2001). A Discipline of Multiprogramming: Programming Theory for Distributed Applications. Springer. p. 14. ISBN   978-0387952062.
  7. Prata, Stephen Prata (2004). C Primer Plus, 5th Ed. Pearson Education. pp. 113–114. ISBN   978-0132713603.