Observer effect (information technology)

In information technology, the observer effect is the impact on the behaviour of a computer process caused by the act of observing the process while it is running.

Information technology (IT) is the use of computers to store, retrieve, transmit, and manipulate data, or information, often in the context of a business or other enterprise. IT is considered to be a subset of information and communications technology (ICT). An information technology system is generally an information system, a communications system or, more specifically speaking, a computer system – including all hardware, software and peripheral equipment – operated by a limited group of users.

For example, if a process writes to a log file to record its progress, the extra I/O can slow the process down. Furthermore, the act of viewing the file while the process is running could cause an I/O error in the process, which could, in turn, cause it to stop. Another example is measuring the performance of a CPU by running both the observed and the observing programs on the same CPU,[1] which leads to inaccurate results because the observer program itself affects the CPU's performance (modern, heavily cached and pipelined CPUs are particularly sensitive to this kind of observation).
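
The following is a minimal sketch of the first example above (plain C on a POSIX system; the file name, loop size, and logging interval are arbitrary choices for illustration): the same loop is timed once without and once with per-iteration progress logging, and the logged run is measurably slower because the observation itself consumes I/O and CPU time.

```c
/* Minimal sketch (POSIX): timing the same loop with and without
 * per-iteration logging, to show how the act of recording progress
 * perturbs the measurement itself.  File name and counts are arbitrary. */
#include <stdio.h>
#include <time.h>

#define N 1000000

static double elapsed_ms(struct timespec a, struct timespec b) {
    return (b.tv_sec - a.tv_sec) * 1e3 + (b.tv_nsec - a.tv_nsec) / 1e6;
}

static volatile long sink;          /* keeps the compiler from removing the loop */

static void work(FILE *log) {
    for (long i = 0; i < N; i++) {
        sink += i;                  /* the "real" work being observed */
        if (log && i % 1000 == 0)
            fprintf(log, "progress %ld\n", i);   /* the observation */
    }
}

int main(void) {
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    work(NULL);                     /* unobserved run */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("without logging: %.2f ms\n", elapsed_ms(t0, t1));

    FILE *log = fopen("progress.log", "w");
    clock_gettime(CLOCK_MONOTONIC, &t0);
    work(log);                      /* observed run: logging adds I/O overhead */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("with logging:    %.2f ms\n", elapsed_ms(t0, t1));
    if (log) fclose(log);
    return 0;
}
```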

The observer effect can have either a positive or a negative impact on the behaviour of the computer process. A positive impact is seen with heisenbugs: software bugs that diminish or change their faulty behaviour when observation mechanisms, such as debugging, are enabled. Such bugs are usually especially difficult to isolate.

A software bug is an error, flaw, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways. The process of finding and fixing bugs is termed "debugging" and often uses formal techniques or tools to pinpoint bugs. Since the 1950s, some computer systems have also been designed to deter, detect or auto-correct various bugs during operation.

In computer programming jargon, a heisenbug is a software bug that seems to disappear or alter its behavior when one attempts to study it. The term is a pun on the name of Werner Heisenberg, the physicist who first asserted the observer effect of quantum mechanics, which states that the act of observing a system inevitably alters its state. In electronics the traditional term is probe effect, where attaching a test probe to a device changes its behavior.
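
As a concrete illustration, the sketch below (POSIX threads; the counts and the DEBUG_TRACE flag name are invented for the example) shows one common species of heisenbug: a data race whose symptom, lost increments, often shrinks or disappears when a debug trace is compiled in, because the extra I/O changes the thread interleaving.

```c
/* Minimal heisenbug sketch (POSIX threads): two threads increment a
 * shared counter without synchronization.  Lost updates are frequent in
 * the plain build, but compiling with -DDEBUG_TRACE often makes the bug
 * far less visible, because the fprintf slows and re-interleaves the
 * threads -- the observation changes the behaviour being observed. */
#include <stdio.h>
#include <pthread.h>

#define ITERATIONS 1000000

static long counter;                /* shared, intentionally unprotected */

static void *increment(void *arg) {
    (void)arg;
    for (long i = 0; i < ITERATIONS; i++) {
        counter++;                  /* racy read-modify-write */
#ifdef DEBUG_TRACE
        if (i % 100000 == 0)
            fprintf(stderr, "at %ld\n", i);   /* "observing" perturbs timing */
#endif
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, increment, NULL);
    pthread_create(&b, NULL, increment, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("expected %d, got %ld\n", 2 * ITERATIONS, counter);
    return 0;
}
```

Built with `cc -pthread race.c`, the program usually reports a total well below the expected value; built with `cc -pthread -DDEBUG_TRACE race.c`, the loss is frequently masked, though the underlying race is still present.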

Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of computer software or a system.

Related Research Articles

Central processing unit

A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic, controlling, and input/output (I/O) operations specified by the instructions. The computer industry has used the term "central processing unit" at least since the early 1960s. Traditionally, the term "CPU" refers to a processor, more specifically to its processing unit and control unit (CU), distinguishing these core elements of a computer from external components such as main memory and I/O circuitry.

Microcode is a computer hardware technique that imposes an interpreter between the CPU hardware and the programmer-visible instruction set architecture of the computer. As such, the microcode is a layer of hardware-level instructions that implement higher-level machine code instructions or internal state machine sequencing in many digital processing elements. Microcode is used in general-purpose central processing units, although in current desktop CPUs it is only a fallback path for cases that the faster hardwired control unit cannot handle.

Operating system

An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs.

In computing, an optimizing compiler is a compiler that tries to minimize or maximize some attributes of an executable computer program. The most common requirement is to minimize the time taken to execute a program; a less common one is to minimize the amount of memory occupied. The growth of portable computers has created a market for minimizing the power consumed by a program.

Symmetric multiprocessing

Symmetric multiprocessing (SMP) involves a multiprocessor computer hardware and software architecture where two or more identical processors are connected to a single, shared main memory, have full access to all input and output devices, and are controlled by a single operating system instance that treats all processors equally, reserving none for special purposes. Most multiprocessor systems today use an SMP architecture. In the case of multi-core processors, the SMP architecture applies to the cores, treating them as separate processors.

CDC 6600

The CDC 6600 was the flagship of the 6000 series of mainframe computer systems manufactured by Control Data Corporation. Generally considered to be the first successful supercomputer, it outperformed the industry's prior recordholder, the IBM 7030 Stretch, by a factor of three. With performance of up to three megaFLOPS, the CDC 6600 was the world's fastest computer from 1964 to 1969, when it relinquished that status to its successor, the CDC 7600.

Overclocking

Overclocking in the context of computing devices refers to making them "run faster" than originally intended. More specifically, it is the configuration of computer hardware components to operate faster than certified by the original manufacturer, with "faster" meaning a higher clock frequency in megahertz (MHz) or gigahertz (GHz). Commonly, the operating voltage is also increased to maintain a component's operational stability at the accelerated speed. Semiconductor devices operated at higher frequencies and voltages draw more power and produce more heat. An overclocked device may be unreliable or fail completely if the additional heat load is not removed or if the power-delivery components cannot meet the increased demand. Many device warranties state that overclocking or over-specification voids the warranty.

Watchdog timer

A watchdog timer is an electronic timer that is used to detect and recover from computer malfunctions. During normal operation, the computer regularly resets the watchdog timer to prevent it from elapsing, or "timing out". If, due to a hardware fault or program error, the computer fails to reset the watchdog, the timer will elapse and generate a timeout signal. The timeout signal is used to initiate corrective action or actions. The corrective actions typically include placing the computer system in a safe state and restoring normal system operation.
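
The sketch below is a software analogue of this pattern (POSIX `alarm()`/`SIGALRM`; the timeout and iteration counts are arbitrary): the main loop must re-arm the timer, or "kick" the watchdog, before it elapses, and a simulated hang shows the handler taking the corrective action.

```c
/* Minimal software analogue of a watchdog timer (POSIX): the main loop
 * must re-arm alarm() before the timeout elapses.  If the loop hangs,
 * SIGALRM fires and the handler takes corrective action (here: exit so
 * a supervisor could restart the process).  Timeout value is arbitrary. */
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define WATCHDOG_TIMEOUT 2          /* seconds without a "kick" before timeout */

static void watchdog_expired(int sig) {
    (void)sig;
    /* Corrective action: report, go to a safe state, restart.
     * Only async-signal-safe calls are allowed here. */
    const char msg[] = "watchdog timeout -- restarting\n";
    write(STDERR_FILENO, msg, sizeof msg - 1);
    _exit(EXIT_FAILURE);
}

int main(void) {
    signal(SIGALRM, watchdog_expired);

    for (int i = 0; i < 10; i++) {
        alarm(WATCHDOG_TIMEOUT);    /* "kick" the watchdog: restart the countdown */
        /* ... normal work; must finish within WATCHDOG_TIMEOUT ... */
        sleep(1);
        printf("iteration %d ok\n", i);
        if (i == 7)
            sleep(5);               /* simulated hang: the watchdog fires here */
    }
    alarm(0);                       /* disarm when done */
    return 0;
}
```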

Software rot, also known as code rot, bit rot, software erosion, software decay or software entropy, is the slow deterioration of software performance or responsiveness over time that eventually leaves the software faulty, unusable, or "legacy" and in need of an upgrade. This is not a physical phenomenon: the software does not actually decay, but rather fails to remain responsive to, and updated for, the changing environment in which it resides.

Social cognitive theory (SCT), used in psychology, education, and communication, holds that portions of an individual's knowledge acquisition can be directly related to observing others within the context of social interactions, experiences, and outside media influences. This theory was advanced by Albert Bandura as an extension of his social learning theory. The theory states that when people observe a model performing a behavior and the consequences of that behavior, they remember the sequence of events and use this information to guide subsequent behaviors. Observing a model can also prompt the viewer to engage in behavior they already learned. In other words, people do not learn new behaviors solely by trying them and either succeeding or failing, but rather, the survival of humanity is dependent upon the replication of the actions of others. Depending on whether people are rewarded or punished for their behavior and the outcome of the behavior, the observer may choose to replicate behavior modeled. Media provides models for a vast array of people in many different environmental settings.

Software incompatibility is a characteristic of software components or systems which cannot operate satisfactorily together on the same computer, or on different computers linked by a computer network. They may be components or systems which are intended to operate cooperatively or independently. Software compatibility is a characteristic of software components or systems which can operate satisfactorily together on the same computer, or on different computers linked by a computer network. It is possible that some software components or systems may be compatible in one environment and incompatible in another.

Process management (computing)

Process management is an integral part of any modern-day operating system (OS). The OS must allocate resources to processes, enable processes to share and exchange information, protect the resources of each process from other processes and enable synchronization among processes. To meet these requirements, the OS must maintain a data structure for each process, which describes the state and resource ownership of that process, and which enables the OS to exert control over each process.
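
A much-simplified, hypothetical version of that per-process data structure (often called a process control block) might look like the C sketch below; the field names and sizes are illustrative only and are not taken from any particular operating system.

```c
/* Hypothetical, much-simplified process control block (PCB): the
 * per-process record a kernel keeps so it can track state and resource
 * ownership and exert control over the process.  Field names are
 * illustrative only, not taken from any real kernel. */
#include <stdint.h>

enum process_state { PROC_NEW, PROC_READY, PROC_RUNNING, PROC_BLOCKED, PROC_TERMINATED };

struct cpu_context {                /* registers saved across context switches */
    uintptr_t program_counter;
    uintptr_t stack_pointer;
    uintptr_t general_regs[16];
};

struct pcb {
    int                 pid;            /* process identifier */
    enum process_state  state;          /* scheduling state */
    int                 priority;       /* scheduling priority */
    struct cpu_context  context;        /* CPU state saved when preempted */
    uintptr_t           page_table;     /* root of the address-space mapping */
    int                 open_files[32]; /* handles to owned resources */
    struct pcb         *parent;         /* creator, for signalling and cleanup */
    struct pcb         *next_ready;     /* link in the scheduler's ready queue */
};
```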

Kernel (operating system)

The kernel is a computer program that is the core of a computer's operating system, with complete control over everything in the system. On most systems, it is one of the first programs loaded on start-up. It handles the rest of start-up as well as input/output requests from software, translating them into data-processing instructions for the central processing unit. It handles memory and peripherals like keyboards, monitors, printers, and speakers.

Computer hardware

Computer hardware includes the physical, tangible parts or components of a computer, such as the cabinet, central processing unit, monitor, keyboard, computer data storage, graphics card, sound card, speakers and motherboard. By contrast, software is instructions that can be stored and run by hardware. Hardware is so-termed because it is "hard" or rigid with respect to changes or modifications; whereas software is "soft" because it is easy to update or change. Intermediate between software and hardware is "firmware", which is software that is strongly coupled to the particular hardware of a computer system and thus the most difficult to change but also among the most stable with respect to consistency of interface. The progression from levels of "hardness" to "softness" in computer systems parallels a progression of layers of abstraction in computing.

The term "epistemic feedback" is a form of feedback which refers to an interplay between what is being observed and the result of the observation. The concept can apply to a process to obtain information, where the process itself changes the information when being obtained. For example, instead of quietly asking customers for their opinions about food in a restaurant, making an announcement about food quality, as being tested in a survey, could cause cooks to focus on producing high-quality results. The concept can also apply to changing the method of observation, rather than affecting the data. For example, if after asking several customers about food, they noted the food as generally good or fair, then the questions might be changed to ask more specifically which food items were most/least liked. Hence, the interplay can alter either the observations, or the method of observation, or both.

Shared memory

In computer science, shared memory is memory that may be simultaneously accessed by multiple programs with an intent to provide communication among them or avoid redundant copies. Shared memory is an efficient means of passing data between programs. Depending on context, programs may run on a single processor or on multiple separate processors.
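
A minimal POSIX sketch of the idea follows (an anonymous shared mapping inherited across `fork()`; the page size and message are arbitrary): the child writes into the shared region and the parent reads the same bytes, with no copy passed between the two processes.

```c
/* Minimal sketch (POSIX): a parent and child process communicate through
 * a shared, anonymous memory mapping instead of copying data between
 * address spaces.  The mapping size and message text are arbitrary. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    /* One page visible to both processes after fork(). */
    char *shared = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                        MAP_SHARED | MAP_ANONYMOUS, -1, 0);
    if (shared == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    pid_t pid = fork();
    if (pid == 0) {                       /* child: write into the region */
        strcpy(shared, "hello from the child");
        return 0;
    }

    waitpid(pid, NULL, 0);                /* wait so the child's write is visible */
    printf("parent read: \"%s\"\n", shared);   /* same bytes, no copy made */

    munmap(shared, 4096);
    return 0;
}
```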

Meltdown (security vulnerability)

Meltdown is a hardware vulnerability affecting Intel x86 microprocessors, IBM POWER processors, and some ARM-based microprocessors. It allows a rogue process to read all memory, even when it is not authorized to do so.

References

  1. Mytkowicz, Todd; Sweeney, Peter; Hauswirth, Matthias; Diwan, Amer (2008), "Observer Effect and Measurement Bias in Performance Analysis", Computer Science Technical Reports