Delta time or delta timing is a concept used among programmers in relation to hardware and network responsiveness. [1] In graphics programming, the term usually refers to variably updating the scene based on the time elapsed since the game last updated [2] (i.e. since the previous "frame"), which varies with the speed of the computer and with how much work the program needs to do at any given time. This also allows graphics to be calculated separately if rendering is multi-threaded.
In network programming, due to the unpredictable nature of internet connections, delta timing is used in a similar way: movement information received over the computer network is applied variably, regardless of how long the next packet of movement data took to arrive. [3]
It is typically done by reading a timer once per frame, which yields the time elapsed between the current call and the previous one. The resulting number (the delta time) is then used to calculate, for instance, how far a video game character should have travelled during that interval. The character therefore takes the same amount of real-world time to move across the screen regardless of the update rate, and regardless of whether the delay is caused by a lack of processing power or by a slow internet connection.
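As a minimal sketch of this technique, the Python snippet below scales a character's per-frame movement by the measured delta time; the loop, player_x, and SPEED are illustrative placeholders rather than parts of any particular engine.

from time import time

SPEED = 120.0                       # desired speed in pixels per second
player_x = 0.0                      # illustrative position variable

last_time = time()
for frame in range(5):              # stand-in for a real game loop
    now = time()
    delta_time = now - last_time    # seconds since the previous frame
    last_time = now
    # Scaling by delta_time makes the distance covered per real-world
    # second the same whether the loop runs at 30 or 300 frames per second.
    player_x += SPEED * delta_time

Because the distance moved each frame is the speed multiplied by that frame's duration, the total distance after one real-world second is the same whether the loop ran 30 times or 300 times.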
In graphics programming, this avoids the gameplay slowing down or speeding up depending on the complexity of what is happening at any given time, which would make for an inconsistent, jarring experience (e.g. time slowing down the more characters walk onto the screen, or running too fast because only one character is on screen). In network programming, this keeps the game world of each computer in sync with the others, by making sure each client eventually sees the same activity at the same time, even if more time has passed since the last update for some clients than others.
Sufficiently large delays will eventually degrade the gameplay experience, but using delta time keeps gameplay consistent as long as the computer and internet connection meet the game's minimum requirements.
Delta time can also be used to measure how long a given program took to execute in real time. The Python snippet below shows how an example function's execution time can be calculated from the delta of the times recorded before and after execution.
# time() returns the seconds since the epoch as a floating-point number
from time import time

# define is_prime(x)
def is_prime(x):
    for i in range(2, x):
        if (x % i) == 0:
            print(x, "is not prime")
            return
    print(x, "is prime")

# measure delta time
t0 = time()     # start of measurement
is_prime(13)    # call function
t1 = time()     # end of measurement
deltaTime = t1 - t0
print("is_prime(13) takes", deltaTime, "seconds to execute!")
Delta time and frame rate are not always related. Video games fall into one of two categories regarding frame rate: frame-rate dependent or frame-rate independent. Frame-rate-dependent games have a frame rate that varies based on the computer running the software. For example, if a frame-rate-dependent game runs at 300 frames per second (fps) on a computer with a refresh rate of 120 hertz (Hz), it would run at 150 fps on a computer with a refresh rate of 60 Hz. A standard delta-time expression can create pause screens and intentional slow-motion effects in frame-rate-dependent games. Standard delta-time formulas are seldom used for ordinary gameplay, however, because the delta time between frames varies greatly with the refresh rate of the computer on which the game is running.
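One way such pause and slow-motion effects can be built, sketched below with a hypothetical time_scale multiplier and a made-up frame time, is to scale the measured delta time before applying it to game objects.

raw_delta = 1.0 / 60.0    # example measured frame time (~60 fps)
time_scale = 0.5          # hypothetical multiplier: 1.0 = normal speed,
                          #   0.5 = slow motion, 0.0 = paused
delta_time = raw_delta * time_scale
# Objects advanced with delta_time now move at half speed, and they
# freeze entirely when time_scale is set to 0.0.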
If a game is frame-rate independent, the frame rate is preset and the game runs identically on all computers regardless of their specifications. Frame-rate independence limits the maximum graphics quality in order to make the game available to more consumers, and it is particularly popular in mobile games and in games optimized for lower-end computers such as Chromebooks. In frame-rate-independent games, the delta time between frames is consistent across the entire game. This standardization means that one delta-time expression can produce a consistent frame rate for all users on all types of computers. [4]
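As a rough illustration of a preset frame rate, the sketch below assumes a hypothetical update() function and a made-up TARGET_FPS, and holds the frame rate steady by sleeping away whatever time remains after each frame's work.

from time import time, sleep

TARGET_FPS = 30                       # preset frame rate (assumed value)
FRAME_TIME = 1.0 / TARGET_FPS         # one fixed delta time for every frame

def update(delta_time):
    pass                              # hypothetical game logic

for frame in range(5):                # stand-in for a real game loop
    frame_start = time()
    update(FRAME_TIME)                # every frame advances by the same delta
    elapsed = time() - frame_start
    if elapsed < FRAME_TIME:
        sleep(FRAME_TIME - elapsed)   # wait out the remainder of the frame

On a machine fast enough to finish each frame's work within FRAME_TIME, every frame then lasts the same real-world duration, so a single fixed delta-time value can be used throughout the game.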
Delta timing excels whenever a game's frame rate needs to be independent of its hardware, for example in games that must run on mobile devices or low-end computers. Video game developers also use delta timing to standardize the movement speed of objects on the screen: if a character moves across the screen at a constant rate, delta timing ensures that this speed is consistent and does not fluctuate. Using delta timing for movement is particularly useful for players with inconsistent internet connections or low-end computer hardware.
One disadvantage of using delta timing for movement is that it can become complicated in games that incorporate a wide variety of movements and movement speeds. For example, a game could specify a walking speed and a sprinting speed for all characters, while also letting characters drive cars, boats, planes, and other vehicles. If the developer wants each of these movement speeds to differ (to make the game as realistic as possible), they need a separate delta-time expression for each one, and that only works if each movement occurs at a constant rate. If the movements do not occur at a constant rate, delta-timing expressions of this kind are ineffective. [5]
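The sketch below illustrates how these expressions multiply, assuming made-up constant speeds for walking, sprinting, and driving; every name and value here is hypothetical.

delta_time = 1.0 / 60.0        # example frame time (~60 fps)

WALK_SPEED = 1.5               # metres per second (assumed values)
SPRINT_SPEED = 6.0
CAR_SPEED = 30.0

# One delta-time expression per constant-rate movement type:
walk_step = WALK_SPEED * delta_time
sprint_step = SPRINT_SPEED * delta_time
car_step = CAR_SPEED * delta_time
print(walk_step, sprint_step, car_step)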