Input lag

Input lag or input latency is the amount of time that passes between a user input sending an electrical signal and a corresponding action occurring on screen.

In video games the term is often used to describe any latency between input and the game engine, monitor, or any other part of the signal chain reacting to that input, though all contributions to input lag are cumulative.

Potential causes of delay from pressing a button to the game reacting

The potential causes of input lag are described below. Each step in the process, however small, adds to total input lag, but the combined result may not be noticeable if the total is low enough.

Controller sends signal to console

For wired controllers, this lag is normally negligible. For wireless controllers, opinions vary as to the significance of this lag. Some people claim to notice extra lag when using a wireless controller, while other people claim that the 4–8 milliseconds of lag is negligible.[1]

Console/PC processes next frame

A video game console or PC sends out a new frame once it has finished performing the calculations necessary to create it. The number of frames rendered per second (on average) is called the frame rate. Using a common 60 Hz monitor as an example, the maximum theoretical frame rate is 60 FPS (frames per second), which means the minimum theoretical input lag for the overall system is approximately 16.67 ms (1 frame ÷ 60 FPS). The monitor is usually the bottleneck for the theoretical maximum FPS, since there is little point in rendering more frames than the monitor can show. When the CPU, GPU, memory, bus, etc. become the bottleneck, FPS can drop below the monitor's refresh rate.
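The frame-interval arithmetic above can be sketched as a small calculation (the refresh rates are illustrative examples):

```python
# Minimum theoretical input lag contributed by the render loop:
# one full frame interval at the display's refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / refresh_hz

print(frame_time_ms(60))   # ~16.67 ms per frame at 60 Hz
print(frame_time_ms(144))  # ~6.94 ms per frame at 144 Hz
```

Higher refresh rates shrink the per-frame interval, which is why high-refresh monitors can lower the floor on total input lag.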

Individual frames need not be finished within a single screen-refresh interval to be output at an equivalent rate. Game engines often use pipelined architectures to process multiple frames concurrently, making more efficient use of the underlying hardware. This improves throughput but exacerbates input lag, especially at low frame rates.[2][3]
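A rough sketch of why pipelining adds latency: input sampled at the start of the pipeline is displayed only after every stage has processed that frame, so latency scales with pipeline depth times frame time. The three-stage breakdown (simulate, render, present) is an illustrative assumption, not a description of any particular engine:

```python
# Input sampled at the first pipeline stage is shown only after all
# stages have handled that frame, so latency grows with pipeline depth.
def pipeline_latency_ms(stages: int, fps: float) -> float:
    frame_ms = 1000.0 / fps
    return stages * frame_ms

# A hypothetical 3-stage pipeline (simulate -> render -> present):
print(pipeline_latency_ms(3, 60))  # 50.0 ms at 60 FPS
print(pipeline_latency_ms(3, 30))  # 100.0 ms at 30 FPS
```

The same pipeline depth costs twice as much latency at 30 FPS as at 60 FPS, which is why pipelining hurts most at low frame rates.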

Display lag

This is the lag caused by the television or monitor (also called output lag). In addition to the latency imposed by the screen's pixel response time, any image processing (such as upscaling, motion smoothing, or edge smoothing) takes time and therefore adds more input lag. An input lag below 30 ms is generally considered unnoticeable in a television.[4] Once the frame has been processed, the final step is updating the pixels to display the correct color for the new frame. The time this takes is called the pixel response time.

Typical overall response times

Testing has found that overall input lag (from human input to visible response) of approximately 200 ms is distracting to the user. Excluding monitor/television display lag, 133 ms appears to be an average response time, and the most sensitive games (fighting games, first-person shooters, and rhythm games) achieve response times of 67 ms.[5]
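Since all the stages above are cumulative, an end-to-end figure can be estimated by summing per-stage contributions. The individual numbers below are illustrative assumptions chosen for the example, not measurements:

```python
# Rough end-to-end input-lag budget, summing the stages described above.
# All per-stage figures are hypothetical round numbers for illustration.
budget_ms = {
    "controller (wireless)": 6,   # midpoint of the 4-8 ms range
    "game engine (60 FPS)": 33,   # ~2 frame intervals in a pipelined engine
    "display processing": 20,     # TV image-processing delay
    "pixel response": 8,
}
total = sum(budget_ms.values())
print(f"total: {total} ms")  # total: 67 ms
```

Summing stage-by-stage like this also shows where optimization pays off: the largest single contributors (engine pipeline and display processing, under these assumptions) dominate the total.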

References

  1. "Wireless Controller Latency: is It a Problem? - LockerGnome". LockerGnome. 2011-08-27. Archived from the original on 2016-08-28. Retrieved 2016-06-12.
  2. Ericson, Christer (2007-09-22). "Input latency". realtimecollisiondetection.net - the blog.
  3. Carmack, John (2013-02-22). "Latency Mitigation Strategies". #AltDevBlogADay. Archived from the original on 2013-02-25.
  4. "The Dark Side of Overdrive". bit-tech. Retrieved 2016-06-12.
  5. "Console Gaming: The Lag Factor". Eurogamer. Gamer Network. 5 September 2009. p. 2. Retrieved 2016-06-12.