Refresh rate

The refresh rate, also known as the vertical refresh rate or vertical scan rate (terminology originating with cathode-ray tubes, or CRTs), is the number of times per second that a raster-based display device displays a new image. It is independent of frame rate, which describes how many images are stored or generated per second by the device driving the display. On CRT displays, higher refresh rates produce less flickering, thereby reducing eye strain. In other technologies, such as liquid-crystal displays, the refresh rate affects only how often the image can potentially be updated. [1]

Non-raster displays may not have a characteristic refresh rate. Vector displays, for instance, do not trace the entire screen, only the lines comprising the displayed image, so refresh speed may vary with the size and complexity of the image data. [2] For computer programs or telemetry, the term is sometimes applied to how frequently a datum is updated with a new external value from another source (for example, a shared public spreadsheet or a hardware feed).

Physical factors

While all raster display devices have a characteristic refresh rate, the physical implementation differs between technologies.

Cathode-ray tubes

Electron beam in the process of refreshing an image on a CRT

Raster-scan CRTs by their nature must refresh the screen since their phosphors will fade and the image will disappear quickly unless refreshed regularly.

In a CRT, the vertical scan rate is the number of times per second that the electron beam returns to the upper left corner of the screen to begin drawing a new frame. [3] It is controlled by the vertical blanking signal generated by the video controller, and is partially limited by the monitor's maximum horizontal scan rate.

The refresh rate can be calculated from the horizontal scan rate by dividing the scanning frequency by the number of horizontal lines, multiplied by a factor that allows time for the beam to return to the top; by convention, this factor is 1.05. [4] For instance, a monitor with a horizontal scanning frequency of 96 kHz at a resolution of 1280 × 1024 yields a refresh rate of 96,000 ÷ (1024 × 1.05) ≈ 89 Hz (rounded down).
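The calculation above can be sketched in a few lines of Python. The function name is illustrative; the default 1.05 blanking factor comes from the convention described here:

```python
# Estimate a CRT's refresh rate from its horizontal scan rate.
# The 1.05 factor approximates the vertical blanking overhead, per convention.
def refresh_rate_hz(horizontal_scan_hz, vertical_lines, blanking_factor=1.05):
    return horizontal_scan_hz / (vertical_lines * blanking_factor)

# 96 kHz horizontal scan at a resolution of 1280 x 1024:
print(int(refresh_rate_hz(96_000, 1024)))  # 89
```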

CRT refresh rates have historically been an important factor in video game programming. On early video game systems, the only time available for computation was during the vertical blanking interval, when the beam has finished the frame and is returning to the top of the screen and no image is being drawn. [5] Even in modern games, it remains important to avoid altering the computer's video buffer except during the vertical retrace, to prevent flickering graphics or screen tearing.

Liquid-crystal displays

Unlike CRTs, where the image fades unless refreshed, the pixels of a liquid-crystal display retain their state for as long as power is provided, so there is no intrinsic flicker regardless of refresh rate. However, the refresh rate still determines the highest frame rate that can be displayed, and although there is no actual blanking of the screen, the vertical blanking interval is still a period in each refresh cycle when the screen is not being updated, during which the image data in the host system's frame buffer can be updated. Vsync options can eliminate screen tearing by restricting frame buffer updates to this interval, so the whole image changes at once.

Computer displays

A video of a CPU fan rotating at 0, 300 and 1300 revolutions per minute, recorded at 60 frames per second

On smaller CRT monitors (up to about 15 in or 38 cm), few people notice any discomfort at refresh rates between 60 and 72 Hz. On larger CRT monitors (17 in or 43 cm and up), most people experience mild discomfort unless the refresh rate is set to 72 Hz or higher. A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 fps. This is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker, its backlight, typically operates at a minimum of around 200 Hz.

Different operating systems set the default refresh rate differently. Microsoft Windows 95 and Windows 98 (First and Second Editions) set the refresh rate to the highest rate they believe the display supports. Windows NT-based operating systems, such as Windows 2000 and its descendants Windows XP, Windows Vista and Windows 7, default to a conservative rate, usually 60 Hz. Some fullscreen applications, including many games, allow the user to reconfigure the refresh rate before entering fullscreen mode, but most default to a conservative resolution and refresh rate and let the user increase the settings in the options.

Old monitors could be damaged if a user set the video card to a refresh rate higher than the highest rate supported by the monitor. Some models of monitors display a notice that the video signal uses an unsupported refresh rate.

Dynamic refresh rate

Some LCDs support adapting their refresh rate to the current frame rate delivered by the graphics card. Two technologies that allow this are FreeSync and G-Sync.

Stereo displays

When LCD shutter glasses are used for stereo 3D displays, the effective refresh rate is halved, because each eye needs a separate picture. For this reason, it is usually recommended to use a display capable of at least 120 Hz, because half of that rate is again 60 Hz. Higher refresh rates result in greater image stability; for example, 72 Hz non-stereo is 144 Hz stereo, and 90 Hz non-stereo is 180 Hz stereo. Most low-end computer graphics cards and monitors cannot handle these high refresh rates, especially at higher resolutions.
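The halving arithmetic can be made explicit with a minimal sketch (the helper name is hypothetical):

```python
# With shutter glasses, frames alternate between eyes, so each eye
# effectively sees half of the display's refresh rate.
def per_eye_rate_hz(display_hz):
    return display_hz // 2

for hz in (120, 144, 180):
    print(hz, "Hz stereo ->", per_eye_rate_hz(hz), "Hz per eye")
```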

For LCD monitors the pixel brightness changes are much slower than CRT or plasma phosphors. Typically LCD pixel brightness changes are faster when voltage is applied than when voltage is removed, resulting in an asymmetric pixel response time. With 3D shutter glasses this can result in a blurry smearing of the display and poor depth perception, due to the previous image frame not fading to black fast enough as the next frame is drawn.[ citation needed ]

Televisions

This GIF animation shows a rudimentary comparison of how motion varies with 4 Hz, 12 Hz, and 24 Hz refresh rates. The entire sequence has a frame rate of 24 Hz.

The development of televisions in the 1930s was shaped by a number of technical limitations. The AC power line frequency was used for the vertical refresh rate for two reasons. The first was that the television's vacuum tube was susceptible to interference from the unit's power supply, including residual ripple, which could cause drifting horizontal bars (hum bars). Using the same frequency reduced this, and made any remaining interference static on the screen and therefore less obtrusive. The second was that television studios used AC lamps, and filming at a different frequency would cause strobing. [7] [8] [9] Thus producers had little choice but to run sets at 60 Hz in America and 50 Hz in Europe. These rates formed the basis for the sets used today: 60 Hz System M (almost always used with NTSC color coding) and 50 Hz System B/G (almost always used with PAL or SECAM color coding). This accident of circumstance gave European sets higher resolution in exchange for lower frame rates; compare System M (704 × 480 at 30i) and System B/G (704 × 576 at 25i). However, the lower refresh rate of 50 Hz introduces more flicker, so sets that use digital technology to double the refresh rate to 100 Hz became very popular. (See Broadcast television systems.)

Another difference between 50 Hz and 60 Hz standards is the way motion pictures (film sources, as opposed to video camera sources) are transferred or presented. 35 mm film is typically shot at 24 frames per second (fps). For 50 Hz PAL this allows film sources to be transferred easily by accelerating the film by 4%. The resulting picture is therefore smooth; however, there is a small shift in the pitch of the audio. NTSC sets display 24 fps material without any speed shifting by using a technique called 3:2 pulldown, at the expense of introducing unsmooth playback in the form of telecine judder.
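The 3:2 pulldown cadence can be illustrated with a minimal sketch; the function name and frame labels are hypothetical, and only the alternating 2-field/3-field pattern comes from the technique described above:

```python
# 3:2 pulldown: alternate film frames are held for 2 and 3 video fields,
# mapping 24 film frames per second onto 60 fields (30 interlaced frames).
def pulldown_3_2(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

# Four film frames become ten fields: the 4:10 ratio matches 24:60.
print(pulldown_3_2(["A", "B", "C", "D"]))
```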

Similar to some computer monitors and some DVDs, analog television systems use interlace, which decreases the apparent flicker by painting first the odd lines and then the even lines (these are known as fields). This doubles the refresh rate compared to a progressive scan image at the same frame rate. This works perfectly for video cameras, where each field results from a separate exposure: the effective frame rate doubles, so there are now 50 rather than 25 exposures per second. The dynamics of a CRT are ideally suited to this approach: fast scenes benefit from the 50 Hz refresh, since the earlier field has largely decayed away by the time the new field is written, while static images benefit from improved resolution, as both fields are integrated by the eye. Modern CRT-based televisions may be made flicker-free in the form of 100 Hz technology.

Many high-end LCD televisions now have a 120 or 240 Hz (current and former NTSC countries) or 100 or 200 Hz (PAL/SECAM countries) refresh rate. The rate of 120 Hz was chosen because it is the least common multiple of 24 fps (cinema) and 30 fps (NTSC TV), and it allows for less distortion when films are viewed because it eliminates the need for telecine (3:2 pulldown). For PAL at 25 fps, 100 or 200 Hz is used as a fractional compromise, since the true least common multiple of 24 and 25 is an impractical 600. These higher refresh rates are most effective with 24p source material (e.g. Blu-ray Disc) and/or scenes of fast motion. [10]
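The least-common-multiple reasoning can be checked directly; this minimal sketch uses Python's standard library (`math.lcm` requires Python 3.9+):

```python
from math import lcm

# 120 Hz divides evenly by both cinema (24 fps) and NTSC video (30 fps)...
print(lcm(24, 30))  # 120
# ...whereas accommodating film and 25 fps PAL video exactly would require
# 600 Hz, hence the 100/200 Hz compromises.
print(lcm(24, 25))  # 600
```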

Displaying movie content on a TV

As movies are usually filmed at a rate of 24 frames per second, while television sets operate at different rates, some conversion is necessary. Different techniques exist to give the viewer an optimal experience.

The combination of content production, playback device, and display device processing may also introduce unwanted artifacts. A display device producing a fixed 60 fps rate cannot display a 24 fps movie at an even, judder-free rate. Usually a 3:2 pulldown is used, giving slightly uneven movement.

While common multisync CRT computer monitors have been capable of running at even multiples of 24 Hz since the early 1990s, recent "120 Hz" LCDs have been produced for the purpose of having smoother, more fluid motion, depending upon the source material, and any subsequent processing done to the signal. In the case of material shot on video, improvements in smoothness just from having a higher refresh rate may be barely noticeable. [11]

In the case of filmed material, as 120 is an even multiple of 24, it is possible to present a 24 fps sequence without judder on a well-designed 120 Hz display (i.e., so-called 5-5 pulldown). If the 120 Hz rate is produced by frame-doubling a 60 fps 3:2 pulldown signal, the uneven motion could still be visible (i.e., so-called 6-4 pulldown).
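The difference between the two cadences can be sketched as follows (the helper function and frame labels are illustrative, not from the article):

```python
# Repeat each film frame for a number of display refreshes given by a
# repeating cadence pattern.
def apply_cadence(pattern, frames):
    shown = []
    for i, frame in enumerate(frames):
        shown.extend([frame] * pattern[i % len(pattern)])
    return shown

film = ["A", "B", "C", "D"]
even = apply_cadence([5], film)       # 5-5 pulldown: every frame held equally
uneven = apply_cadence([6, 4], film)  # 6-4 pulldown: inherits 3:2 unevenness
print(len(even), len(uneven))  # both fill 20 refreshes of a 120 Hz display
```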

Additionally, material may be displayed with synthetically created smoothness with the addition of motion interpolation abilities to the display, which has an even larger effect on filmed material.

"50 Hz" TV sets (when fed with "50 Hz" content) usually get a movie that is slightly faster than normal, avoiding any problems with uneven pulldown.


References

  1. "How To Change the Screen Refresh Rate of Your Monitor in Windows XP". [dead link]
  2. "What is the Refresh Rate of Monitor". [dead link]
  3. "The Perfect Display". PC Magazine. Ziff Davis, Inc. July 1993. p. 177.
  4. "XFree86-Video-Timings-HOWTO". TLDP.
  5. Kohler, Chris (2009-03-13). "Racing the Beam: How Atari 2600's Crazy Hardware Changed Game Design". Wired. ISSN 1059-1028. Retrieved 2020-08-16.
  6. Qazi, Atif. "What is Monitor Refresh Rate". Tech Gearoid. Retrieved May 15, 2019.
  7. Dorf, Richard C. (26 September 1997). The Electrical Engineering Handbook, Second Edition. p. 1538. ISBN 9781420049763. Retrieved 25 June 2015.
  8. Emmerson, Andrew. "Lines, frames and frequencies". soundscape.info. Archived from the original on 23 July 2006. Retrieved 25 June 2015.
  9. "Television broadcasting video standards". tvradioworld.com. Retrieved 25 June 2015.
  10. "What is monitor refresh rate?". [dead link]
  11. "Six things you need to know about 120Hz LCD TVs". Archived from the original on 2007-10-28.