Multisync monitor

A multiple-sync (multisync) monitor, also known as a multiscan or multimode monitor, is a raster-scan analog video monitor that can properly synchronise with multiple horizontal and vertical scan rates. [1] [2] In contrast, fixed-frequency monitors can only synchronise with a specific set of scan rates. Multisync monitors are generally used for computer displays, but sometimes for television, and the terminology is mostly applied to CRT displays although the concept also applies to other display technologies.

Multiscan computer monitors appeared during the mid-1980s, offering flexibility as computer video hardware shifted from producing a single fixed scan rate to multiple possible scan rates. [3] "MultiSync" itself was originally a trademark of one of NEC's first multiple-sync monitors. [4]

Computers

History

Early home computers output video to ordinary televisions or composite monitors, utilizing television display standards such as NTSC, PAL or SECAM. These display standards had fixed scan rates, and only used the vertical and horizontal sync pulses embedded in the video signals to ensure synchronization, not to set the actual scan rates.

Early dedicated computer monitors still often relied on fixed scan rates. IBM's original 1981 PC, for instance, was sold with a choice of two video cards (MDA and CGA), each intended for use with custom IBM monitors that likewise used fixed scan rates. The CGA timings were identical to NTSC television, whereas the MDA card used a custom, higher-resolution timing to provide better text quality. Early Macintosh monitors also used fixed scan rates.

In 1984, IBM's EGA added a second resolution, which necessitated a monitor supporting two scan rates: the original CGA rate as well as a second rate for the new video modes. [5] This monitor, as well as others that could be manually switched between the two sync rates, was known as a dual-scan display. [6]

The NEC Multisync was released in 1985 for use with the IBM PC, supporting a wide range of sync frequencies including those for CGA, EGA, various extended forms of those standards marketed by third party vendors, and standards yet to be released. [4]

IBM's 1987 VGA standard, in turn, expanded this to three fixed scan rates. At this point, PC and Mac owners with multiple graphics cards required a separate monitor for each of them, [7] and by the late 1980s each of the computer video standards below required a monitor supporting a small number of specific frequencies (a simple compatibility check over these figures is sketched after the list):

  1. PAL, NTSC, CGA: ~15.7 kHz horizontal scan, 50 or 60 Hz vertical scan
  2. EGA: 15.7 kHz (CGA-compatible mode) or 21.8 kHz horizontal scan, 60 Hz vertical scan
  3. VGA: 31.5 kHz horizontal scan, 60 or 70 Hz vertical scan. No support for CGA/EGA timings; CGA/EGA resolutions are transmitted to the monitor at VGA-compatible timings.
  4. XGA: 35.5 kHz horizontal scan, 87 Hz (43.5 Hz interlaced) vertical scan (plus VGA modes)
  5. Many different display formats for Macintosh, Sun, NeXT, and other microcomputers
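
The figures above can be restated as data. The sketch below (Python, purely illustrative; the data structure and helper function are assumptions, not taken from the cited sources) checks which of the listed standards a hypothetical fixed-frequency monitor could display:

    # Nominal scan rates restated from the list above; structure and helper are illustrative.
    VIDEO_STANDARDS = {
        "NTSC/PAL/CGA":     {"h_khz": (15.7,),       "v_hz": (50, 60)},
        "EGA":              {"h_khz": (15.7, 21.8),  "v_hz": (60,)},
        "VGA":              {"h_khz": (31.5,),       "v_hz": (60, 70)},
        "XGA (interlaced)": {"h_khz": (35.5,),       "v_hz": (87,)},
    }

    def standards_for(monitor_h_khz, monitor_v_hz, tol_khz=0.5):
        """Return the standards a fixed-frequency monitor with these rates could display."""
        return [name for name, t in VIDEO_STANDARDS.items()
                if any(abs(h - monitor_h_khz) <= tol_khz for h in t["h_khz"])
                and monitor_v_hz in t["v_hz"]]

    print(standards_for(15.7, 60))   # ['NTSC/PAL/CGA', 'EGA'] - the CGA-compatible rates
    print(standards_for(31.5, 60))   # ['VGA']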

After 1987's VGA, the IBM-compatible market began to develop Super VGA cards which used many different scan rates, culminating in the VESA BIOS Extensions (VBE), which established standardized methods for outputting many different resolutions from one card, and eventually the Generalized Timing Formula, which permitted graphics cards to output arbitrary resolutions.

By the late 1990s, graphics cards for microcomputers were available with specifications ranging from 1024x768 at 60 Hz to at least 1600x1200 at 85 Hz. [8] In addition to these higher resolutions and frequencies, during system boot on systems like the IBM PC the display would operate at a standard low resolution, such as the PC standard of 720x400 at 70 Hz. A monitor capable of displaying both the boot-time mode and the higher-resolution modes would need to scan horizontally across a range of at least roughly 31 to 68 kHz.
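
That range can be estimated with back-of-the-envelope arithmetic: the horizontal scan rate is roughly the number of scan lines per frame (active lines plus vertical blanking) multiplied by the refresh rate. The sketch below assumes a flat 5% blanking allowance, which is an illustrative simplification; real standards such as VESA GTF and CVT compute blanking precisely.

    def approx_horizontal_khz(active_lines, refresh_hz, blanking_fraction=0.05):
        """Rough estimate: (active lines + assumed vertical blanking) x frames per second."""
        total_lines = active_lines * (1 + blanking_fraction)
        return total_lines * refresh_hz / 1000.0

    # Boot-time 720x400 text mode (400 active lines at 70 Hz): about 29 kHz with
    # this crude blanking guess; the real mode is about 31.5 kHz.
    print(round(approx_horizontal_khz(400, 70), 1))

    # A 1024x768 mode at 85 Hz (768 active lines): roughly 68-69 kHz, the upper
    # end of the range mentioned above.
    print(round(approx_horizontal_khz(768, 85), 1))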

In response, VESA established a standardized list of display resolutions, refresh rates, and accompanying timing for hardware manufacturers. [9] This was superseded by VESA's Generalized Timing Formula, which provided a standard method to derive the timing of an arbitrary display mode from its sync pulses, [10] and this in turn was superseded by VESA's Coordinated Video Timings standard.

Implementation

Early multisync monitors, designed for systems with a small number of specific frequencies such as CGA, EGA and VGA or built-in Macintosh graphics, supported a limited set of fixed frequencies. On the IBM PC, the mode in use was signaled from the graphics card to the monitor through the polarities of one or both of the H- and V-sync signals sent by the video adapter. [5]
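
The polarity scheme can be illustrated with a small lookup. The mapping below follows the commonly documented VGA convention, in which the monitor infers the vertical size of the image from which sync signals are inverted; it is included as an assumed example rather than taken from the cited manual.

    # Illustrative polarity-based mode signalling (assumed VGA-era convention):
    # the monitor infers the number of active lines from the H- and V-sync polarities.
    POLARITY_TO_LINES = {
        ("+", "-"): 350,   # EGA-compatible 350-line modes
        ("-", "+"): 400,   # 400-line modes, e.g. 720x400 text
        ("-", "-"): 480,   # 480-line graphics modes
    }

    def inferred_vertical_size(h_polarity, v_polarity):
        """Return the active line count a monitor would assume, or None if unknown."""
        return POLARITY_TO_LINES.get((h_polarity, v_polarity))

    print(inferred_vertical_size("-", "+"))   # 400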

Later designs supported a continuous range of scan frequencies; the NEC Multisync, for example, supported horizontal scan rates from 15 to 31 kHz, [4] derived from the timing of the sync signals rather than their polarity. [11] Displays like these could be used with multiple platforms and video cards as long as the frequencies were within range.

Modern monitors produced using the VESA frequency standards generally support arbitrary scan rates between specific minimum and maximum horizontal and vertical rates. Most modern multiscan computer monitors have a minimum horizontal scan frequency of 31 kHz. [12]

In both multisync and fixed-sync monitors, correct timing is important to prevent image distortion and even damage to components. [13] Most modern multiscan monitors are microprocessor-controlled [14] and will refuse to attempt to synchronise to an unsupported scan rate, which usually protects them from damage.
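
A minimal sketch of that protective behaviour, assuming a monitor characterised only by a supported horizontal and vertical frequency range (the ranges below are invented for illustration):

    # Sketch of a microprocessor-controlled multiscan monitor's mode acceptance:
    # measure the incoming sync frequencies and refuse to drive the deflection
    # circuits when they fall outside the supported ranges (ranges are invented).
    H_RANGE_KHZ = (31.0, 96.0)
    V_RANGE_HZ = (56.0, 160.0)

    def accept_mode(h_khz, v_hz):
        """Return True only if both measured rates are within the supported ranges."""
        return (H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
                and V_RANGE_HZ[0] <= v_hz <= V_RANGE_HZ[1])

    print(accept_mode(31.5, 70))   # True: VGA-style timing
    print(accept_mode(15.7, 60))   # False: CGA/NTSC timing, below the 31 kHz minimum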

Non-CRT monitors

The multisync concept applies to non-CRT monitors, such as LCDs, but is implemented differently.

LCD monitors are fixed-pixel displays, in which the number of rows and columns of pixels is constant, set by the construction of the panel. When the input signal has a resolution that does not match the number of pixels in the display, the LCD controller must still populate the same number of image elements.

This is accomplished either by scaling the image up or down as needed, creating a picture that does not have a 1:1 relationship between LCD image elements and pixels in the original image, or by displaying the image unscaled in the center of the monitor and filling the space on all sides with black pixels. While stand-alone LCD monitors generally accept a wide range of horizontal scan rates, the majority of LCDs accept only 60 Hz to 75 Hz vertical scan rates. In recent years, LCD monitors designed for gaming have appeared on the market offering vertical scan rates of 120 Hz and up. [15] These monitors are usually referred to by their specific maximum refresh rate.
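
Both strategies can be sketched in a few lines. The scaling shown is simple nearest-neighbour resampling, a stand-in for whatever algorithm a real LCD controller's scaler uses; the functions and the tiny test frame are illustrative only.

    def scale_nearest(image, out_w, out_h):
        """Fill the whole panel by nearest-neighbour resampling of the input frame."""
        in_h, in_w = len(image), len(image[0])
        return [[image[y * in_h // out_h][x * in_w // out_w]
                 for x in range(out_w)] for y in range(out_h)]

    def centre_with_border(image, out_w, out_h, black=0):
        """Show the input unscaled in the middle of the panel, padded with black."""
        in_h, in_w = len(image), len(image[0])
        top, left = (out_h - in_h) // 2, (out_w - in_w) // 2
        out = [[black] * out_w for _ in range(out_h)]
        for y in range(in_h):
            out[top + y][left:left + in_w] = image[y]
        return out

    src = [[1, 2], [3, 4]]                 # a tiny 2x2 stand-in for an input frame
    print(scale_nearest(src, 4, 4))        # scaled to fill a 4x4 "panel"
    print(centre_with_border(src, 4, 4))   # centred in a 4x4 "panel" with a black border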

Television

CRT televisions are typically designed to operate only with the video standard of the country they are sold in (PAL, NTSC, SECAM), but some sets, particularly broadcast monitors, can operate on multiple standards.

References

  1. "13 What's the difference between fixed frequency and multisynchronous monitors?". 070808 stason.org
  2. "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. Multimode monitors can measure the incoming sync signal frequencies and thus sync to any frequency within their range of operation.
  3. "MultiSync 25th Anniversary  The Evolution of the MultiSync".
  4. 1 2 3 Inc, InfoWorld Media Group (1986-10-27). InfoWorld. InfoWorld Media Group, Inc.{{cite book}}: |last= has generic name (help)
  5. 1 2 IBM Enhanced Color Display Manual (PDF). p. 1.
  6. Inc, InfoWorld Media Group (1988-08-22). InfoWorld. InfoWorld Media Group, Inc.{{cite book}}: |last= has generic name (help)
  7. Inc, InfoWorld Media Group (1988-08-22). InfoWorld. InfoWorld Media Group, Inc.{{cite book}}: |last= has generic name (help)
  8. Inc, InfoWorld Media Group (1997-12-15). InfoWorld. InfoWorld Media Group, Inc.{{cite book}}: |last= has generic name (help)
  9. Inc, Ziff Davis (July 1993). PC Mag. Ziff Davis, Inc.{{cite book}}: |last= has generic name (help)
  10. "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. Q: How will GTF help the monitor automatically set itself to any timing format? / A: GTF defines the relationship between syncs and video signals at any frequency of operation. The display can measure the incoming sync frequency, and thus can predict where the image will start and finish, even though it may not have been preset at that operating point.
  11. "PC Mag 1987-03-31 : Free Download, Borrow, and Streaming". Internet Archive. 31 March 1987. Retrieved 2020-08-16.
  12. "Converters | RetroRGB" . Retrieved 2020-08-16.
  13. "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. Sync signals for displays drastically affect the quality, performance and even reliability of CRT displays. Even small differences in timing parameters can significantly affect image position and size, causing problems for the user. Difference in blanking times can lead to excessive power dissipation and electrical stress in the scanning circuits, or at the other extreme, incomplete or distorted images being displayed.
  14. "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. In order to identify the mode, most present day multiple frequency monitors use a simple microcontroller to measure syncs.
  15. "List of 120Hz monitors  Includes 144Hz, 240Hz Blur Busters".