A multiple-sync (multisync) monitor, also known as a multiscan or multimode monitor, is a raster-scan analog video monitor that can properly synchronise with multiple horizontal and vertical scan rates.[1][2] In contrast, a fixed-frequency monitor can only synchronise with a specific set of scan rates. Multisync monitors are generally used as computer displays, but sometimes for television, and the terminology is mostly applied to CRT displays although the concept applies to other technologies.
Multiscan computer monitors appeared during the mid-1980s, offering flexibility as computer video hardware shifted from producing a single fixed scan rate to multiple possible scan rates.[3] "MultiSync" was originally the trademark of one of NEC's first multiple-sync monitors.[4]
Early home computers output video to ordinary televisions or composite monitors, utilizing television display standards such as NTSC, PAL or SECAM. These display standards had fixed scan rates, and only used the vertical and horizontal sync pulses embedded in the video signals to ensure synchronization, not to set the actual scan rates.
Early dedicated computer monitors likewise often relied on fixed scan rates. IBM's original 1981 PC, for instance, was sold with a choice of two video cards (MDA and CGA) intended for use with custom IBM monitors that also used fixed scan rates. The CGA timings were identical to NTSC television, whereas the MDA card used a custom, higher-resolution timing to provide better text quality. Early Macintosh monitors also used fixed scan rates.
In 1984, IBM's EGA added a second resolution, which necessitated a monitor supporting two scan rates: the original CGA rate as well as a second scan rate for the new video modes.[5] Monitors such as this, along with others that could be manually switched between two sync rates, were known as dual-scan displays.[6]
The NEC Multisync, released in 1985 for use with the IBM PC, supported a wide range of sync frequencies, including those of CGA, EGA, the various extended forms of those standards marketed by third-party vendors, and standards yet to be released.[4]
IBM's 1987 VGA standard, in turn, expanded to three fixed scan rates. At this point, PC and Mac owners with multiple graphics cards required a separate monitor for each of them,[7] and by the late 1980s the major computer video standards each required a monitor supporting its own small set of specific frequencies.
After 1987's VGA, the IBM-compatible market began to develop Super VGA cards which used many different scan rates. This culminated in the VESA BIOS Extensions (VBE), which established standardized methods for outputting many different resolutions from one card, and eventually the Generalized Timing Formula, which permitted graphics cards to output arbitrary resolutions.
By the late 1990s, graphics cards for microcomputers were available with specifications ranging from 1024x768 at 60 Hz to at least 1600x1200 at 85 Hz.[8] In addition to these higher resolutions and frequencies, during system boot on systems like the IBM PC, the display would operate at a standard low resolution, such as the PC standard 720x400 at 70 Hz. A monitor capable of displaying both the low-resolution boot mode and such higher-resolution modes would need to scan horizontally over a range from at least 31 to 68 kHz.
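The scan-rate arithmetic behind these ranges is simple: the beam must trace every scanline, visible plus vertical blanking, once per refresh. A minimal sketch, where the 5% blanking overhead is an assumed placeholder (real VESA timings derive it precisely):

```python
def horizontal_khz(visible_lines: int, refresh_hz: float,
                   blanking_fraction: float = 0.05) -> float:
    """Estimate the horizontal scan rate in kHz for a display mode.

    blanking_fraction is an assumed vertical-blanking overhead;
    real timing standards (GTF, CVT) compute it exactly.
    """
    total_lines = visible_lines * (1 + blanking_fraction)
    return total_lines * refresh_hz / 1000.0

# 720x400 at 70 Hz (the PC boot mode) lands near 31 kHz, while
# 1600x1200 at 85 Hz demands well over 100 kHz.
```

This illustrates why a single monitor covering both boot-time text mode and late-1990s high-resolution modes needed such a wide horizontal sweep range.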
In response, VESA established a standardized list of display resolutions, refresh rates, and accompanying timing for hardware manufacturers. [9] This was superseded by VESA's Generalized Timing Formula, which provided a standard method to derive the timing of an arbitrary display mode from its sync pulses, [10] and this in turn was superseded by VESA's Coordinated Video Timings standard.
Early multisync monitors designed for use with systems having a small number of specific frequencies (such as CGA, EGA and VGA, or built-in Macintosh graphics) supported a limited set of fixed frequencies. On the IBM PC, the desired mode was signaled from the graphics card to the monitor through the polarities of one or both of the H- and V-sync signals.[5]
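The polarity signalling can be sketched as a simple lookup. The mapping below follows the commonly documented IBM VGA convention relating sync polarities to the vertical resolution, and should be treated as illustrative rather than authoritative:

```python
# (hsync_positive, vsync_positive) -> visible scanlines, per the
# commonly documented IBM VGA polarity convention (illustrative).
POLARITY_TO_LINES = {
    (True,  False): 350,  # EGA-compatible 640x350 modes
    (False, True):  400,  # 720x400 / 640x400 modes
    (False, False): 480,  # 640x480 graphics modes
}

def decode_mode(hsync_positive: bool, vsync_positive: bool) -> int:
    """Return the expected visible line count for a polarity pair,
    raising ValueError for an unrecognised combination."""
    try:
        return POLARITY_TO_LINES[(hsync_positive, vsync_positive)]
    except KeyError:
        raise ValueError("unknown sync polarity combination")
```

A monitor using this scheme needs no frequency measurement at all: two wires' polarities select one of a handful of preset deflection configurations.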
Later designs supported a continuous range of scan frequencies; the NEC Multisync, for example, supported horizontal scan rates from 15 to 31 kHz,[4] derived from the sync signal timing rather than from the polarity of the sync signals.[11] Displays like these could be used with multiple platforms and video cards as long as the frequencies were within range.
Modern monitors produced using the VESA frequency standards generally support arbitrary scan rates between specific minimum and maximum horizontal and vertical rates. Most modern multiscan computer monitors have a minimum horizontal scan frequency of 31 kHz. [12]
In both multisync and fixed-sync monitors, timing is important to prevent image distortion and even damage to components.[13] Most modern multiscan monitors are microprocessor controlled[14] and will refuse to synchronise to an unsupported scan rate, which usually protects them from damage.
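That protective behaviour amounts to a range check: the controller measures the incoming sync frequencies and drives the deflection circuits only when both fall inside the supported window. A minimal sketch, with hypothetical ranges not taken from any real datasheet:

```python
def try_sync(h_khz: float, v_hz: float,
             h_range=(31.0, 96.0), v_range=(56.0, 120.0)) -> bool:
    """Return True if the monitor should attempt to lock onto the
    signal. The supported ranges here are hypothetical examples."""
    h_ok = h_range[0] <= h_khz <= h_range[1]
    v_ok = v_range[0] <= v_hz <= v_range[1]
    # Out-of-range input: stay blanked (typically showing an
    # "out of range" message) rather than stress the scan circuits.
    return h_ok and v_ok
```

With the hypothetical 31 kHz floor above, a 15.7 kHz CGA/NTSC-rate signal would be rejected, mirroring the behaviour of most modern multiscan monitors.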
The multisync concept applies to non-CRT monitors, such as LCDs, but is implemented differently.
LCD monitors are fixed-pixel displays, where the number of rows and columns displayed on the screen are constant, set by the construction of the panel. When the input signal has a resolution that does not match the number of pixels in the display, the LCD controller must still populate the same number of image elements.
This is accomplished either by scaling the image up or down as needed, creating a picture that does not have a 1:1 relationship between LCD image elements and pixels in the original image, or by displaying the image unscaled in the center of the monitor, filling the spaces on all sides with black pixels. While stand-alone LCD monitors generally accept a wide range of horizontal scan rates, the majority of LCDs accept only 60 Hz to 75 Hz vertical scan rates. In recent years, LCD monitors designed for gaming have appeared on the market offering vertical scan rates of 120 Hz and up. [15] These monitors are usually referred to by their specific max refresh rate.
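The two strategies reduce to a small geometry calculation: either scale uniformly to fill as much of the panel as the aspect ratio allows, or place the image 1:1 at the centre with a black border. A sketch (function name and interface are illustrative):

```python
def place_image(src_w: int, src_h: int, panel_w: int, panel_h: int,
                scale: bool = True):
    """Return (out_w, out_h, x_offset, y_offset) for the displayed image."""
    if scale:
        # Uniform scale preserving aspect ratio; a non-integer factor
        # means no 1:1 mapping between source and panel pixels.
        f = min(panel_w / src_w, panel_h / src_h)
        out_w, out_h = round(src_w * f), round(src_h * f)
    else:
        out_w, out_h = src_w, src_h  # unscaled, pixel-for-pixel
    # Centre the result; the remaining border is filled with black.
    return out_w, out_h, (panel_w - out_w) // 2, (panel_h - out_h) // 2
```

For example, a 640x480 source on a 1920x1080 panel is either scaled 2.25x to 1440x1080 with black bars at the sides, or shown unscaled as a small centred window.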
CRT televisions are typically designed to operate only with the video standard of the country they are sold in (PAL, NTSC, SECAM), but some sets, particularly broadcast monitors, can operate on multiple standards.
Sync signal timing drastically affects the quality, performance and even reliability of CRT displays. Even small differences in timing parameters can significantly shift image position and size, causing problems for the user, and differences in blanking times can lead to excessive power dissipation and electrical stress in the scanning circuits, or, at the other extreme, incomplete or distorted images. To identify the incoming mode, most present-day multiple-frequency monitors use a simple microcontroller to measure the sync frequencies.