A flicker fixer or scan doubler is a piece of computer hardware that de-interlaces an output video signal. The flicker fixer accomplishes this by adjusting the timing of the natively interlaced video signal to suit the needs of a progressive display, such as a CRT computer monitor. In essence, a flicker fixer creates a progressive frame of video from two interlaced fields of video. [1]
Flicker fixers sample the NTSC/PAL output of the computer and store each scan line of the field currently being displayed in RAM, while simultaneously outputting each line interleaved with the corresponding neighbouring line from the previously stored field (weaving). Some more advanced flicker fixers integrated into add-on graphics cards use more sophisticated methods. Outputting the image at double the scan rate essentially composes a progressive display containing all lines from both fields at the full vertical refresh rate. This raises the horizontal frequency of the signal from 15.734 kHz to 31.47 kHz (in the NTSC case; the PAL figures are slightly lower), which can then be used to drive a VGA monitor. [2] [3] [4]
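The following is a minimal sketch of the "weave" scan doubling described above, assuming each field arrives as a list of scan lines (each line a list of pixel values) and that a whole field fits in memory. The function name weave_fields and the toy data are illustrative only, not taken from any real flicker-fixer design.

```python
def weave_fields(previous_field, current_field, current_is_odd):
    """Interleave two consecutive fields into one progressive frame.

    The even field supplies output lines 0, 2, 4, ... and the odd field
    supplies output lines 1, 3, 5, ..., so twice as many lines are emitted
    per field period (the doubled scan rate of a flicker fixer).
    """
    if current_is_odd:
        even_field, odd_field = previous_field, current_field
    else:
        even_field, odd_field = current_field, previous_field

    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # line stored from the even field
        frame.append(odd_line)   # neighbouring line from the odd field
    return frame


# Toy usage: two 3-line fields become one 6-line progressive frame.
even = [[10, 10], [20, 20], [30, 30]]
odd = [[15, 15], [25, 25], [35, 35]]
print(weave_fields(even, odd, current_is_odd=True))
```

In hardware the same idea is realised with a line or field buffer in RAM read out at twice the input line rate, rather than with whole-frame lists as in this sketch.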
One computer capable of producing an interlaced image is the Amiga, whose default video modes use NTSC or PAL timings. NTSC and PAL interlaced screens have two fields, called odd and even, which alternate every 60th of a second on NTSC or every 50th of a second on PAL. This allows a higher vertical resolution within a narrower signal bandwidth than full 50 or 60 frame-per-second progressive video would require, but it can also produce a noticeable jittering effect on graphics with high-contrast detail between the fields. This NTSC/PAL compatibility gave the Amiga a distinct edge in uses such as television production and gaming. However, since the original Amigas could not produce vertically high-resolution displays without flickering, they were poorly suited to office-like use, where one might need to work with a clear high-resolution image for several hours. Flicker fixers were devised to remedy this.
A later Amiga model, the Amiga 3000, included a custom chip called Amber which could perform flicker fixing on any signal. The ECS and AGA chipsets could also output VGA display modes directly. For the Amiga 2000 series, Commodore offered the A2320 Display Enhancer board, which fit neatly in the video slot and supported the new video modes offered by the Enhanced Chip Set (ECS) and AmigaOS 2.0, including Productivity mode. The earlier A2024 'Hedley' greyscale monitor also featured an integrated flicker fixer, supporting up to 8 shades of grey.
The Amiga 500, also known as the A500, was the first popular version of the Amiga home computer, "redefining the home computer market and making so-called luxury features such as multitasking and colour a standard long before Microsoft or Apple sold these to the masses." It contains the same Motorola 68000 as the Amiga 1000, as well as the same graphics and sound coprocessors, but is in a smaller case similar to that of the Commodore 128.
The Enhanced Chip Set (ECS) is the second generation of the Amiga computer's chipset, offering minor improvements over the original chipset (OCS) design. ECS was introduced in 1990 with the launch of the Amiga 3000. Another version was developed around 1994 but was unreleased due to Commodore International filing for bankruptcy. Amigas produced from 1990 onwards featured a mix of OCS and ECS chips, such as later versions of the Amiga 500 and the Commodore CDTV. Other ECS models were the Amiga 500+ in 1991 and lastly the Amiga 600 in 1992.
The Original Chip Set (OCS) is a chipset used in the earliest Commodore Amiga computers and defined the Amiga's graphics and sound capabilities. It was succeeded by the slightly improved Enhanced Chip Set (ECS) and the greatly improved Advanced Graphics Architecture (AGA).
Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances the viewer's perception of motion and reduces flicker by taking advantage of the characteristics of the human visual system.
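As a rough illustration of the two-field structure described above, the sketch below splits one progressive frame into its even and odd fields; the helper name split_into_fields is made up for this example.

```python
def split_into_fields(frame):
    """Return (even_field, odd_field) from a progressive frame (list of lines)."""
    even_field = frame[0::2]  # lines 0, 2, 4, ... transmitted in one field
    odd_field = frame[1::2]   # lines 1, 3, 5, ... transmitted a field period later
    return even_field, odd_field


frame = [[1, 1], [2, 2], [3, 3], [4, 4]]
print(split_into_fields(frame))  # ([[1, 1], [3, 3]], [[2, 2], [4, 4]])
```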
Video Graphics Array (VGA) is a video display controller and accompanying de facto graphics standard, first introduced with the IBM PS/2 line of computers in 1987, which became ubiquitous in the IBM PC compatible industry within three years. The term can now refer to the computer display standard, the 15-pin D-subminiature VGA connector, or the 640 × 480 resolution characteristic of the VGA hardware.
Amiga Advanced Graphics Architecture (AGA) is the third-generation Amiga graphic chipset, first used in the Amiga 4000 in 1992. Before release AGA was codenamed Pandora by Commodore International.
Enhanced-definition television, or extended-definition television (EDTV), is a Consumer Electronics Association (CEA) marketing shorthand term for certain digital television (DTV) formats and devices. Specifically, the term defines an extension of the standard-definition television (SDTV) format that enables a clearer picture during high-motion scenes than previous iterations of SDTV, though it does not produce images as detailed as high-definition television (HDTV).
The refresh rate, also known as vertical refresh rate or vertical scan rate in reference to terminology originating with the cathode-ray tubes (CRTs), is the number of times per second that a raster-based display device displays a new image. This is independent from frame rate, which describes how many images are stored or generated every second by the device driving the display. On CRT displays, higher refresh rates produce less flickering, thereby reducing eye strain. In other technologies such as liquid-crystal displays, the refresh rate affects only how often the image can potentially be updated.
The display resolution or display modes of a digital television, computer monitor, or other display device is the number of distinct pixels in each dimension that can be displayed. It can be an ambiguous term especially as the displayed resolution is controlled by different factors in cathode-ray tube (CRT) displays, flat-panel displays and projection displays using fixed picture-element (pixel) arrays.
Progressive segmented frame (PsF) is a scheme designed to acquire, store, modify, and distribute progressive-scan video using interlaced equipment.
Flicker-free is a term given to video displays, primarily cathode-ray tubes, operating at a high refresh rate to reduce or eliminate the perception of screen flicker. For televisions, this involves operating at a 100 Hz or 120 Hz field rate, compared to standard televisions that operate at 50 Hz (PAL) or 60 Hz (NTSC); most simply, this is done by displaying each field twice rather than once. For computer displays, this usually means a refresh rate of 70–90 Hz, sometimes 100 Hz or higher. It should not be confused with motion interpolation, though the two may be combined.
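Below is a minimal sketch of the field-repetition approach mentioned above: a 50 Hz sequence of fields is replayed with each field shown twice, giving a 100 Hz field rate. The function name repeat_fields and the string labels are purely illustrative.

```python
def repeat_fields(field_sequence, repeats=2):
    """Emit every incoming field `repeats` times in a row (e.g. 50 Hz -> 100 Hz)."""
    doubled = []
    for field in field_sequence:
        doubled.extend([field] * repeats)
    return doubled


fields_50hz = ["even_0", "odd_0", "even_1", "odd_1"]
print(repeat_fields(fields_50hz))
# ['even_0', 'even_0', 'odd_0', 'odd_0', 'even_1', 'even_1', 'odd_1', 'odd_1']
```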
Low-definition television (LDTV) refers to TV systems that have a lower screen resolution than standard-definition television systems. The term is usually used in reference to digital television, in particular when broadcasting at the same resolution as low-definition analog television systems. Mobile DTV systems usually transmit in low definition, as do all slow-scan television systems.
A video display controller (VDC), also called a display engine or display interface, is an integrated circuit which is the main component in a video-signal generator, a device responsible for the production of a TV video signal in a computing or game system. Some VDCs also generate an audio signal, but that is not their main function. VDCs were used in the home computers of the 1980s and also in some early video picture systems.
Scan conversion is a video processing technique for changing the vertical or horizontal scan frequency of a video signal for different purposes and applications. The device that performs this conversion is called a scan converter.
A multiple-sync (multisync) monitor, also known as a multiscan or multimode monitor, is a raster-scan analog video monitor that can properly synchronise with multiple horizontal and vertical scan rates. In contrast, fixed frequency monitors can only synchronise with a specific set of scan rates. They are generally used for computer displays, but sometimes for television, and the terminology is mostly applied to CRT displays although the concept applies to other technologies.
The Amiga video connector is a 23-pin male D-subminiature connector fitted to all personal computers in the Amiga range produced by Commodore International from 1985 to 1994, and by Escom from 1995 to 1996. The connector carries signals for analogue and digital RGB, RGB intensity, and genlocking as well as power.
In addition to the Amiga chipsets, various specially designed chips have been used in Commodore Amiga computers that do not belong to the 'Amiga chipset' in a strict sense.
Composite artifact colors refers to a technique used in several graphics modes of some 1970s and 1980s home computers. On some machines connected to an NTSC TV or monitor over a composite video output, the way the video signal was encoded allowed extra colors to be displayed, beyond each machine's hardware color palette, by manipulating the position of pixels on screen.