Dual-ported video RAM

Dual-ported video RAM (VRAM) is a dual-ported variant of dynamic RAM (DRAM), which was once commonly used to store the framebuffer in graphics adapters.

Dual-ported RAM allows the CPU to read from and write to the memory as if it were a conventional DRAM chip, while adding a second port that reads out data in a serial fashion. This makes it easy to interface with a video display controller (VDC), which sends a timing signal to the memory and receives data in the correct sequence as it draws the screen. Because the CPU and VDC access the memory simultaneously on different ports, dual-ported RAM does not require the CPU to pause while the VDC uses memory, thereby eliminating the associated wait states and improving overall system performance.

Dual-ported RAM was common from the mid-1980s into the mid-1990s, after which new forms of high-performance memory emerged that eventually replaced dual-ported designs. As these other forms of memory are also known as video memory, and thus VRAM, they are sometimes confused with this older form of memory.

[Image: Samsung Electronics VRAM]

History

Early computers used dynamic RAM [note 1] to store video data for output to a conventional television, or to a television modified to accept composite video input. Such displays require the video hardware to output a very accurately timed signal. At the speeds contemporary memory worked at, reading out data to feed the video hardware consumed much of the memory devices' available performance. This conflicted with the need for the central processing unit (CPU) to write data into the same memory for the video system to read, as both could not use the memory at the same time.

Two general solutions were used to avoid these timing issues. In higher-priced systems, the video hardware had its own dedicated memory, with a separate mechanism for the CPU to store data into it. This eliminated any possibility of contention, but at the cost of requiring separate memory in an era when memory was very expensive. The CPU also almost always communicated with this memory over a slow system bus, which limited the rate at which the screen could be updated and made interactive graphics difficult. The other solution, used by most home computers, was a single shared bank of memory, with the video hardware controlling access and pausing the CPU when needed. This could slow overall computing performance, as the CPU was repeatedly put into these "wait states", but it had the advantage of being less expensive while allowing the CPU to update the display more rapidly and thus provide more interactivity.

By the early 1980s, the introduction of much higher-resolution monitors that demanded larger framebuffers, and the newly introduced graphical user interfaces (GUIs) that required high resolution and high overall performance, made the performance of the video system an increasingly difficult problem. Complex systems like the Amiga Agnus emerged to carefully control access to memory and reduce contention, but while these reduced the problem they did not eliminate it.

The solution was to use memory that could be accessed by the CPU and the video hardware at the same time. It was invented by F. Dill, D. Ling and R. Matick at IBM Research in 1980, with a patent issued in 1985 (US Patent 4,541,075). [1] The first commercial use of VRAM was in a high-resolution graphics adapter introduced in 1986 by IBM for its RT PC system, which set a new standard for graphics displays. Prior to the development of VRAM, dual-ported memory was quite expensive, limiting higher-resolution bitmapped graphics to high-end workstations. VRAM improved overall framebuffer throughput, allowing low-cost, high-resolution, high-speed color graphics, and was thus a key ingredient in the worldwide proliferation of GUI-based operating systems at the time.

Description

Dynamic RAM is internally arranged in an array of rows and columns of capacitors, with each row/column intersection holding a single bit in a cell. In typical use, a CPU accessing a DRAM will ask for a small amount of data at a time, possibly a single byte. To read a byte for the CPU, the DRAM decodes the provided address into a series of eight cells, reads the entire row containing those cells, and latches the requested data so it can be read on the data bus. At the time, rows were commonly 1,024 cells wide.

Reads from a DRAM are destructive, meaning that the act of reading a row also erases the data stored in it. To preserve the data, every read has to be followed by the DRAM writing the same data back to that row. To accomplish this, latches for the entire row are included, and the latched data is written back to the row while the CPU reads out the requested byte. Considered as a whole, this means the DRAM repeatedly reads entire rows of data, selects a single byte from that data while discarding the rest, and then writes it all back again.
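
As a rough illustration, the following is a minimal Python sketch of this read cycle for a hypothetical DRAM with 1,024-bit rows. The Dram class and its method names are invented for illustration and do not correspond to any real device interface:

    # Hypothetical model of a DRAM read cycle: the whole row is sensed,
    # which destroys its contents; one byte is selected for the CPU; and
    # the latched row is written back to restore the data.
    ROW_BITS = 1024  # row width, matching the 1,024-cell rows noted above

    class Dram:
        def __init__(self, num_rows):
            self.rows = [[0] * ROW_BITS for _ in range(num_rows)]

        def read_byte(self, row, col):
            latch = self.rows[row][:]            # sense amplifiers latch the row
            self.rows[row] = [0] * ROW_BITS      # sensing is destructive...
            self.rows[row] = latch               # ...so the row is written back
            bits = latch[col * 8:(col + 1) * 8]  # only 8 of 1,024 bits are used
            return sum(bit << i for i, bit in enumerate(bits))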

VRAM operates by not discarding the excess bits in the row. Instead, the data read into the row storage is also sent to a second set of latches and an associated bit shifter. From that point, the data can be read out one bit at a time by triggering the shifter, which requires only a single pin. VRAM generally does not have two address buses, so the CPU and the video hardware still have to interleave their accesses to the chip; but because a whole row of data is read out to the video hardware at once, and that row might represent multiple scan lines on the screen, the number of times the video system has to interrupt the CPU is greatly reduced. [2]

Such operation is described in the paper "All points addressable raster display memory" by R. Matick, D. Ling, S. Gupta, and F. Dill, IBM Journal of Research and Development, Vol. 28, No. 4, July 1984, pp. 379–393. To use the video port, the controller first uses the DRAM port to select the row of the memory array that is to be displayed. The VRAM then copies that entire row to an internal row buffer, which is a shift register. The controller can then continue to use the DRAM port for drawing objects on the display. Meanwhile, the controller feeds a clock called the shift clock (SCLK) to the VRAM's video port. Each SCLK pulse causes the VRAM to deliver the next data bit, in strict address order, from the shift register to the video port. For simplicity, the graphics adapter is usually designed so that the contents of a row, and therefore of the shift register, correspond to a complete horizontal line on the display.
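
The dual-port behaviour can be sketched in the same hedged style as the DRAM model above. The Vram class and its method names are again invented for illustration; a real part multiplexes row and column addresses and has many more control signals:

    # Hypothetical model of a VRAM's two ports: a random-access DRAM port
    # used by the CPU, plus a serial port fed from an internal shift register.
    ROW_BITS = 1024

    class Vram:
        def __init__(self, num_rows):
            self.rows = [[0] * ROW_BITS for _ in range(num_rows)]  # DRAM array
            self.shift_register = [0] * ROW_BITS
            self.shift_pos = 0

        def write_bit(self, row, col, bit):
            # DRAM port: ordinary random access, used by the CPU for drawing.
            self.rows[row][col] = bit

        def transfer_row(self, row):
            # "Read transfer": copy one whole row into the shift register.
            # The DRAM port is free for the CPU as soon as this completes.
            self.shift_register = list(self.rows[row])
            self.shift_pos = 0

        def sclk(self):
            # One shift-clock (SCLK) pulse: deliver the next bit from the
            # shift register, in strict address order.
            bit = self.shift_register[self.shift_pos]
            self.shift_pos += 1
            return bit

    # A controller refreshing one scan line: one row transfer, then one
    # SCLK pulse per pixel, while the CPU retains use of the DRAM port.
    vram = Vram(num_rows=512)
    vram.transfer_row(0)  # assume row 0 holds scan line 0
    scan_line = [vram.sclk() for _ in range(ROW_BITS)]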

Shift to SDRAM

Through the 1990s, many graphics subsystems used VRAM, with the number of megabits touted as a selling point. In the late 1990s, synchronous DRAM technologies gradually became affordable, dense, and fast enough to displace VRAM, even though they are only single-ported and incur more access overhead. Nevertheless, many of the VRAM concepts of internal, on-chip buffering and organization have been used and improved in modern graphics adapters.

Notes

  1. And in some early systems, static RAM or shift registers.

References

  1. US Patent 4,541,075, retrieved 2017-06-07.
  2. SM55161A 262144×16 bit VRAM data sheet (PDF), Austin Semiconductor, retrieved 2009-03-02.