In computer architecture, shared graphics memory refers to a design where the graphics chip does not have its own dedicated memory, and instead shares the main system RAM with the CPU and other components.
This design is used with many integrated graphics solutions to reduce the cost and complexity of the motherboard, as no additional memory chips are required on the board. There is usually some mechanism (via the BIOS or a jumper setting) to select how much system memory to reserve for graphics, so the graphics system can be tailored to use only as much RAM as is actually required, leaving the rest free for applications. A side effect is that RAM allocated for graphics becomes effectively unavailable for anything else: for example, a computer with 512 MiB of RAM configured to reserve 64 MiB for graphics will appear to the operating system and user to have only 448 MiB installed.
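As an illustration of the arithmetic, the following sketch shows how firmware might subtract the graphics reservation before reporting memory to the operating system. The carve_out_graphics_memory function and the sizes are illustrative assumptions, not any particular BIOS's implementation.

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical illustration: firmware reserves part of system RAM
       for the integrated graphics chip before handing the rest to the OS. */
    static uint64_t carve_out_graphics_memory(uint64_t total_ram_mib,
                                              uint64_t graphics_mib)
    {
        if (graphics_mib > total_ram_mib)
            return 0; /* invalid configuration: nothing left for the OS */
        return total_ram_mib - graphics_mib; /* RAM reported to the OS */
    }

    int main(void)
    {
        /* The example from the text: 512 MiB total, 64 MiB for graphics. */
        uint64_t visible = carve_out_graphics_memory(512, 64);
        printf("OS-visible RAM: %llu MiB\n", (unsigned long long)visible);
        /* Prints: OS-visible RAM: 448 MiB */
        return 0;
    }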
The disadvantage of this design is lower performance: system RAM usually runs slower than dedicated graphics RAM, and the memory bus has to be shared with the rest of the system, creating contention. It can also degrade the performance of the rest of the system if that system was not designed to account for some RAM being 'taken away' by graphics.
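To give a sense of the contention involved, merely refreshing the display consumes a fixed slice of the shared bus bandwidth. The figures below are a back-of-the-envelope sketch with assumed numbers (a PC133-class bus), not measurements from any specific system.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Illustrative display mode: 1024x768, 32 bits per pixel, 60 Hz. */
        uint64_t width = 1024, height = 768, bytes_per_pixel = 4, refresh_hz = 60;

        /* Bandwidth the display controller must read from shared RAM
           every second just to scan out the frame buffer. */
        uint64_t scanout = width * height * bytes_per_pixel * refresh_hz;

        /* Assumed shared-bus bandwidth: PC133 SDRAM, ~1066 MB/s peak. */
        double bus_mb_s = 1066.0;
        double scanout_mb_s = scanout / 1e6;

        printf("Scanout needs %.1f MB/s (%.1f%% of a %.0f MB/s bus)\n",
               scanout_mb_s, 100.0 * scanout_mb_s / bus_mb_s, bus_mb_s);
        /* ~188.7 MB/s, roughly 18% of the bus before the CPU does anything. */
        return 0;
    }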
A similar approach, with similar results, was used to speed up graphics in some SGI computers, most notably the O2/O2+. The memory in these machines is a single fast pool (2.1 GB per second in 1996) shared between the system and graphics. Sharing is performed on demand, with pointer redirection used for communication between the main system and the graphics subsystem. This is called Unified Memory Architecture (UMA).
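The benefit of pointer redirection in a unified memory pool is that buffers can be handed between the CPU and the graphics subsystem without copying. The sketch below is a generic illustration of the idea, not SGI's actual interface; the gfx_buffer structure and gfx_submit function are assumptions for illustration.

    #include <stdlib.h>
    #include <string.h>

    /* In a unified memory architecture, the graphics subsystem can address
       the same physical pool as the CPU, so "sending" a texture is just
       handing over a pointer. (Hypothetical interface for illustration.) */
    struct gfx_buffer {
        void   *pixels;  /* lives in the one shared pool */
        size_t  size;
    };

    static void gfx_submit(const struct gfx_buffer *buf)
    {
        /* A real driver would pass buf->pixels (or its physical address)
           to the graphics hardware; no copy into dedicated VRAM occurs. */
        (void)buf;
    }

    int main(void)
    {
        struct gfx_buffer tex;
        tex.size   = 256 * 256 * 4;         /* a 256x256 RGBA texture */
        tex.pixels = malloc(tex.size);
        if (!tex.pixels)
            return 1;

        memset(tex.pixels, 0xFF, tex.size); /* CPU writes the texture ... */
        gfx_submit(&tex);                   /* ... graphics reads it in place */

        free(tex.pixels);
        return 0;
    }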
Most early personal computers used a shared memory design, with the graphics hardware sharing memory with the CPU. Such designs saved money, as a single bank of DRAM could be used for both display and program. Examples include the Apple II, the Commodore 64, the Radio Shack Color Computer, the Atari ST, and the Apple Macintosh.[citation needed]
A notable exception was the IBM PC: graphics display was provided by an expansion card with its own memory, plugged into an ISA slot.
The first IBM PC model to use a shared memory architecture was the IBM PCjr, released in 1984. Video memory was shared with the first 128 KiB of RAM, and the exact amount used for video could be reconfigured by software to meet the needs of the current program.
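How much of that 128 KiB a program actually needed depended on the video mode, since the required buffer size follows from resolution and colour depth. The sketch below illustrates the calculation for two PCjr-era mode geometries; it is a simplified illustration, not an exhaustive PCjr reference.

    #include <stdio.h>
    #include <stdint.h>

    /* Bytes of shared RAM a frame buffer needs for a given mode.
       bits_per_pixel: 1 (2 colours), 2 (4 colours), 4 (16 colours). */
    static uint32_t video_buffer_bytes(uint32_t w, uint32_t h, uint32_t bpp)
    {
        return (w * h * bpp) / 8;
    }

    int main(void)
    {
        printf("320x200, 16 colours: %u bytes\n",
               video_buffer_bytes(320, 200, 4));  /* 32000 bytes (~32 KiB) */
        printf("640x200, 4 colours:  %u bytes\n",
               video_buffer_bytes(640, 200, 2));  /* 32000 bytes (~32 KiB) */
        return 0;
    }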
An early hybrid system was the Commodore Amiga, which could run as a shared memory system but would preferentially load executable code into non-shared "fast RAM" when it was available, as sketched below.
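On the Amiga this split is visible directly in the system allocator: chip RAM is reachable by the custom chips, while fast RAM is private to the CPU. The sketch below uses the classic AmigaOS exec.library allocation calls; it assumes the standard NDK headers and is an illustration rather than production code.

    #include <exec/memory.h>
    #include <proto/exec.h>

    int main(void)
    {
        /* A display bitplane must live in chip RAM, where the custom
           chips can read it (one bitplane of a 320x256 screen). */
        APTR bitplane = AllocMem(320 * 256 / 8, MEMF_CHIP | MEMF_CLEAR);

        /* Ordinary working memory: MEMF_ANY lets exec prefer fast RAM
           when present and fall back to chip RAM otherwise. */
        APTR workbuf = AllocMem(4096, MEMF_ANY);

        if (bitplane && workbuf) {
            /* ... use the buffers ... */
        }

        if (workbuf)  FreeMem(workbuf, 4096);
        if (bitplane) FreeMem(bitplane, 320 * 256 / 8);
        return 0;
    }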