The G400 is a video card made by Matrox, released in September 1999. The graphics processor contains a 2D GUI, video, and Direct3D 6.0 3D accelerator. Codenamed "Toucan", it was a more powerful and refined version of its predecessor, the G200.
The Matrox G200 graphics processor had been a successful product, competing with the various 2D & 3D combination cards available in 1998. Matrox took the technology developed from the G200 project, refined it, and essentially doubled it up to form the G400 processor. The new chip featured several new and innovative additions, such as multiple monitor output support, an all-around 32-bit rendering pipeline with high performance, further improved 2D and video acceleration, and a new 3D feature known as Environment Mapped Bump Mapping.
Internally the G400 is a 256-bit processor, using what Matrox calls a "DualBus" architecture. This is an evolution of G200's "DualBus", which had been 128-bit. A Matrox "DualBus" chip consists of twin unidirectional internal buses, each moving data into or out of the chip. This increases the efficiency and bandwidth of data flow within the chip to each of its functional units. G400's 3D engine consists of two parallel pixel pipelines with one texture unit each, providing single-pass dual-texturing capability. The Millennium G400 MAX is capable of a 333 megapixel per second fillrate at its 166 MHz core clock speed. It is purely a Direct3D 6.0 accelerator and, as such, lacks support for the hardware transform and lighting acceleration of later Direct3D 7.0 cards.
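The quoted fillrate follows directly from the pipeline count and core clock. A minimal sketch (the function name is mine, not Matrox's terminology):

```python
# Theoretical pixel fillrate: pixel pipelines x core clock.
def fillrate_mpix(pipelines: int, core_mhz: int) -> int:
    """Peak fillrate in megapixels per second."""
    return pipelines * core_mhz

# G400 figures from the article: 2 pipelines at 166 MHz.
# 2 x 166 = 332, i.e. ~333 Mpix/s once the core clock's
# fractional MHz is accounted for.
print(fillrate_mpix(2, 166))
```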
The chip's external memory interface is 128-bit and is designed to use either SDRAM or SGRAM. Matrox released both 16 MiB and 32 MiB versions of the G400 boards, and used both types of RAM. The slowest models are equipped with 166 MHz SDRAM, while the fastest (G400 MAX) uses 200 MHz SGRAM. The G400 MAX had the highest memory bandwidth of any card before the release of the DDR-equipped version of the NVIDIA GeForce 256.
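The bandwidth claim is simple arithmetic on bus width and memory clock; a short sketch (function name is my own):

```python
# Peak memory bandwidth = bus width (in bytes) x memory clock.
def bandwidth_gbs(bus_bits: int, clock_mhz: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes), single data rate."""
    return (bus_bits / 8) * clock_mhz * 1e6 / 1e9

print(bandwidth_gbs(128, 200))  # G400 MAX, 200 MHz SGRAM: 3.2 GB/s
print(bandwidth_gbs(128, 166))  # plain G400, 166 MHz SDRAM: 2.656 GB/s
```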
Perhaps the most notable feature of G400 is its ability to drive two separate monitors to display a single desktop. This feature is known as "DualHead" and was a decisive edge for Matrox over the card's competitors at the time. The DualHead capability not only offered desktop widening but also desktop cloning (two screens showing the same thing) and a special "DVDMAX" mode which outputs video overlays onto the second monitor. Matrox's award-winning PowerDesk display drivers and control panel integrated DualHead in a very flexible and functional way that became renowned for its effectiveness. However, contrary to the video mode's name, G400 does not support full hardware acceleration of DVD decoding. G400 has partial support for the DVD video decoding process, but it does not perform inverse discrete cosine transform (IDCT) or motion compensation in hardware (the two most demanding steps of the process).
The G400 chip supports, in hardware, a texture-based surface detailing method called Environment Mapped Bump Mapping (EMBM). EMBM was originally created by BitBoys Oy and licensed to Matrox. EMBM was not supported by several competitors, such as NVIDIA's GeForce 256 through GeForce 2, which supported only the simpler Dot3 bump mapping; it was, however, available on the ATI Radeon 7200. Due to this lack of industry-wide support, and its toll on the limited graphics hardware of the time, EMBM saw only limited use during G400's time. Only a few games supported the feature, such as Dungeon Keeper 2 and Millennium Soldier: Expendable. EMBM requires either specialized hardware within the chip for its calculations or a more flexible and programmable graphics pipeline, as found in later DirectX 8.0 accelerators such as the GeForce 3 and Radeon 8500.
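The core of DirectX 6-style EMBM is a per-pixel texture-coordinate perturbation: the (du, dv) pair sampled from the bump texture is run through a 2x2 "bump environment matrix" and added to the environment map's coordinates. A minimal sketch of that math (names and sample values are illustrative, not Matrox's implementation):

```python
# EMBM coordinate perturbation: the env-map lookup coords (u, v) are
# offset by a 2x2 matrix applied to the bump texture's (du, dv) deltas.
def embm_offset(u, v, du, dv, m00, m01, m10, m11):
    """Return the perturbed environment-map coordinates."""
    return (u + m00 * du + m01 * dv,
            v + m10 * du + m11 * dv)

# With an identity bump matrix, the deltas pass straight through.
print(embm_offset(0.5, 0.5, 0.1, -0.05, 1.0, 0.0, 0.0, 1.0))  # (0.6, 0.45)
```

Scaling or rotating the matrix per frame lets a game animate the apparent bumpiness without touching the textures themselves.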
G400's rendering pipeline uses what Matrox called "Vibrant Color Quality 2" (VCQ2), a functionality in which all internal 3D calculations are done with 32-bit precision. The goal was to prevent dithering and other artifacts caused by inadequate precision when performing calculations. The result was the best quality 16-bit and 32-bit color modes available at the time.
Matrox was known for their quality analog display output on prior cards and the G400 is no exception. G400 was the benchmark for signal quality for several years, significantly outperforming some competitors (notably pre-GeForce4 NVIDIA cards). Where many cards were crippled by blurry output, especially as the resolution and refresh rate increased, the Matrox cards delivered very sharp and clear images.
G400 is the first Matrox board compatible with AGP 4X. Most (REV. A) G400 boards actually support only 2X mode, but there are later revisions (REV. B) that are fully 4X compliant and run at the higher speed if the motherboard supports it as well.
G400 was known for being particularly dependent on the host system's CPU for high 3D performance. This was attributed both to its architecture and to the poor drivers it relied on for much of its life (especially the OpenGL ICD). On the hardware side, G400's triangle setup engine, ironically named the "Warp Engine", was somewhat slower than its counterparts on competing cards. However, the Warp Engine was programmable, which theoretically enhanced the chip's flexibility. Unfortunately, Matrox never described the functionality of this component in depth, so little is known about it.
As noted earlier, G400 suffered at launch from driver problems. While its Direct3D performance was admirable, its OpenGL installable client driver (ICD) component was very poor. The situation was eerily similar to what had happened with the older G200 and its near-total lack of credible OpenGL support. Matrox made it very clear that they were committed to supporting OpenGL, however, and development rapidly progressed. Like G200, G400 initially launched with an OpenGL-to-Direct3D wrapper driver that translated an application's OpenGL calls into Direct3D, a slow and buggy solution. Eventually a native OpenGL driver called "TurboGL" was released, but it was designed to support only a handful of popular games of the time (e.g. Quake III). TurboGL was an interim solution, a quickly developed precursor to a fully functional OpenGL ICD intended to improve performance as fast as possible. Since it did not support all OpenGL applications, it was essentially a "mini ICD", much like 3dfx had used with their Voodoo boards. TurboGL included support for then-new SIMD instruction sets from Intel and AMD, namely SSE and 3DNow!. In mid-2000 the G400 received a fully compliant OpenGL ICD which offered capable performance in most OpenGL software. The G400 continued to receive official driver updates into 2006.
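The wrapper approach described above can be illustrated in miniature: each OpenGL-style entry point is a thin shim that re-expresses the call in the other API's terms. This is purely an illustrative sketch (all class and method names are mine; the Direct3D side is a stand-in stub, not the real API):

```python
# Illustrative sketch of an OpenGL-to-Direct3D wrapper driver:
# GL-style immediate-mode calls are batched and re-issued as a single
# D3D-style DrawPrimitive call. Not Matrox's actual code.

class FakeD3DDevice:
    """Stand-in for a Direct3D device; records what the wrapper asks for."""
    def __init__(self):
        self.calls = []

    def draw_primitive(self, kind, vertices):
        self.calls.append((kind, len(vertices)))

class GLWrapper:
    """Translates a tiny subset of GL-style calls into D3D-style ones."""
    def __init__(self, device):
        self.device = device
        self._mode = None
        self._verts = []

    def glBegin(self, mode):
        self._mode, self._verts = mode, []

    def glVertex3f(self, x, y, z):
        self._verts.append((x, y, z))

    def glEnd(self):
        # Each glBegin/glEnd batch becomes one draw call on the D3D side;
        # this per-call translation overhead is why wrappers were slow.
        self.device.draw_primitive(self._mode, self._verts)

gl = GLWrapper(FakeD3DDevice())
gl.glBegin("TRIANGLES")
gl.glVertex3f(0, 0, 0); gl.glVertex3f(1, 0, 0); gl.glVertex3f(0, 1, 0)
gl.glEnd()
print(gl.device.calls)  # [('TRIANGLES', 3)]
```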
Even with initial driver difficulties, Matrox G400 was very competitive. 2D and Direct3D performance were more than competitive with the NVIDIA RIVA TNT2, 3dfx Voodoo3, and ATI Rage 128 Pro. In fact, prior to the release of the NVIDIA GeForce 256 that supported Direct3D 7.0 transform and lighting acceleration, the Millennium G400 MAX was a respectable Direct3D card, competitive with Voodoo3 3500 and TNT2 Ultra. 3dfx had an edge in some games with its low-overhead Glide API and NVIDIA was, for a long time, king of OpenGL.
Matrox stopped supporting the Marvel G400-TV early because there was no way to make it fully functional in Windows 2000. The problem was with the Zoran chip used for hardware MJPEG video compression on the Marvel G400 card. Matrox tried to make stable drivers for several months, but with no luck. A Matrox user going by the name Adis hacked the original drivers to make the card work under Windows 2000. [1] [2] [3] The driver was later updated for Windows XP, and then for Windows Server 2003. Video capture was possible, but the drivers are still based on VfW. Hardware MJPEG capture can be unstable, but software compression, using a good video codec, gives much better results anyway. There are no WDM drivers available for this card.
In the fall of 2000, Matrox introduced the G450 chip (codenamed Condor) as a successor to the G400 line. As the G250 was to the G200, the G450 was primarily a die shrink of the G400 core from the 250 nm semiconductor fabrication process to 180 nm. Shrinking the core reduces costs because more chips are made per wafer, and it gave Matrox the opportunity to fix earlier mistakes in the core and to trim or add functionality. Matrox clocked the G450 core at 125 MHz, just like the plain G400. Overclocking tests showed that the core was unable to achieve higher speeds than G400 even though it was manufactured on a newer process. [4]
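The cost argument behind a die shrink is simple geometry: die area scales with the square of the linear feature size, so a 250 nm to 180 nm shrink nearly doubles the number of chips per wafer. A back-of-the-envelope sketch (ignoring yield effects and pad-limited designs, so purely illustrative):

```python
# Die-shrink arithmetic: area scales with the square of feature size,
# so candidate dies per wafer scale with the inverse square.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate factor by which dies-per-wafer increases."""
    return (old_nm / new_nm) ** 2

# G400 (250 nm) -> G450 (180 nm): roughly 1.93x as many dies per wafer.
print(round(density_gain(250, 180), 2))
```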
Perhaps the biggest addition to G450 was that Matrox moved the previously external second RAMDAC, used for the second monitor connector (DualHead), into the G450 chip itself. The RAMDAC speeds were still different, though, with the primary running at an excellent 360 MHz but the secondary at only 230 MHz. This meant that the primary monitor could run at much higher resolutions and refresh rates than the secondary, the same asymmetry as on G400. The G450 also had native support for TMDS signaling, and thus DVI, but this was not a standard-issue connector; boards shipped with dual analog VGA connectors.
G450 was adapted to use a DDR SDRAM memory interface, instead of the older single data rate (SDR) SGRAM and SDRAM used on G400. This allowed Matrox to switch to a 64-bit memory bus and use the DDR memory to equal the previous memory bandwidth by again clocking the RAM at 166 MHz. A 64-bit bus reduces the board's complexity (and cost) because fewer traces have to be routed, and the pin count of the graphics processor can be significantly reduced if the chip is designed only for a 64-bit bus. However, DDR has a higher inherent latency than SDR at the same bandwidth, so performance dropped somewhat. [4]
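The trade described above works out exactly: halving the bus width while doubling the transfers per clock leaves peak bandwidth unchanged. A quick check (function name is mine):

```python
# Peak bandwidth = bus width (bytes) x clock x transfers per clock.
def bandwidth_gbs(bus_bits: int, clock_mhz: int, transfers_per_clock: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * clock_mhz * 1e6 * transfers_per_clock / 1e9

g400 = bandwidth_gbs(128, 166, 1)  # G400: 128-bit SDR at 166 MHz
g450 = bandwidth_gbs(64, 166, 2)   # G450: 64-bit DDR at 166 MHz
print(g400, g450)  # both 2.656 GB/s
```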
The new G450 again had support for AGP 4X, like some later-produced G400 boards. The 3D capabilities of G450 were identical to G400. Unfortunately, because of the identical core clock and due to lower memory bandwidth, G450 was slower than G400 in games. [5]
Marvel G450 eTV not only had a TV tuner, but also was a launchpad for Matrox's new eDualHead dual display enhancement. It added some new features to DualHead that worked with Internet Explorer to make pages show up on both screens at once. [6]
The MGA-G550 processor added a second pixel pipeline, hardware transform and lighting, and the HeadCasting Engine, a hardware implementation of a vertex shader for accelerated matrix palette skinning. It extends the 96 constant registers specified by DirectX 8.0 to a total of 256. Despite this, the feature is inaccessible through the DirectX driver; Matrox supports HeadCasting only through the bundled Matrox Digimask software, which never became popular. [7]
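Matrix palette skinning, the operation the HeadCasting Engine accelerates, transforms each vertex by a weighted blend of bone matrices drawn from a palette held in those constant registers. A minimal sketch of the math (pure-Python 3x3 matrices for brevity; real hardware uses 4x4 matrices and homogeneous coordinates, and all names here are mine):

```python
# Matrix palette skinning: each vertex is transformed by every bone
# matrix that influences it, and the results are blended by weight.

def mat_vec(m, v):
    """Multiply a 3x3 matrix (tuple of row tuples) by a 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def skin_vertex(v, palette, influences):
    """influences: list of (bone_index, weight); weights should sum to 1."""
    out = [0.0, 0.0, 0.0]
    for bone, weight in influences:
        t = mat_vec(palette[bone], v)
        for i in range(3):
            out[i] += weight * t[i]
    return tuple(out)

identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
double   = ((2, 0, 0), (0, 2, 0), (0, 0, 2))
# A 50/50 blend of identity and a uniform 2x scale acts as a 1.5x scale.
print(skin_vertex((1.0, 2.0, 3.0), [identity, double], [(0, 0.5), (1, 0.5)]))
```

A larger constant-register file matters here because the whole bone palette must fit in registers for a single draw call; 256 registers hold more bones than the 96 DirectX 8.0 requires.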
On July 13, 2005, Matrox Graphics Inc. announced the availability of the Millennium G550 PCIe, the world's first PCI Express x1 graphics card. [8] The card uses a Texas Instruments XIO2000 bridge controller to achieve PCI Express support. [9]
Findings within a release of Matrox graphics drivers (MGA64.sys v4.77.027) mentioned a never-released Matrox Millennium G800. [10] [11] The MGA-G800, codenamed Condor 2, would have been clocked at 200 MHz core with 200 MHz DDR memory (6.4 GB/s bandwidth). The chip had 3 pixel pipelines with 3 texture units each. It was also equipped with a hardware transform and lighting unit capable of processing 20–30 million triangles per second. Further speculation included a memory controller that could support DDR SDRAM and DDR FC-RAM, DirectX 8.0 compliance, and a faster version running at 250 MHz. These specifications are somewhat reminiscent of the Matrox Parhelia, a four-pipeline DirectX 8 GPU with four texture units per pipeline.
| Board Name | Core Type | Process | Core (MHz) | Memory (MHz) | Pipe Config | T&L | Memory Interface | Notes |
|---|---|---|---|---|---|---|---|---|
| Millennium G400 | Toucan | 250 nm | 125 | 166 | 2x1 | N | 128-bit | 32 MiB SGRAM or 16 MiB SGRAM/SDRAM |
| Millennium G400 MAX | Toucan | 250 nm | 150 | 200 | 2x1 | N | 128-bit | 32 MiB SGRAM. Needs fan. Highest memory bandwidth (3.2 GB/s) until GeForce 256 DDR. |
| Marvel G400-TV | Toucan | 250 nm | 125 | 166 | 2x1 | N | 128-bit | 16 MiB SGRAM. Video capture & TV tuner. |
| Millennium G450 | Condor | 180 nm | 125 | 166 | 2x1 | N | 64-bit | DDR SDRAM. Integrated 2nd RAMDAC into core. TMDS/DVI option. |
| Marvel G450 eTV | Condor | 180 nm | | | 2x1 | N | 64-bit | TV tuner. eDualHead. |
| Millennium G550 | Condor | 180 nm | 125 | 166 | 2x2x1 | Y | 64-bit | 32 MiB DDR SDRAM |