GeForce 6 series

GeForce 6 series
Release date: April 14, 2004
Codename: NV4x
Architecture: Curie
Models:
  • GeForce nForce series
  • GeForce SE series
  • GeForce LE series
  • GeForce XT series
  • GeForce GTO series
  • GeForce GS series
  • GeForce GT series
  • GeForce Ultra series
  • GeForce Ultra Extreme series
Cards:
  • Entry-level: 6100, 6150, 6200, 6500
  • Mid-range: 6600, 6700
  • High-end: 6800
  • Enthusiast: 6800 Ultra / Ultra Extreme
API support:
  • DirectX: Direct3D 9.0c, Shader Model 3.0
  • OpenGL: OpenGL 2.1
History:
  • Predecessor: GeForce 5 series
  • Successor: GeForce 7 series
Support status: Unsupported

The GeForce 6 series (codename NV40) is the sixth generation of Nvidia's GeForce line of graphics processing units. Launched on April 14, 2004, the GeForce 6 family introduced PureVideo post-processing for video, SLI technology, and Shader Model 3.0 support (compliant with the Microsoft DirectX 9.0c specification and OpenGL 2.0).

GeForce 6 series features

GeForce 6600 GT AGP

SLI

The Scalable Link Interface (SLI) allows two GeForce 6 cards of the same type to be connected in tandem, with the driver software balancing the workload between them. SLI capability is limited to select members of the GeForce 6 family (the 6500 and above) and is only available for cards using the PCI Express bus.

Nvidia PureVideo Technology

Nvidia PureVideo technology is the combination of a dedicated video processing core and software which decodes H.264, VC-1, WMV, and MPEG-2 videos with reduced CPU utilization. [1]

Shader Model 3.0

Nvidia was the first to deliver Shader Model 3.0 (SM3) capability in its GPUs. SM3 extends SM2 in a number of ways: standard FP32 (32-bit floating-point) precision, dynamic branching, increased efficiency and longer shader lengths are the main additions. [2] Shader Model 3.0 was quickly adopted by game developers because it was quite simple to convert existing shaders coded with SM 2.0/2.0A/2.0B to version 3.0, and it offered noticeable performance improvements across the entire GeForce 6 line.

Caveats

PureVideo functionality varies by model, with some models lacking WMV9 and/or H.264 acceleration. [3]

In addition, motherboards with certain VIA and SiS chipsets paired with an AMD Athlon XP processor have known compatibility problems with the GeForce 6600 and 6800 GPUs, including freezing, artifacts, reboots, and other issues that make gaming and use of 3D applications almost impossible. These problems appear only in Direct3D-based applications and do not affect OpenGL.[ citation needed ]

GeForce 6 series comparison

Nvidia NV40 GPU
Nvidia NV45 GPU

Here is how the released members of the GeForce 6 family compare to Nvidia's previous flagship GPU, the GeForce FX 5950 Ultra, and to the comparable parts from ATI's then newly released Radeon X800 and X850 series:

Model | GeForce FX 5950 Ultra | GeForce 6200 TC-32 | GeForce 6600 GT | GeForce 6800 Ultra | ATI Radeon X800 XT PE | ATI Radeon X850 XT PE
Transistor count | 135 million | 77 million | 146 million | 222 million | 160 million | 160 million
Manufacturing process | 0.13 μm | 0.11 μm | 0.11 μm | 0.13 μm | 0.13 μm low-k | 0.13 μm low-k
Die area (mm²) | ~200 | 110 | 156 | 288 | 288 | 297
Core clock speed (MHz) | 475 | 350 | 500 | 400 | 520 | 540
Pixel shader processors | 4 | 4 | 8 | 16 | 16 | 16
Pixel pipes | 4 | 4 | 8 | 16 | 16 | 16
Texturing units | 8 (16*) | 4 | 8 | 16 | 16 | 16
Vertex pipelines | 3* | 3 | 3 | 6 | 6 | 6
Peak pixel fill rate (theoretical) | 1.9 Gpixel/s | 700 Mpixel/s | 2.0 Gpixel/s | 6.4 Gpixel/s | 8.32 Gpixel/s | 8.64 Gpixel/s
Peak texture fill rate (theoretical) | 3.8 Gtexel/s | 1.4 Gtexel/s | 4.0 Gtexel/s | 6.4 Gtexel/s | 8.32 Gtexel/s | 8.64 Gtexel/s
Memory interface | 256-bit | 64-bit | 128-bit | 256-bit | 256-bit | 256-bit
Memory clock speed | 950 MHz DDR | 700 MHz DDR2 | 1.0 GHz / 950 MHz** GDDR3 | 1.1 GHz GDDR3 | 1.12 GHz GDDR3 | 1.18 GHz GDDR3
Peak memory bandwidth (GiB/s) | 30.4 | 5.6 | 16.0 / 14.4** | 35.2 | 35.84 | 37.76

(*) GeForce FX series has an Array-based Vertex Shader.

(**) AGP 6600 GT variant.
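
The theoretical figures in the table follow from simple arithmetic on clock speeds, unit counts, and bus widths. A minimal sketch (the helper names are illustrative, and bandwidth is computed in decimal gigabytes per second, which is how these marketing figures were derived):

```python
# Peak pixel fill rate: one pixel per pixel pipe per core clock.
def pixel_fill_rate_gpix(core_mhz, pixel_pipes):
    return core_mhz * 1e6 * pixel_pipes / 1e9

# Peak memory bandwidth: effective transfer rate times bus width in bytes.
def memory_bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    return effective_mem_mhz * 1e6 * (bus_width_bits // 8) / 1e9

# GeForce 6800 Ultra: 400 MHz core, 16 pipes, 1.1 GHz effective GDDR3, 256-bit bus.
print(pixel_fill_rate_gpix(400, 16))    # 6.4 (Gpixel/s)
print(memory_bandwidth_gbs(1100, 256))  # 35.2 (GB/s)
```

The same two formulas reproduce the FX 5950 Ultra's 1.9 Gpixel/s (475 MHz × 4 pipes) and the other columns above.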

GeForce 6800 series

GeForce 6800 Ultra PCI-e 512 MiB GDDR3

The first family in the GeForce 6 product line, the 6800 series catered to the high-performance gaming market. As the very first GeForce 6 model, the 16-pixel-pipeline GeForce 6800 Ultra (NV40) was 2 to 2.5 times faster than Nvidia's previous top-of-the-line product (the GeForce FX 5950 Ultra), packed four times the number of pixel pipelines and twice the number of texture units, and added a much improved pixel shader architecture. Yet the 6800 Ultra was fabricated on the same IBM 130 nanometer process node as the FX 5950, and it consumed slightly less power.

Like all of Nvidia's GPUs up until 2004, initial 6800 members were designed for the AGP bus. Nvidia added support for the PCI Express (PCIe) bus in later GeForce 6 products, usually by use of an AGP-PCIe bridge chip. In the case of the 6800 GT and 6800 Ultra, Nvidia developed a variant of the NV40 chip called the NV45. The NV45 shares the same die core as the NV40, but embeds an AGP-PCIe bridge on the chip's package. (Internally, the NV45 is an AGP NV40 with added bus-translation logic, to permit interfacing with a PCIe motherboard. Externally, the NV45 is a single package with two separate silicon dies clearly visible on the top.) NV48 is a version of the NV45 that supports 512 MiB of RAM.

The use of an AGP-PCIe bridge chip initially led to fears that natively-AGP GPUs would not be able to take advantage of the additional bandwidth offered by PCIe and would therefore be at a disadvantage relative to native PCIe chips.[ citation needed ] However, benchmarking reveals that even AGP 4× is fast enough that most contemporary games do not improve significantly in performance when switched to AGP 8×, rendering the further bandwidth increase provided by PCIe largely superfluous.[ citation needed ] Additionally, Nvidia's on-board implementations of AGP are clocked at AGP 12× or 16×, providing bandwidth comparable to PCIe for the rare situations when this bandwidth is actually necessary.[ citation needed ]
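
The nominal peak rates behind this argument are easy to lay out. These are theoretical bus maxima (per direction for PCI Express 1.0), not measured throughput:

```python
AGP_1X_MBS = 266        # AGP base: 66 MHz x 32-bit bus ≈ 266 MB/s
PCIE1_LANE_MBS = 250    # PCI Express 1.0: 250 MB/s per lane, per direction

agp_4x   = AGP_1X_MBS * 4         # already adequate for most games of the era
agp_8x   = AGP_1X_MBS * 8
agp_16x  = AGP_1X_MBS * 16        # Nvidia's on-package AGP link speed
pcie_x16 = PCIE1_LANE_MBS * 16    # per direction

print(agp_4x, agp_8x, agp_16x, pcie_x16)  # 1064 2128 4256 4000
```

An internal AGP 16× link thus matches (and nominally exceeds) a PCIe ×16 slot's per-direction bandwidth, which is why the bridged designs were not starved in practice.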

The use of a bridge chip allowed Nvidia to release a full complement of PCIe graphics cards without having to redesign them for the PCIe interface. Later, when Nvidia's GPUs were designed to use PCIe natively, the bidirectional bridge chip allowed them to be used in AGP cards. ATI, initially a critic of the bridge chip, eventually designed a similar solution (known as Rialto [4] ) for their own cards.[ citation needed ]

Nvidia's professional Quadro line contains members drawn from the 6800 series: Quadro FX 4000 (AGP) and the Quadro FX 3400, 4400 and 4400g (both PCI Express). The 6800 series was also incorporated into laptops with the GeForce Go 6800 and Go 6800 Ultra GPUs.

PureVideo and the AGP GeForce 6800

PureVideo expanded multimedia-video support from decoding of MPEG-2 video to decoding of more advanced codecs (MPEG-4, WMV9), enhanced post-processing (advanced de-interlacing), and limited acceleration for encoding. Ironically, the first GeForce products to offer PureVideo, the AGP GeForce 6800/GT/Ultra, failed to support all of PureVideo's advertised features.

Media player software (WMP9) with support for WMV acceleration did not become available until several months after the 6800's introduction. User and web reports showed little if any difference between PureVideo-enabled GeForce cards and non-PureVideo cards. Nvidia's prolonged public silence after promising updated drivers, together with test benchmarks gathered by users, led the user community to conclude that the WMV9 decoder component of the AGP 6800's PureVideo unit was either non-functional or intentionally disabled.[ citation needed ]

In late 2005, an update to Nvidia's website finally confirmed what had long been suspected by the user community: WMV-acceleration is not available on the AGP 6800.

Today's computers are fast enough to decode WMV9 video and other sophisticated codecs such as MPEG-4, H.264 or VP8 without hardware acceleration, negating the need for something like PureVideo.

GeForce 6 series general features

GeForce 6800 Personal Cinema

6800 chipset table

Board Name | Core Type | Core (MHz) | Memory (MHz) | Pipeline Config | Vertex Processors | Memory Interface
6800 Ultra | NV40/NV45/NV48 | 400 | 550 | 16 | 6 | 256-bit
6800 GT | NV40/NV45/NV48 | 350 | 500 | 16 | 6 | 256-bit
6800 XT | NV42 | 450 | 600 | 12 | 5 | 256-bit
6800 GTS | NV42 | 425 | 500 | 12 | 5 | 256-bit
6800 GS | NV40 | 350 | 500 | 12 | 5 | 256-bit
6800 GTO | NV40/NV45 | 350 | 450 | 12 | 5 | 256-bit
6800 | NV40/NV41/NV42 | 325 | 300/350 | 12 | 5 | 256-bit
6800 Go | NV41M | 300 | 300 | 12 | 5 | 256-bit
6800 Go Ultra | NV41M (0.13 μ)/NV42M (0.11 μ) | 450 | 550 | 12 | 5 | 256-bit
6800 XE | NV40 | 275/300/325 | 266/350 | 8 | 3 | 128-bit
6800 LE | NV40 | 300 | 350 | 8 | 4 | 256-bit

Notes

GeForce 6600 series

A Gigabyte 6600 GT PCI Express card
GeForce 6600 GT Personal Cinema

The GeForce 6600 (NV43) was officially launched on August 12, 2004, several months after the launch of the 6800 Ultra. With half the pixel pipelines and vertex shaders of the 6800 GT, and a smaller 128-bit memory bus, the lower-performance and lower-cost 6600 is the mainstream product of the GeForce 6 series. The 6600 series retains the core rendering features of the 6800 series, including SLI. Equipped with fewer rendering units, the 6600 series processes pixel data at a slower rate than the more powerful 6800 series. However, the reduction in hardware resources, and migration to TSMC's 110 nm manufacturing process (versus the 6800's 130 nm process), make the 6600 both less expensive for Nvidia to manufacture and less expensive for customers to purchase.

The 6600 series has three variants: the GeForce 6600 LE, the 6600, and the 6600 GT (in order from slowest to fastest). The 6600 GT performs quite a bit better than the GeForce FX 5950 Ultra or Radeon 9800 XT, scoring around 8,000 in 3DMark03 versus around 6,000 for the GeForce FX 5950 Ultra, while also being much cheaper. Notably, with drivers prior to December 2004, the 6600 GT offered performance identical to ATI's high-end X800 Pro graphics card when running the popular game Doom 3. It was also about as fast as the higher-end GeForce 6800 when running games without anti-aliasing in most scenarios.

At introduction, the 6600 family was only available in PCI Express form. AGP models became available roughly a month later, through the use of Nvidia's AGP-PCIe bridge chip. A majority of the AGP GeForce 6600GTs have their memory clocked at 900 MHz, which is 100 MHz slower than the PCI-e cards, on which the memory operates at 1000 MHz. This can contribute to a performance decline when playing certain games. However, it was often possible to "overclock" the memory to its nominal frequency of 1000 MHz and there are AGP cards (for example from XFX) that use 1000 MHz by default.
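
That 100 MHz (effective) clock deficit translates to about 10% less peak memory bandwidth on the 6600 GT's 128-bit bus. A quick check, using an illustrative helper and decimal GB/s:

```python
def bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    # Peak bandwidth: effective transfer rate times bus width in bytes.
    return effective_mem_mhz * 1e6 * (bus_width_bits // 8) / 1e9

agp  = bandwidth_gbs(900, 128)   # stock AGP 6600 GT
pcie = bandwidth_gbs(1000, 128)  # PCIe 6600 GT
print(agp, pcie)                 # 14.4 16.0
```

These are the 14.4 and 16.0 GB/s figures quoted for the two 6600 GT variants in the comparison table above.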

6600 chipset table

Board Name | Core Type | Core (MHz) | Memory (MHz) | Pipeline Config | Vertex Processors | Memory Interface
6700 XL | NV43 | 525 | 1100 | 8 | 3 | 128-bit
6600 GT GDDR3 | NV43 | 500 | 900/1000 | 8 | 3 | 128-bit
6600 XL | NV43 | 400 | 800 | 8 | 3 | 128-bit
6600 DDR2 | NV43 | 350 | 800 | 8 | 3 | 128-bit
6600 | NV43 | 300 | 500/550 | 8 | 3 | 128-bit
6600 LE | NV43 | 300 | 500 | 4 | 3 | 128-bit

Nvidia NV43 GPU


GeForce 6500

The GeForce 6500 was released in October 2005 and is based on the same NV44 core as the value/budget (low-end or entry level) GeForce 6200TC, but with a higher GPU clock speed and more memory. The GeForce 6500 also supports SLI.

GeForce 6200

GeForce 6200 in a low-profile AGP form factor

With just four pixel pipelines, the 6200 series forms Nvidia's value/budget (low-end or entry-level) product. The 6200 omits memory compression and SLI support, but otherwise offers rendering features similar to the 6600's. Later 6200 boards were based on the NV44 core, the final production silicon for the 6200 series. The 6200 is the only card in the series to feature keying for 3.3V AGP slots (barring some rare exceptions of higher-end cards from vendors like PNY).

However, at introduction production silicon was not yet ready, and Nvidia fulfilled 6200 orders by shipping binned/rejected 6600-series cores (NV43V), factory-modified to disable four pixel pipelines and thereby convert the native 6600 product into a 6200. Some owners of boards with an NV43 A2 or earlier revision of the core were able to "unlock" early 6200 boards through a software utility, effectively converting the 6200 back into a 6600 with the complete set of eight pixel pipelines; boards with a core revision of A4 or higher could not be unlocked. As soon as NV44 production silicon became available, Nvidia discontinued shipments of downgraded NV43V cores.

GeForce 6200 chip specifications

GeForce 6200

  • Core Clock: 300 MHz
  • Memory Clock: 550 MHz
  • Pixel Pipelines: 4
  • Vertex Processors: 3
  • Memory: 128/256/512 MiB [8] DDR on a 64-bit/128-bit interface

GeForce 6200 TurboCache / AGP

The GeForce 6200 TurboCache / AGP (NV44/NV44a) is a natively four-pipeline version of the NV43. GeForce 6200 TurboCache cards only have a very small (by modern standards) amount of memory, but attempt to make up for this by using system memory accessed through the PCI-Express bus.

GeForce 6200 TurboCache / AGP chip specifications

GeForce 6200 PCI-Express (NV44) TurboCache

  • Core Clock: 350 MHz
  • Memory Clock: 700 MHz
  • Pixel Pipelines: 4
  • Number of ROPs: 2
  • Vertex Processors: 3
  • Memory: 16/32/64/128 MiB DDR on a 32-bit/64-bit/128-bit interface
  • GeForce 6200 w/ TurboCache supporting 128 MiB, including 16 MiB of local TurboCache (32-bit)
  • GeForce 6200 w/ TurboCache supporting 128 MiB, including 32 MiB of local TurboCache (64-bit)
  • GeForce 6200 w/ TurboCache supporting 256 MiB, including 64 MiB of local TurboCache (64-bit)
  • GeForce 6200 w/ TurboCache supporting 256 MiB, including 128 MiB of local TurboCache (128-bit)
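
The advertised "supporting N MiB" figures above are the sum of the card's local TurboCache memory and system RAM borrowed over the PCI Express bus. A small sketch of that accounting, with the configuration list taken from the bullets above:

```python
# (local TurboCache MiB, advertised total MiB) for the four 6200 TC variants.
CONFIGS = [(16, 128), (32, 128), (64, 256), (128, 256)]

for local, total in CONFIGS:
    borrowed = total - local  # system RAM accessed across the PCIe bus
    print(f"{local:>3} MiB local + {borrowed:>3} MiB system = {total} MiB advertised")
```

The 16 MiB variant therefore borrows 112 MiB of its advertised 128 MiB from main memory, which is why TurboCache performance depends heavily on PCIe and system-RAM bandwidth.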

GeForce 6200 AGP (NV44a) without TurboCache

  • Core Clock: 350 MHz
  • Memory Clock: 500 MHz
  • Pixel Pipelines: 4
  • Number of ROPs: 2
  • Vertex Processors: 3
  • Memory: 128/256/512 MiB DDR or DDR2 on a 64-bit interface

GeForce 6200 AGP (NV44a2) without TurboCache

  • Core Clock: 350 MHz
  • Memory Clock: 540 MHz
  • Pixel Pipelines: 4
  • Number of ROPs: 2
  • Vertex Processors: 3
  • Memory: 128/256 MiB DDR2 with a 128-bit interface
  • Cooling: Passive heatsink

GeForce 6200 PCI (NV44) without TurboCache

BFG Technologies originally introduced a unique PCI variant of the GeForce 6200 via its namesake BFG and 3D Fuzion product lines. Subsequently, PNY (GeForce 6200 256 MiB PCI), SPARKLE Computer (GeForce 6200 128 MiB PCI and GeForce 6200 256 MiB PCI), and eVGA (e-GeForce 6200 256 MiB PCI and e-GeForce 6200 512 MiB PCI) released their own PCI versions of the GeForce 6200 featuring higher memory clocks and correspondingly higher memory bandwidth.

Until the release of the ATI X1300 PCI, these were the only PCI DirectX 9 capable cards not based on previous generation GeForce FX technology or discontinued XGI Technology Volari V3XT chipsets.

Excluding SPARKLE's GeForce 8400 and 8500 series, Zotac GT 610 cards, and Club 3D HD 5450 cards of late 2012, the enhanced 512 MiB GeForce 6200 PCI variants [9] remain among the most powerful PCI-based cards available, making them desirable for users lacking the option of upgrading to an AGP or PCI Express discrete video card.

  • Core Clock: 350 MHz
  • Memory Clock: 400 MHz (BFG Technologies 6200 OC 410 MHz, PNY and EVGA 533 MHz)
  • Pixel Pipelines: 4
  • Memory: 512 MiB (EVGA e-GeForce 6200 512 MiB PCI) / 256 MiB (BFG Technologies 6200 OC PCI and EVGA e-GeForce 6200 PCI) / 128 MiB (BFG Technologies 3DFuzion GeForce 6200 PCI) DDR on a 64-bit interface

GeForce 6100 and 6150 series

In late 2005 Nvidia introduced a new member to the GeForce family, the 6100 series, also known as C51. The term GeForce 6100/6150 actually refers to an nForce4-based motherboard with an integrated NV44 core, as opposed to a standalone graphics card. Nvidia released this product both to follow up its immensely popular GeForce4 MX based nForce and nForce2 boards and to compete with ATI's RS480/482 and Intel's GMA 900/950 in the integrated graphics space. The 6100 series is very competitive, usually tying with or just edging out the ATI products in most benchmarks.

The motherboards use two different southbridges: the nForce 410 and the nForce 430. They are fairly similar in features to the nForce4 Ultra motherboards that preceded them. Both feature PCI Express and PCI support, eight USB 2.0 ports, integrated sound, two Parallel ATA ports, and Serial ATA 3.0 Gbit/s with Native Command Queuing (NCQ) – two SATA ports on the 410, four on the 430. The 430 southbridge also supports Gigabit Ethernet with Nvidia's ActiveArmor hardware firewall, while the 410 supports standard 10/100 Ethernet only.

GeForce 6100 and 6150 series chip specifications

Both the 6100 and 6150 support Shader Model 3.0 and DirectX 9.0c. The 6150 also features support for high-definition video decoding of H.264/VC-1/MPEG-2, PureVideo processing, DVI, and video-out, whereas the 6100 supports only SD decoding of MPEG2/WMV9. [10] The maximum supported resolution is 1920 × 1440 pixels (at 75 Hz) for RGB display and 1600 × 1200 pixels (at 65 Hz) for DVI-D display.

GeForce 61XX abnormally high failure rate in notebook computers

In 2008, Nvidia took a $150 to $250 million charge against revenue because these GPUs were failing at "higher than normal rates." [11] HP extended its warranty by up to 24 months for notebooks affected by the issue. A class action suit was filed against HP and Nvidia by Whatley Drake & Kallas LLC.[ citation needed ]

GeForce 6100

  • Manufacturing process: 90 nm
  • Core Clock: 425 MHz
  • Vertex Processors: 1
  • Pixel Pipelines: 2
  • Shader Model: 3
  • DirectX support: v9.0c
  • Video playback acceleration: SD video acceleration of MPEG2/WMV9 (HD video acceleration not supported)
  • Outputs: VGA only
  • Memory: Shared DDR/DDR2 (socket 754/939/AM2) system memory (selectable through BIOS - usually 16/32/64/128/256 MiB)

GeForce 6150

  • Manufacturing process: 90 nm
  • Core clock: 475 MHz [12]
  • Vertex processors: 1
  • Pixel pipelines: 2
  • Shader model: 3
  • DirectX support: v9.0c
  • Video playback acceleration: HD video acceleration of H.264/VC1/MPEG2
  • Outputs: VGA, DVI, RCA (Video)
  • Memory: Shared DDR2 (socket 939/AM2) system memory (selectable through BIOS - usually 16/32/64/128/256 MiB)
  • HT Bus (Bandwidth) = 2000 MT/s max

GeForce 6150LE

The GeForce 6150LE was primarily featured in the 2006 lineup of the Nvidia Business Platform. [13] The chip is used by Fujitsu-Siemens in its Esprimo green desktop, HP in its Pavilion Media Center a1547c Desktop PC and Compaq Presario SR1915 Desktop, and Dell in its Dimension C521 and E521 desktop PCs.

GeForce 6150SE

GeForce 6150SE (MCP61, also known as C61) is an updated, single-chip version of the Nvidia GeForce 6100. The MCP61 uses less power than the original C51 2-chip version of 6100. Its onboard video outperforms the 6150 in many 3D benchmarks despite its lower core frequency (425 MHz), because of added hardware Z-culling.

MCP61 introduced a bug in the SATA NCQ implementation. As a result, Nvidia employees have contributed code to disable NCQ operations under Linux. [14]

  • Manufacturing process: 90 nm
  • Core Clock: 425 MHz
  • HT Bus = 2000 MT/s max
  • Vertex Processors: 1
  • Pixel Pipelines: 2
  • Shader Model: 3
  • DirectX support: v9.0c
  • Outputs: VGA only

IntelliSample 4.0 and the GeForce 6 GPUs

Upon launch of the GeForce 7 family of graphics processing units, IntelliSample 4.0 was considered to be an exclusive feature of the GeForce 7 series of GPUs. However, version 91.47 (and subsequent versions) of the Nvidia ForceWare drivers enable the features of IntelliSample 4.0 on the GeForce 6 GPUs. IntelliSample 4.0 introduces two new antialiasing modes, known as Transparency Supersampling Antialiasing and Transparency Multisampling Antialiasing. These new antialiasing modes enhance the image quality of thin-lined objects such as fences, trees, vegetation and grass in various games.

One possible reason for enabling IntelliSample 4.0 on GeForce 6 GPUs is that the GeForce 7100 GS is based on the same NV44 chip as the GeForce 6200 models. Because of this, Nvidia had to backport IntelliSample 4.0 features to the NV4x GPUs, and as a result the entire GeForce 6 family is able to enjoy the benefits of Transparency Antialiasing.

It was already well known across various communities that Transparency Antialiasing could be used on GeForce 6 GPUs by using some third party tweak tools. As of Nvidia ForceWare drivers 175.16, GeForce 6 IntelliSample 4.0 support has been removed.

GeForce 6 model information

The rows below give, for each model: launch date, code name, fab (nm), [15] transistors (million), die size (mm²), core clock (MHz), memory clock (MHz), core config, [lower-alpha 1] fillrate (MOperations/s, MPixels/s, MTexels/s, MVertices/s), memory (size in MB, bandwidth in GB/s, bus type, bus width in bits), performance (GFLOPS), and TDP (watts).
GeForce 6100 + nForce 410October 20, 2005MCP51 TSMC 90 nm HyperTransport 425100–200 (DDR)
200–533 (DDR2)
2:1:2:1850425850106.25Up to 256 system RAM1.6–6.4 (DDR)
3.2–17.056 (DDR2)
DDR
DDR2
64
128
 ? ?
GeForce 6150 SE + nForce 430June 2006MCP61200
400[ citation needed ]
3.2
16.0[ citation needed ]
DDR2 ? ?
GeForce 6150 LE + nForce 430MCP61100–200 (DDR)
200–533 (DDR2)
1.6–6.4 (DDR)
3.2–17.056 (DDR2)
DDR
DDR2
 ? ?
GeForce 6150 + nForce 430October 20, 2005MCP51475950475950118.751.6–6.4 (DDR)
3.2–17.056 (DDR2)
 ? ?
GeForce 6200 LEApril 4, 2005NV44TSMC 110 nm75
110 [16]
AGP 8x
PCIe x16
35026670070070087.5128
256
4.256DDR64 ? ?
GeForce 6200AApril 4, 2005NV44A75
110 [17]
AGP 8x

PCI

300

350 [18]

250 (DDR)
250-333 (DDR2) [18]
4:3:4:21,400 [18] 700 [18] 1400 [18] 175
225 [19]
128
256 [18]
512 [18]
4
4-5.34 (DDR2) [20]
DDR
DDR2 [19]
64 [19]  ? ?
GeForce 6200October 12, 2004 (PCIe)
January 17, 2005 (AGP)
NV43146
154 [21]
AGP 8x
PCI
PCIe x16
3002754:3:4:41,2001,2001,200225128
256
8.8DDR1281.220
GeForce 6200 TurboCacheDecember 15, 2004NV4475
110 [16]
PCIe x16350200
275
350
4:3:4:21,4007001,400262.5128–256 System RAM incl.16/32–64/128 onboard3.2
4.4
5.6
DDR641.425
GeForce 6500October 1, 20054003331,6008001,600300128
256
5.328 ? ?
GeForce 6600 LE2005NV43146
154 [21]
AGP 8x
PCIe x16
3002004:3:4:41,2001,2001,2002256.41281.3 ?
GeForce 6600August 12, 2004275
400
8:3:8:42,4002,4008.8
12.8
DDR
DDR2
2.426
GeForce 6600 GTAugust 12, 2004 (PCIe)
November 14, 2004 (AGP)
500475 (AGP)
500 (PCIe)
4,0002,0004,00037515.2 (AGP) [22]
16 (PCIe)
GDDR34.047
GeForce 6800 LEJuly 22, 2004 (AGP)
January 16, 2005 (PCIe)
NV40 (AGP)
NV41, NV42 (PCIe)
IBM 130 nm222
287 (NV40) [23]
222
225 (NV41) [24]
198
222 (NV42) [25]
320 (AGP)
325 (PCIe)
3508:4:8:82,560 (AGP)
2,600 (PCIe)
2,560 (AGP)
2,600 (PCIe)
2,560 (AGP)
2,600 (PCIe)
320 (AGP)
325 (PCIe)
12822.4DDR2562.6 ?
GeForce 6800 XTSeptember 30, 2005300 (64 Bit)
325
266 (64 Bit)
350
500 (GDDR3)
2,400
2,600
2,400
2,600
2,400
2,600
300
325
2564.256
11.2
22.4
32 (GDDR3)
DDR
DDR2
GDDR3
64 [26]
128 [27]
256
2.636
GeForce 6800April 14, 2004 (AGP)
November 8, 2004 (PCIe)
32535012:5:12:123,9003,9003,900406.25128
256
22.4DDR2563.940
GeForce 6800 GTOApril 14, 2004NV45222
287 (NV45) [28]
PCIe x164504,2004,2004,200437.525628.8GDDR3 ? ?
GeForce 6800 GSDecember 8, 2005 (AGP)
November 7, 2005 (PCIe)
NV40 (AGP)
NV42 (PCIe)
TSMC 110 nm222
287 (NV40) [23]
198
222 (NV42) [25]
AGP 8x
PCIe x16
350 (AGP)
425 (PCIe)
5005,1005,1005,100531.25128
256
324.2
5.1
59
GeForce 6800 GTMay 4, 2004 (AGP)
June 28, 2004 (PCIe)
NV40 (AGP)
NV45 (PCIe)
IBM 130 nm222
287 (NV40) [23]
222
287 (NV45) [28]
AGP 8x
PCIe x16
35016:6:16:165,6005,6005,6005255.667
GeForce 6800 UltraMay 4, 2004 (AGP)
June 28, 2004 (PCIe)
March 14, 2005 (512 MB)
400525 (512 MB)
550 (256 MB)
6,4006,4006,400600256
512
33.6 (512 MB)
35.2 (256 MB)
6.4105
GeForce 6800 Ultra Extreme EditionMay 4, 2004NV40222
287 (NV40) [23]
AGP 8x4506007,2007,2007,20067525635.2 ? ?

Features

Model | OpenEXR HDR | Scalable Link Interface (SLI) | TurboCache | PureVideo WMV9 Decoding
GeForce 6100 | No | No | No | Limited
GeForce 6150 SE | No | No | Driver-Side Only | Limited
GeForce 6150 | No | No | No | Yes
GeForce 6150 LE | No | No | Driver-Side Only | Yes
GeForce 6200 | No | No | Yes (PCIe only) | Yes
GeForce 6500 | No | Yes | Yes | Yes
GeForce 6600 LE | Yes | Yes (No SLI Connector) | No | Yes
GeForce 6600 | Yes | Yes (SLI Connector or PCIe Interface) | No | Yes
GeForce 6600 DDR2 | Yes | Yes (SLI Connector or PCIe Interface) | No | Yes
GeForce 6600 GT | Yes | Yes | No | Yes
GeForce 6800 LE | Yes | No | No | No
GeForce 6800 XT | Yes | Yes (PCIe only) | No | Yes (NV42 only)
GeForce 6800 | Yes | Yes (PCIe only) | No | Yes (NV41, NV42 only)
GeForce 6800 GTO | Yes | Yes | No | No
GeForce 6800 GS | Yes | Yes (PCIe only) | No | Yes (NV42 only)
GeForce 6800 GT | Yes | Yes (PCIe only) | No | No
GeForce 6800 Ultra | Yes | Yes (PCIe only) | No | No

GeForce Go 6 (Go 6xxx) series

The rows below give, for each model: launch date, fab (nm), core clock (MHz), memory clock (MHz), core config, fillrate (MOperations/s, MPixels/s, MTexels/s, MVertices/s), and memory (size in MB, bandwidth in GB/s, bus type, bus width in bits).
GeForce Go 6100 + nForce Go 430UnknownC51M110HyperTransport425System memory2:1:2:1850425850106.25Up to 128 MB systemSystem memoryDDR264/128
GeForce Go 6150 + nForce Go 430February 1, 2006
GeForce Go 6200February 1, 2006NV44MPCIe x163006004:3:4:212006001200225162.4DDR32
GeForce Go 6400February 1, 2006400700160080016002505.664
GeForce Go 6600September 29, 2005NV43M3008:3:8:4300015003000281.2512811.2128
GeForce Go 6800November 8, 2004NV41M130700
1100
12:5:12:1237522.4
35.2
DDR, DDR2
DDR3
256
GeForce Go 6800 UltraFebruary 24, 2005450540036005400562.5256

Discontinued support

Nvidia has ceased driver support for the GeForce 6 series. The GeForce 6 series is the last to support the Windows 9x family of operating systems, as well as Windows NT 4.0. The successor GeForce 7 series supports only Windows 2000 and later (the Windows 8 drivers also work on Windows 10).

Notes and references

  1. "PureVideo: Digital Home Theater Video Quality for Mainstream PCs with GeForce 6 and 7 GPUs" (PDF). NVIDIA. p. 9. Retrieved August 31, 2024.
  2. "GPU timeline". Timetoast timelines. October 11, 1999. Retrieved April 13, 2023.
  3. Nvidia PureVideo – Product Comparison
  4. Smith, Tony (October 6, 2004). "ATI readies 'Rialto' PCIE-to-AGP bridge chip". The Register.com. Retrieved June 20, 2021.
  5. "GeForce 6800 Ultra Extreme Edition". EVGA.com. Archived from the original on March 29, 2008. Retrieved October 7, 2012.
  6. Chiappetta, Marco (August 11, 2004). "eVGA GeForce 6800 Ultra Extreme Edition". HotHardware. Retrieved October 7, 2012. The eVGA contest page still says 1100 MHz, but the cards shipped by eVGA were in fact 1200 MHz.
  7. Adobe Knowledge Base - 2D Graphics Acceleration (GPU) support in Acrobat and Adobe Reader (9.0 on Windows)
  8. Newegg.com - CHAINTECH SA62A-512 GeForce 6200 512 MiB 64-bit DDR2 SDRAM AGP 4X/8X Video Card - Retail
  9. eVGA NVIDIA GeForce 6200 512-P1-N402 PCI.
  10. PureVideo Support Table
  11. "NVIDIA Provides Second Quarter Fiscal 2009 Business Update" (Press release). NVIDIA. July 2, 2008.
  12. Tech Specs
  13. "NVIDIA Business Platform Certified Motherboards". NVIDIA.com.
  14. Zolnierkiewicz, Bartlomiej (October 26, 2007). "Re: [PATCH] ata: sata_nv fix mcp51 timeout with SWNCQ". linux-ide (Mailing list).
  15. "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved September 1, 2024.
  16. "NVIDIA NV44 GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  17. "NVIDIA NV44B GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  18. "GeForce 6 Series-6200 - XFXforce.com". Archived from the original on September 16, 2008.
  19. "GeForce 6200:Model PV-T44A-WANG - XFXforce.com". Archived from the original on April 12, 2008.
  20. "Products GeForce 6200:Features - XFXforce.com". Archived from the original on October 12, 2007.
  21. "NVIDIA NV43 GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  22. "Nvidia GeForce 6600 GT AGP". Techpowerup.com. Archived from the original on April 16, 2015. Retrieved September 1, 2024.
  23. "NVIDIA NV40 GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  24. "NVIDIA NV41 GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  25. "NVIDIA NV42 GPU Specs". TechPowerUp. Retrieved September 1, 2024.
  26. "映泰集团 :: V6802XA16 :: 产品规格" [Biostar Group :: V6802XA16 :: Product Specifications]. www.biostar.com.tw. Archived from the original on October 12, 2022. Retrieved September 1, 2024.
  27. "VGA/GPU Manufacturer - BIOSTAR Group". www.biostar.com.tw. Archived from the original on October 12, 2022. Retrieved September 1, 2024.
  28. "NVIDIA NV45 GPU Specs". TechPowerUp. Retrieved September 1, 2024.

Reviews

Related Research Articles

<span class="mw-page-title-main">Accelerated Graphics Port</span> Expansion bus standard

Accelerated Graphics Port (AGP) is a parallel expansion card standard, designed for attaching a video card to a computer system to assist in the acceleration of 3D computer graphics. It was originally designed as a successor to PCI-type connections for video cards. Since 2004, AGP was progressively phased out in favor of PCI Express (PCIe), which is serial, as opposed to parallel; by mid-2008, PCI Express cards dominated the market and only a few AGP models were available, with GPU manufacturers and add-in board partners eventually dropping support for the interface in favor of PCI Express.

<span class="mw-page-title-main">GeForce FX series</span> Series of GPUs by Nvidia

The GeForce FX or "GeForce 5" series is a line of graphics processing units from the manufacturer Nvidia.

<span class="mw-page-title-main">GeForce 2 series</span> Series of GPUs by Nvidia

The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.

<span class="mw-page-title-main">GeForce 3 series</span> Series of GPUs by Nvidia

The GeForce 3 series (NV20) is the third generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in February 2001, it advanced the GeForce architecture by adding programmable pixel and vertex shaders, multisample anti-aliasing and improved the overall efficiency of the rendering process.

<span class="mw-page-title-main">GeForce 4 series</span> Series of GPUs by Nvidia

The GeForce 4 series refers to the fourth generation of Nvidia's GeForce line of graphics processing units (GPUs). There are two different GeForce4 families, the high-performance Ti family, and the budget MX family. The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002; members within each family were differentiated by core and memory clock speeds. In late 2002, there was an attempt to form a fourth family, also for the laptop market, the only member of it being the GeForce4 4200 Go (NV28M) which was derived from the Ti line.

The R420 GPU, developed by ATI Technologies, was the company's basis for its 3rd-generation DirectX 9.0/OpenGL 2.0-capable graphics cards. Used first on the Radeon X800, the R420 was produced on a 0.13 micrometer low-K photolithography process and used GDDR-3 memory. The chip was designed for AGP graphics cards.

<span class="mw-page-title-main">Voodoo 5</span> Graphics card line

The Voodoo 5 was the last and most powerful graphics card line that was released by 3dfx Interactive. All members of the family were based upon the VSA-100 graphics processor. Only the single-chip Voodoo 4 4500 and dual-chip Voodoo 5 5500 made it to market.

<span class="mw-page-title-main">Radeon R200 series</span> Series of video cards

The R200 is the second generation of GPUs used in Radeon graphics cards and developed by ATI Technologies. This GPU features 3D acceleration based upon Microsoft Direct3D 8.1 and OpenGL 1.3, a major improvement in features and performance compared to the preceding Radeon R100 design. The GPU also includes 2D GUI acceleration, video acceleration, and multiple display outputs. "R200" refers to the development codename of the initially released GPU of the generation. It is the basis for a variety of other succeeding products.

<span class="mw-page-title-main">GeForce 7 series</span> Series of GPUs by Nvidia

The GeForce 7 series is the seventh generation of Nvidia's GeForce line of graphics processing units. This was the last series available on AGP cards.

The R520 is a graphics processing unit (GPU) developed by ATI Technologies and produced by TSMC. It was the first GPU produced using a 90 nm photolithography process.

<span class="mw-page-title-main">Radeon R300 series</span> Series of video cards

The R300 GPU, introduced in August 2002 and developed by ATI Technologies, is its third generation of GPU used in Radeon graphics cards. This GPU features 3D acceleration based upon Direct3D 9.0 and OpenGL 2.0, a major improvement in features and performance compared to the preceding R200 design. R300 was the first fully Direct3D 9-capable consumer graphics chip. The processors also include 2D GUI acceleration, video acceleration, and multiple display outputs.

<span class="mw-page-title-main">Radeon R100 series</span> Series of video cards

The Radeon R100 is the first generation of Radeon graphics chips from ATI Technologies. The line features 3D acceleration based upon Direct3D 7.0 and OpenGL 1.3; all but the entry-level versions offload host geometry calculations to a hardware transform and lighting (T&L) engine, a major improvement in features and performance compared to the preceding Rage design. The processors also include 2D GUI acceleration, video acceleration, and multiple display outputs. "R100" refers to the development codename of the initially released GPU of the generation. It is the basis for a variety of other succeeding products.

<span class="mw-page-title-main">Matrox Parhelia</span> GPU by Matrox

The Matrox Parhelia-512 is a graphics processing unit (GPU) released by Matrox in 2002. It has full support for DirectX 8.1 and incorporates several DirectX 9.0 features. At the time of its release, it was best known for its ability to drive three monitors and its Coral Reef tech demo.

The G400 is a video card made by Matrox, released in September 1999. The graphics processor contains a 2D GUI, video, and Direct3D 6.0 3D accelerator. Codenamed "Toucan", it was a more powerful and refined version of its predecessor, the G200.

<span class="mw-page-title-main">GeForce 8 series</span> Series of GPUs by Nvidia

The GeForce 8 series is the eighth generation of Nvidia's GeForce line of graphics processing units. Based on Tesla, the third major GPU architecture developed by Nvidia, the series represents the company's first unified shader architecture.

<span class="mw-page-title-main">GeForce 9 series</span> Series of GPUs by Nvidia

The GeForce 9 series is the ninth generation of Nvidia's GeForce line of graphics processing units, the first of which was released on February 21, 2008. The products are based on an updated Tesla microarchitecture, adding PCI Express 2.0 support and improved color and z-compression, and were built on a 65 nm process, later moving to a 55 nm process to reduce power consumption and die size.

Radeon X800 is a series of graphics cards designed by ATI Technologies Inc. introduced in May 2004.

The Radeon X700 (RV410) series replaced the X600 in September 2004. The X700 Pro is clocked at a 425 MHz core and produced on a 0.11 micrometre process. RV410 used a layout consisting of 8 pixel pipelines connected to 4 ROPs while maintaining the 6 vertex shaders of the X800. The 110 nm process was a cost-cutting measure, designed not for high clock speeds but for reducing die size while maintaining high yields. An X700 XT was planned for production and reviewed by various hardware web sites, but was never released; it was believed that the X700 XT's clock speeds were set too high for ATI to produce it profitably, and it was not adequately competitive with Nvidia's GeForce 6600 GT. ATI went on to produce a card in the X800 series to compete instead.