GeForce 4 series

Nvidia GeForce 4 series
GeForce 4 series logo
Release date: February 6, 2002
Codenames: NV17, NV18, NV19, NV25, NV28
Architecture: Kelvin
Cards
  Entry-level: MX
  Mid-range: Ti 4200, Ti 4400, Ti 4800 SE
  High-end: Ti 4600, Ti 4800
API support
  Direct3D: 7.0 (NV1x), 8.0a (NV2x); Vertex Shader 1.1, Pixel Shader 1.3
  OpenGL: 1.3
History
  Predecessor: GeForce 3 series
  Successor: GeForce FX series
Support status: Unsupported

The GeForce 4 series (codenames below) refers to the fourth generation of Nvidia's GeForce line of graphics processing units (GPUs). There are two different GeForce4 families, the high-performance Ti family (NV25), and the budget MX family (NV17). The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002; members within each family were differentiated by core and memory clock speeds. In late 2002, there was an attempt to form a fourth family, also for the laptop market, the only member of it being the GeForce4 4200 Go (NV28M) which was derived from the Ti line.


GeForce4 Ti

GeForce4 Ti 4800 (NV28 Ultra) GPU
Albatron GeForce4 Ti 4800SE
GeForce4 Ti 4600 Mac

Architecture

The GeForce4 Ti (NV25) was launched in February 2002 [1] and was a revision of the GeForce 3 (NV20). It was very similar to its predecessor; the main differences were higher core and memory clock rates, a revised memory controller (known as Lightspeed Memory Architecture II/LMA II), updated pixel shaders with new instructions for Direct3D 8.0a support, [2] [3] an additional vertex shader (the vertex and pixel shaders were now known as nFinite FX Engine II), hardware anti-aliasing (Accuview AA), and DVD playback. [1] Legacy Direct3D 7-class fixed-function T&L was now implemented as vertex shaders. [2] Proper dual-monitor support (TwinView) was also brought over from the GeForce 2 MX. [4] The GeForce 4 Ti was superior to the GeForce 4 MX in virtually every aspect save for higher production cost, although the MX had the Nvidia VPE (video processing engine) which the Ti lacked.

Lineup

The initial two models were the Ti4400 (US$299) and the top-of-the-range Ti4600 (US$399). At the time of their introduction, Nvidia's main products were the entry-level GeForce 2 MX, the midrange GeForce4 MX models (released at the same time as the Ti4400 and Ti4600), and the older but still high-performance GeForce 3 (demoted to the upper mid-range or performance niche). [1] However, ATI's Radeon 8500LE (9100) (a slower-clocked version of the flagship Radeon 8500) was somewhat cheaper than the Ti4400 and outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. The GeForce 3 Ti500 (US$299 prior to the GeForce4 release) was rendered obsolete quickly, as it could not be produced cheaply enough to justify a further price drop, though it filled the performance gap between the GeForce 3 Ti200 and the GeForce4 Ti4400. [5]

In consequence, Nvidia rolled out a slightly cheaper model: the Ti4200. Although the 4200 was initially supposed to be part of the launch of the GeForce4 line, Nvidia had delayed its release to sell off the soon-to-be discontinued GeForce 3 Ti500 chips. [6] In an attempt to prevent the Ti4200 from damaging the Ti4400's sales, Nvidia set the Ti4200's memory speed at 222 MHz on the models with a 128 MiB frame buffer, a full 53 MHz slower than the Ti4400 (all of which had 128 MiB frame buffers). Models with a 64 MiB frame buffer were set to 250 MHz memory speed. This tactic did not work, however, for two reasons. First, the Ti4400 was perceived as good enough neither for those who wanted top performance (who preferred the Ti4600) nor for those who wanted good value for money (who typically chose the Ti4200), causing the Ti4400 to fade into obscurity. [7] Second, some graphics card makers simply ignored Nvidia's guidelines for the Ti4200 and set the memory speed at 250 MHz on the 128 MiB models anyway. [8]
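As a back-of-the-envelope check (not part of the original article), the memory speeds quoted above translate directly into peak bandwidth: a DDR interface transfers twice per clock, and the Ti4200's bus is 128 bits (16 bytes) wide. A minimal sketch:

```python
def ddr_bandwidth_gbs(clock_mhz: float, bus_width_bits: int) -> float:
    """Peak DDR bandwidth in GB/s: clock (MHz) x 2 transfers/clock x bytes per transfer / 1000."""
    return clock_mhz * 2 * (bus_width_bits // 8) / 1000

# Ti4200 with a 128 MiB frame buffer: 222 MHz memory on a 128-bit bus
print(ddr_bandwidth_gbs(222, 128))  # → 7.104 (GB/s)
# Ti4200 with a 64 MiB frame buffer: 250 MHz memory
print(ddr_bandwidth_gbs(250, 128))  # → 8.0 (GB/s)
```

These results match the 7.104 GB/s and 8.0 GB/s figures in the specification table below.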

Then in late 2002, the NV25 core was replaced by the NV28 core, which differed only by addition of AGP-8X support. The Ti4200 with AGP-8X support was based on this chip, and sold as the Ti4200-8X. A Ti4800SE replaced the Ti4400 and a Ti4800 replaced the Ti4600 respectively when the 8X AGP NV28 core was introduced on these. [9] [10]

The only mobile derivative of the Ti series was the GeForce4 4200 Go (NV28M), launched in late 2002. [11] It offered the same feature set as, and similar performance to, the NV28-based Ti4200, although the mobile variant was clocked lower. It outperformed the Mobility Radeon 9000 by a large margin, as well as being Nvidia's first DirectX 8 laptop graphics solution. However, because the GPU was not designed for the mobile space, it had thermal output similar to the desktop part. The 4200 Go also lacked the power-saving circuitry of the MX-based GeForce4 4x0 Go series or the RV250-based Mobility Radeon 9000. This caused problems for notebook manufacturers, especially with regards to battery life. [12]

Performance

The GeForce4 Ti outperformed the older GeForce 3 by a significant margin. [1] The competing ATI Radeon 8500 was generally faster than the GeForce 3 line, but was overshadowed by the GeForce4 Ti in every area other than price and its more advanced pixel shader (1.4) support. [1] Nvidia, however, missed a chance to dominate the upper-range/performance segment by delaying the release of the Ti4200 and by not rolling out 128 MiB models quickly enough; otherwise the Ti4200 was cheaper and faster than the previous top-line GeForce 3 Ti500 and Radeon 8500. [13] Besides the late introduction of the Ti4200, the limited-release 128 MiB models of the GeForce 3 Ti200 proved unimpressive, letting the Radeon 8500LE and even the full 8500 dominate the upper-range performance segment for a while. [14] The Matrox Parhelia, despite having several DirectX 9.0 capabilities and other innovative features, was at most competitive with the GeForce 3 and GeForce4 Ti 4200, yet it was priced the same as the Ti 4600 at US$399.

The GeForce 4 Ti4200 enjoyed considerable longevity compared to its higher-clocked peers. Debuting at half the cost of the 4600 (US$199 versus US$399), the 4200 remained the best balance between price and performance until the launch of the short-lived DirectX 9 ATI Radeon 9500 Pro at the end of 2002. [15] The Ti4200 still managed to hold its own against several next generation DirectX 9 compliant GPUs released in late 2003, outperforming the GeForce FX 5200 and the midrange FX 5600, and performing similarly to the mid-range Radeon 9600 Pro (ATI's permanent successor to the Radeon 9500 Pro) in some situations. [16] [17] [18]

GeForce4 MX

Architecture

GeForce4 MX420

Though its lineage was of the past-generation GeForce2 (NV11 and NV15), the GeForce4 MX (NV17) did incorporate bandwidth and fill rate-saving techniques, dual-monitor support, and a multi-sampling anti-aliasing unit from the Ti series (NV25); the improved 128-bit DDR memory controller was crucial to solving the bandwidth limitations that plagued the GeForce 256 (NV10) and GeForce2 lines. It also owed some of its design heritage to Nvidia's high-end CAD products, and in performance-critical non-game applications it was remarkably effective. The most notable example is AutoCAD, in which the GeForce4 MX returned results within a single-digit percentage of GeForce4 Ti cards several times the price.

Many criticized the GeForce4 MX name as a misleading marketing ploy, since it was less advanced than the preceding GeForce3 (NV20). [19] Nvidia's features comparison chart between the Ti and MX lines showed that the only "feature" missing on the MX was the nfiniteFX II engine, i.e., the DirectX 8 programmable vertex and pixel shaders. [20] However, the GeForce4 MX was not a GeForce4 Ti with the shader hardware removed, as the MX's performance in games that did not use shaders was considerably behind the Ti. [21]

In motion-video applications, the GeForce4 MX offered new functionality. It (and not the GeForce4 Ti) was the first GeForce member to feature the Nvidia VPE (video processing engine). It was also the first GeForce to offer hardware-iDCT and VLC (variable length code) decoding, making VPE a major upgrade from Nvidia's previous HDVP. In the application of MPEG-2 playback, VPE could finally compete head-to-head with ATI's video engine.

Lineup

GeForce4 MX440

There were 3 initial models: the MX420, the MX440 and the MX460. The MX420 had only Single Data Rate (SDR) memory and was designed for very low end PCs, replacing the GeForce2 MX100 and MX200. The GeForce4 MX440 was a mass-market OEM solution, replacing the GeForce2 MX/MX400 and GeForce2 Ti. The GeForce4 MX460 was initially meant to slot in between the MX440 and the Ti4400, while the release of the Ti4200 was held back.

In terms of 3D performance, the MX420 performed only slightly better than the GeForce2 MX400 and below the GeForce2 GTS. However, this was never really much of a problem, considering its target audience. The nearest thing to a direct competitor the MX420 had was ATI's Radeon 7000. In practice its main competitors were chipset-integrated graphics solutions, such as Intel's 845G and Nvidia's own nForce 2, but its main advantage over those was multiple-monitor support; Intel's solutions did not have this at all, and the nForce 2's multi-monitor support was much inferior to what the MX series offered.

NVIDIA GeForce PCX 4300

The MX440 performed reasonably well for its intended audience, outperforming its closest competitor, the ATI Radeon 7500, as well as the discontinued GeForce2 Ti. Despite harsh criticism by gaming enthusiasts, the MX440 was successful in the PC OEM market as a replacement for the GeForce2 MX. Priced about 30% above the GeForce2 MX, it provided better performance and the ability to play a number of popular games that the GeForce2 could not run well; above all else, to the average non-specialist it sounded as if it were a "real" GeForce4, i.e., a GeForce4 Ti. While John Carmack initially warned gamers not to purchase the GeForce4 MX440, its somewhat widespread adoption compelled id Software to make it the only DirectX 7.0 GPU supported by Doom 3. [22] [23] [24] When ATI launched its Radeon 9000 Pro in September 2002, it performed about the same as the MX440, but had crucial advantages in better single-texturing performance and proper support of DirectX 8 shaders. However, the 9000 was unable to break the MX440's entrenched hold on the OEM market. Nvidia's eventual answer to the Radeon 9000 was the GeForce FX 5200, but despite the 5200's DirectX 9 features, it did not offer a significant performance increase over the MX440 even in DirectX 7.0 games. This kept the MX440 in production while the 5200 was discontinued.

One of the fastest DirectX 7.0 compliant GPUs, the MX460 performed similarly to the discontinued GeForce 2 Ultra and the existing GeForce3 Ti200 (the remaining available member of the GeForce 3 family). However, ATI released the Radeon 8500LE, which outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. ATI's move in turn compelled Nvidia to roll out the Ti4200 earlier than planned, also at a similar price to the MX 460, and soon afterwards to discontinue the Ti200. The Ti200, 8500LE, and Ti4200 were all DirectX 8.0 compliant and similarly priced to the MX460, and the 8500LE and Ti4200 also provided significantly better performance; this prevented the MX460 from ever becoming popular compared to the other GeForce4 MX releases. [25]

The GeForce4 Go was derived from the MX line and was announced along with the rest of the GeForce4 Ti and MX lineup in early 2002. There were the 420 Go, 440 Go, and 460 Go. However, ATI had beaten them to the market with the similarly performing Mobility Radeon 7500, and later the DirectX 8.0 compliant Mobility Radeon 9000. (Despite its name, the short-lived 4200 Go (NV28M) is not part of this lineup; it was instead derived from the Ti line.)

Like the Ti series, the MX was also updated in late 2002 to support AGP-8X with the NV18 core. The two new models were the MX440-8X, which was clocked slightly faster than the original MX440, and the MX440SE, which had a narrower memory bus and was intended as a replacement of sorts for the MX420. The MX460, which had been discontinued by this point, was never replaced. Another variant followed in late 2003: the MX 4000, which was a GeForce4 MX440SE with a slightly higher memory clock.

The GeForce4 MX line received a third and final update in 2004, with the PCX 4300, which was functionally equivalent to the MX 4000 but with support for PCI Express. In spite of its new codename (NV19), the PCX 4300 was in fact simply an NV18 core with a BR02 chip bridging the NV18's native AGP interface with the PCI-Express bus.

GeForce4 model information

| Model | Launch | Code name | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config [a] | Pixel fillrate (MPixels/s) | Texel fillrate (MTexels/s) | Vertex rate (MVertices/s) | Memory size (MB) | Bandwidth (GB/s) | Bus type | Bus width (bit) | API (Direct3D / OpenGL) | Performance (GFLOPS FP32) | TDP (W) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GeForce4 MX IGP + nForce2 | October 1, 2002 | NV1F | ? | ? | FSB | 250 | 133 / 200 | 2:0:4:2 | 500 | 1,000 | 125 | Up to 128 (system RAM) | 2.128 / 6.4 | DDR | 64 / 128 | 7.0 / 1.2 | 1.000 | ? |
| GeForce4 MX420 | February 6, 2002 | NV17 | 29 [26] | 65 | AGP 4x, PCI | 250 | 166 | 2:0:4:2 | 500 | 1,000 | 125 | 64 | 2.656 | SDR / DDR | 128 (SDR), 64 (DDR) | 7.0 / 1.2 | 1.000 | 14 |
| GeForce4 MX440 SE | 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 250 | 133 / 166 [27] | 2:0:4:2 | 500 [27] | 1,000 | 125 | 64 / 128 | 2.128 / 5.312 [27] | DDR | 64 / 128 [27] | 7.0 / 1.2 | 1.000 | 13 |
| GeForce MX4000 | December 14, 2003 | NV18B | 29 | 65 | AGP 8x, PCI | 250 | 166 | 2:0:4:2 | 500 | 1,000 | 125 | 64 | 2.656 | DDR | 64 | 7.0 / 1.2 | 1.000 | 14 |
| GeForce PCX4300 | February 19, 2004 | NV19 | 29 | 65 | PCIe x16 | 250 | 166 | 2:0:4:2 | 500 | 1,000 | 125 | 128 | 2.656 | DDR | 64 | 7.0 / 1.2 | 1.000 | 16 |
| GeForce4 MX440 | February 6, 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 275 | 200 | 2:0:4:2 | 550 | 1,100 | 137.5 | 64 / 128 | 6.4 | DDR | 128 | 7.0 / 1.2 | 1.100 | 18 |
| GeForce4 MX440 8x | September 25, 2002 | NV18 | 29 [28] | 65 | AGP 8x, PCI | 275 | 166 / 250 | 2:0:4:2 | 550 | 1,100 | 137.5 | 64 / 128 | 2.656 [29] / 8.0 | DDR | 64 / 128 | 7.0 / 1.2 | 1.100 | 19 |
| GeForce4 MX460 | February 6, 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 300 | 275 | 2:0:4:2 | 600 | 1,200 | 150 | 128 | 8.8 | DDR | 128 | 7.0 / 1.2 | 1.200 | 22 |
| GeForce4 Ti4200 | April 16, 2002 | NV25 | 63 [30] | 142 | AGP 4x | 250 | 222 (128 MB), 250 (64 MB) | 4:2:8:4 | 1,000 | 2,000 | 125 | 64 / 128 | 7.104 (128 MB), 8.0 (64 MB) | DDR | 128 | 8.0a / 1.3 | 15.00 | 33 |
| GeForce4 Ti4200 8x | September 25, 2002 | NV28 | 63 [31] | 142 | AGP 8x | 250 | 250 | 4:2:8:4 | 1,000 | 2,000 | 125 | 64 / 128 | 8.0 | DDR | 128 | 8.0a / 1.3 | 15.00 | 34 |
| GeForce4 Ti4400 | February 6, 2002 | NV25 | 63 | 142 | AGP 4x | 275 | 275 | 4:2:8:4 | 1,100 | 2,200 | 137.5 | 128 | 8.8 | DDR | 128 | 8.0a / 1.3 | 16.50 | 37 |
| GeForce4 Ti4400 8x (Ti4800SE [b]) | January 20, 2003 | NV28 | 63 | 101 | AGP 8x | 275 | 275 | 4:2:8:4 | 1,100 | 2,200 | 137.5 | 128 | 8.8 | DDR | 128 | 8.0a / 1.3 | 16.50 | 38 |
| GeForce4 Ti4600 | February 6, 2002 | NV25 | 63 | 142 | AGP 4x | 300 | 325 | 4:2:8:4 | 1,200 | 2,400 | 150 | 128 | 10.4 | DDR | 128 | 8.0a / 1.3 | 18.00 | ? |
| GeForce4 Ti4600 8x (Ti4800 [c]) | January 20, 2003 | NV28 | 63 | 101 | AGP 8x | 300 | 325 | 4:2:8:4 | 1,200 | 2,400 | 150 | 128 | 10.4 | DDR | 128 | 8.0a / 1.3 | 18.00 | 43 |

  a. Core config: pixel shaders : vertex shaders : texture mapping units : render output units.
  b. GeForce4 Ti4400 8x: Card manufacturers utilizing this chip labeled the card as a Ti4800SE. The surface of the chip has "Ti-8x" printed on it.
  c. GeForce4 Ti4600 8x: Card manufacturers utilizing this chip labeled the card as a Ti4600, and in some cases as a Ti4800. The surface of the chip has "Ti-8x" printed on it, as well as "4800" printed at the bottom.
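The fillrate figures above follow directly from the core clock and the core config (this arithmetic check is an illustration, not from the original article): pixel fillrate is clock times pixel pipelines, and texel fillrate is clock times texture mapping units.

```python
def fillrates(core_mhz: int, pixel_pipes: int, tmus: int) -> tuple[int, int]:
    """Theoretical fillrates: MPixels/s = clock x pixel pipelines; MTexels/s = clock x TMUs."""
    return core_mhz * pixel_pipes, core_mhz * tmus

# GeForce4 Ti4200: 250 MHz core, core config 4:2:8:4 (4 pixel pipelines, 8 TMUs)
print(fillrates(250, 4, 8))  # → (1000, 2000)
# GeForce4 MX440: 275 MHz core, core config 2:0:4:2 (2 pixel pipelines, 4 TMUs)
print(fillrates(275, 2, 4))  # → (550, 1100)
```

Both results agree with the MPixels/s and MTexels/s columns in the table.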
| Model | nFiniteFX II Engine | Video Processing Engine (VPE) |
| --- | --- | --- |
| GeForce4 MX420 | No | Yes |
| GeForce4 MX440 SE | No | Yes |
| GeForce4 MX4000 | No | Yes |
| GeForce4 PCX4300 | No | Yes |
| GeForce4 MX440 | No | Yes |
| GeForce4 MX440 8X | No | Yes |
| GeForce4 MX460 | No | Yes |
| GeForce4 Ti4200 | Yes | No |
| GeForce4 Ti4200 8x | Yes | No |
| GeForce4 Ti4400 | Yes | No |
| GeForce4 Ti4400 8x | Yes | No |
| GeForce4 Ti4600 | Yes | No |
| GeForce4 Ti4600 8x | Yes | No |

GeForce4 Go mobile GPU series

| Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config [a] | Pixel fillrate (MPixels/s) | Texel fillrate (MTexels/s) | Vertex rate (MVertices/s) | Memory size (MB) | Bandwidth (GB/s) | Bus type | Bus width (bit) | API (Direct3D / OpenGL) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GeForce4 Go 410 | February 6, 2002 | NV17M | AGP 8x | 200 | 200 | 2:0:4:2 | 400 | 800 | 0 | 16 | 1.6 | SDR | 64 | 8.0a / 1.3 |
| GeForce4 Go 420 | February 6, 2002 | NV17M | AGP 8x | 200 | 400 | 2:0:4:2 | 400 | 800 | 0 | 32 | 3.2 | DDR | 64 | 8.0a / 1.3 |
| GeForce4 Go 440 | February 6, 2002 | NV17M | AGP 8x | 220 | 440 | 2:0:4:2 | 440 | 880 | 0 | 64 | 7.04 | DDR | 128 | 8.0a / 1.3 |
| GeForce4 Go 460 | October 14, 2002 | NV17M | AGP 8x | 250 | 500 | 2:0:4:2 | 500 | 1,000 | 0 | 64 | 8.0 | DDR | 128 | 8.0a / 1.3 |
| GeForce4 Go 488 | ? | NV18M | AGP 8x | 300 | 550 | 2:0:4:2 | 600 | 1,200 | 0 | 64 | 8.8 | DDR | 128 | 8.0a / 1.3 |
| GeForce4 Go 4200 | November 14, 2002 | NV28M | AGP 8x | 200 | 400 | 4:2:8:4 | 800 | 1,600 | 100 | 64 | 6.4 | DDR | 128 | 8.0a / 1.3 |

  a. Core config: pixel shaders : vertex shaders : texture mapping units : render output units.

GeForce4 Go driver support

This family is a derivative of the GeForce4 MX family, produced for the laptop market. Performance-wise, the GeForce4 Go family can be considered comparable to the MX line.

One possible solution to the lack of driver support for the Go family is the third party Omega Drivers. Using third party drivers can, among other things, invalidate warranties. The Omega Drivers are supported by neither laptop manufacturers, laptop ODMs, nor by Nvidia. Nvidia attempted legal action against a version of Omega Drivers that included the Nvidia logo. [32]

Discontinued support

Nvidia has ceased driver support for the GeForce 4 series.

Final drivers

Product support list for Windows 95/98/Me – 81.98.
  • Driver version 81.98 was the last driver release from Nvidia for Windows 9x/Me.

The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.

Note: Despite claims in the documentation that version 94.24 (released on May 17, 2006) supports the GeForce 4 series, it does not; 94.24 actually supports only the GeForce 6 and GeForce 7 series. [33]


Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive

Unix Driver Archive

Notes and references

  1. 1 2 3 4 5 Lal Shimpi, Anand (February 6, 2002). "Nvidia GeForce4 - NV17 and NV25 Come to Life". AnandTech. Retrieved June 14, 2008.
  2. 1 2 "A look at NVIDIA's GeForce4 chips". February 6, 2002.
  3. "ActiveWin.Com: NVIDIA GeForce 4 Ti 4600 - Review". www.activewin.com.
  4. Worobyew, Andrew.; Medvedev, Alexander. "Nvidia GeForce4 Ti 4400 and GeForce4 Ti 4600 (NV25) Review". Pricenfees. Archived from the original on October 12, 2018. Retrieved May 15, 2007.
  5. https://www.tomshardware.com/reviews/pc-graphics-xbox,423-19.html
  6. "Gaming Laptops 2017 in India". www.lastlaptop.com. Archived from the original on January 3, 2017. Retrieved January 2, 2017.
  7. https://www.anandtech.com/show/904
  8. https://www.anandtech.com/show/934/2
  9. Connolly, Chris. The GeForce4’s Last Gasp : MSI’s GeForce4 Ti4800 / Ti4600-8X, GamePC, January 20, 2003.
  10. R., Jason. MSI GeForce4 Ti4800SE 8X VIVO Video Card, Extreme Overclocking, March 30, 2003.
  11. "GeForce4 Go". Nvidia.com. Retrieved May 15, 2007.
  12. Witheiler, Matthew (November 14, 2002). "Nvidia GeForce4 4200 Go: Bringing mobile gaming to new heights". AnandTech. Retrieved June 14, 2008.
  13. https://www.anandtech.com/show/899/3
  14. Freeman, Vince. Nvidia GeForce4 Ti 4200 Review Archived June 24, 2003, at the Wayback Machine , Sharky Extreme, April 26, 2002.
  15. Wasson, Scott. ATI's Radeon 9500 Pro graphics card: DirectX 9 goes mainstream Archived July 6, 2007, at the Wayback Machine , Tech Report, November 27, 2002.
  16. Gasior, Geoff. ATI's Radeon 9600 Pro GPU: One step forward, two steps back? Archived November 27, 2006, at the Wayback Machine , Tech Report, April 16, 2003.
  17. Gasior, Geoff. Nvidia's GeForce FX 5200 GPU: Between capability and competence Archived August 9, 2007, at the Wayback Machine , Tech Report, April 29, 2003.
  18. https://hardware.slashdot.org/story/03/05/01/004236/geforce-fx-5200-reviewed
  19. https://www.toolify.ai/hardware/unveiling-the-mystery-of-nvidias-forgotten-geforce4-mx-460-2906628
  20. "GeForce4 MX". Nvidia.com. Retrieved June 14, 2008.
  21. https://www.toolify.ai/hardware/unveiling-the-mystery-of-nvidias-forgotten-geforce4-mx-460-2906628
  22. https://www.toolify.ai/hardware/unveiling-the-mystery-of-nvidias-forgotten-geforce4-mx-460-2906628
  23. https://www.osnews.com/story/649/john-carmack-dont-buy-a-geforce4-mx-for-doom-3/
  24. https://gamespot.com/gamespot/stories/news/0,10870,2847063,00.html
  25. https://www.toolify.ai/hardware/unveiling-the-mystery-of-nvidias-forgotten-geforce4-mx-460-2906628
  26. "NVIDIA GeForce4 MX 420 PCI Specs". TechPowerUp. Retrieved August 30, 2024.
  27. 1 2 3 4 Archived copy Archived 2023-07-11 at the Wayback Machine
  28. "NVIDIA GeForce4 MX 440-8x Specs". TechPowerUp. Retrieved August 30, 2024.
  29. "mx4408x-64bit166-Mhz". ImgBB. Archived from the original on December 25, 2022. Retrieved August 30, 2024.
  30. "NVIDIA GeForce4 Ti 4200 Specs". TechPowerUp. Retrieved August 30, 2024.
  31. "NVIDIA GeForce4 Ti 4200-8X Specs". TechPowerUp. Retrieved August 30, 2024.
  32. "Der Fall Omega vs. Nvidia". WCM - Das österreichische Computer Magazin (in German). WCM. July 24, 2003. Retrieved April 12, 2007.
    "The case Omega vs. Nvidia (English translation)". WCM - Austrian Computers the Magazine. WCM. July 24, 2003. Retrieved April 12, 2007.
  33. "Driver Details". NVIDIA.
