GeForce 2 series

GeForce 2 series
Top: Logo of the GeForce 2 series
Bottom: Nvidia GeForce 2 GTS (Asus-branded) with its cooler removed, showing the NV15 die
Release date: mid-May 2000 [1]
Codenames: NV11, NV15, NV16
Architecture: Celsius
Models
  • GeForce MX series
  • GeForce GTS series
  • GeForce Pro series
  • GeForce Ti series
  • GeForce Ultra series
Cards
  • Entry-level: MX
  • Mid-range: GTS, Pro
  • High-end: Ti, Ultra
API support
  • DirectX: Direct3D 7.0
  • OpenGL: OpenGL 1.2 (T&L)
History
  • Predecessor: GeForce 256
  • Successor: GeForce 3 series
Support status: Unsupported

The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.

The GeForce 2 family comprises a number of models. The GeForce 2 GTS, GeForce 2 Ultra, GeForce 2 Pro, and GeForce 2 Ti are based on the original architecture (NV15) and vary only in chip and memory clock speeds. For the low-end and OEM segments, Nvidia created the GeForce 2 MX series (NV11), from which the GeForce 2 Go was derived for laptops. The GeForce 2 architecture was also used in the Quadro 2 Pro, Quadro 2 MXR, and Quadro 2 EX workstation cards, which shipped with special drivers meant to accelerate computer-aided design applications.

Architecture

GeForce2 Ultra GPU
Die shot of a GeForce 2 GPU

The GeForce 2 architecture (NV15) is similar to that of the previous GeForce 256 but with various improvements. Compared to the 220 nm GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon denser and allowing for more transistors and higher clock speeds. The most significant change for 3D acceleration is the addition of a second texture mapping unit (TMU) to each of the four pixel pipelines. Some say[who?] the second TMU was already present in the original GeForce's NSR (Nvidia Shading Rasterizer) but that dual-texturing was disabled due to a hardware bug; the NSR's unique ability to perform single-cycle trilinear texture filtering supports this suggestion. The second TMU doubles the texture fillrate per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR, a primitive type of programmable pixel pipeline somewhat similar to later pixel shaders; this functionality is also present in the GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called HDVP (high definition video processor). HDVP supports motion video playback at HDTV resolutions (MP@HL). [2]
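For illustration, the peak fillrate figures follow directly from the clock speed and the pipeline/TMU layout described above. The sketch below is plain arithmetic using only the GTS figures cited in this article (200 MHz core, four pipelines, two TMUs each); it is not vendor tooling.

```python
# Back-of-the-envelope fillrate arithmetic for the GeForce 2 GTS.
# Inputs (200 MHz core, 4 pixel pipelines, 2 TMUs per pipeline) are the
# figures cited in this article; everything else is simple multiplication.

def peak_fillrates(core_mhz, pipelines, tmus_per_pipeline):
    """Return (pixel fillrate, texel fillrate) in MPixels/s and MTexels/s."""
    pixel_rate = core_mhz * pipelines            # one pixel per pipeline per clock
    texel_rate = pixel_rate * tmus_per_pipeline  # each pipeline samples two textures per clock
    return pixel_rate, texel_rate

pixels, texels = peak_fillrates(core_mhz=200, pipelines=4, tmus_per_pipeline=2)
print(pixels, texels)  # 800 MPixels/s, 1600 MTexels/s -- the "GigaTexel" behind the GTS name
```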

In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%. [3] In OpenGL games (such as Quake III), the card outperforms the ATI Radeon DDR and 3dfx Voodoo 5 5500 cards in both 16 bpp and 32 bpp display modes. However, in Direct3D games running 32 bpp, the Radeon DDR is sometimes able to take the lead. [4]

The GeForce 2 (NV15) architecture is quite memory-bandwidth constrained. [5] The GPU wastes memory bandwidth and pixel fillrate due to unoptimized z-buffer usage, drawing of hidden surfaces, and a relatively inefficient RAM controller. The main competitor of the GeForce 2 GTS, the Radeon DDR (R100), has hardware functions (called HyperZ) that address these issues. [6] Because of their inefficiency, the GeForce 2 GPUs could not approach their theoretical performance potential, and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design, used in the GeForce4 MX, was more efficient.
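As a rough back-of-the-envelope illustration of this bandwidth limit, the sketch below compares the GTS's peak memory bandwidth with the bandwidth that its full 800 MPixels/s of colour and z traffic would demand. The 12 bytes-per-pixel cost is an assumption (a 32-bit colour write plus a 32-bit z read and write), not an Nvidia figure.

```python
# Supply-versus-demand estimate for the GeForce 2 GTS memory bus.
# Bandwidth inputs (166 MHz DDR on a 128-bit bus) come from the specifications
# table below; the 12 bytes of colour + z traffic per pixel is an assumed,
# simplified cost model.

def peak_bandwidth_gbs(mem_clock_mhz, bus_width_bits, ddr=True):
    """Peak theoretical memory bandwidth in GB/s (decimal)."""
    transfers_per_s = mem_clock_mhz * 1e6 * (2 if ddr else 1)
    return transfers_per_s * bus_width_bits / 8 / 1e9

supply = peak_bandwidth_gbs(166, 128, ddr=True)  # ~5.3 GB/s, matching the table below
demand = 800e6 * 12 / 1e9                        # 800 MPixels/s * 12 bytes/pixel ~= 9.6 GB/s
print(f"supply {supply:.1f} GB/s vs. naive demand {demand:.1f} GB/s")
```

Under this simplified model the memory bus saturates at a little over half of the nominal pixel fillrate, which is consistent with the efficiency criticisms above.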

Releases

The first models to arrive after the original GeForce 2 GTS were the GeForce 2 Ultra and GeForce 2 MX, launched on September 7, 2000. [7] On September 29, 2000, Nvidia started shipping graphics cards with 16 MB and 32 MB of video memory.

Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. Meant to be a niche product, the GeForce 2 Ultra was rumored to have been intended to prevent 3dfx from taking the performance lead with its Voodoo 5 6000, a card that ultimately was never released as 3dfx went bankrupt. The Ultra actually outperforms the first GeForce 3 products in some cases, because early GeForce 3 cards have significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth/fillrate efficiency mechanisms; in addition, the GeForce 3 has a superior next-generation feature set, with programmable vertex and pixel shaders for DirectX 8.0 games.

The GeForce 2 Pro, introduced shortly after the Ultra, was an alternative to the expensive top-of-the-line Ultra and was faster than the GTS.

In October 2001, the GeForce 2 Ti was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the Radeon 7500 (RV200), although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the GeForce4 MX series as the budget/performance choice in January 2002.

On its 2001 product web page, Nvidia initially placed the Ultra as a separate offering from the rest of the GeForce 2 lineup (GTS, Pro, Ti). By late 2002, with the GeForce 2 considered a discontinued product line (having been succeeded by the GeForce4 MX), the Ultra was listed alongside the GTS, Pro, and Ti on the GeForce 2 information page.

GeForce 2 MX

GeForce 2 MX200 AGP
Die shot of the MX400 GPU

Since the previous GeForce 256 line shipped without a budget variant, the RIVA TNT2 series was left to fill the "low-end" role, albeit with a comparatively obsolete feature set. In order to create a better low-end option, Nvidia developed the GeForce 2 MX series (NV11), which offered a feature set similar to that of the regular GeForce 2 (NV15) but at a lower performance tier. To reduce production costs, the GeForce 2 MX had two of the four pixel pipelines removed and less available memory bandwidth. The cards used either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32 to 128 bits, allowing circuit board cost to be varied, as the sketch below illustrates. The MX series also provided dual-display support, a feature not found on the GeForce 256 or the regular GeForce 2. With performance approaching that of the GeForce 256 while being much more economical to produce, the GeForce 2 MX was successful in the OEM and budget markets.
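The sketch below gives a rough sense of the bandwidth trade-off between these board configurations. It uses only the clock and bus-width figures from the specifications table further down, together with the usual rule that SDR memory moves one transfer per clock and DDR two; the results are peak theoretical numbers, not measured performance.

```python
# Peak memory bandwidth of common GeForce 2 MX board configurations, compared
# with the GeForce 2 GTS. Clock and bus-width figures are taken from the
# specifications table in this article; results are theoretical maxima.

def bandwidth_gbs(mem_clock_mhz, bus_width_bits, ddr):
    transfers = 2 if ddr else 1                  # DDR moves two transfers per clock
    return mem_clock_mhz * transfers * bus_width_bits / 8 / 1e3

configs = {
    "GeForce2 MX (128-bit SDR, 166 MHz)":   bandwidth_gbs(166, 128, ddr=False),
    "GeForce2 MX200 (64-bit SDR, 166 MHz)": bandwidth_gbs(166, 64, ddr=False),
    "GeForce2 MX400 (64-bit DDR, 166 MHz)": bandwidth_gbs(166, 64, ddr=True),
    "GeForce2 GTS (128-bit DDR, 166 MHz)":  bandwidth_gbs(166, 128, ddr=True),
}
for name, gbs in configs.items():
    print(f"{name}: {gbs:.2f} GB/s")  # 2.66, 1.33, 2.66 and 5.31 GB/s respectively
```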

The prime competitors in the OEM and budget segment were ATI's Radeon SDR (which, along with all other R100-based cards regardless of clock speed and memory configuration, was later collectively renamed Radeon 7200), the Radeon VE (RV100, later renamed Radeon 7000), and the 3dfx Voodoo4 4500. [8] Sharing the same R100 GPU as the higher-end Radeon 32MB DDR (US$230), the Radeon SDR (US$150) was equipped with SDR SDRAM instead of the DDR SDRAM found in its more expensive sibling, although this did not bring down costs sufficiently to match the GeForce 2 MX. [9] Released three months after the GeForce 2 MX, the Radeon SDR lacked multi-monitor support but offered faster 32-bit 3D rendering than the GeForce 2 MX. [10] 3dfx's Voodoo4 4500 arrived too late, was too expensive at US$150, and was too slow to compete with Nvidia's and ATI's offerings; it also lacked multi-monitor support. Finally, the Radeon VE's RV100 GPU was cut down considerably from the R100 to reduce production costs, so it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Furthermore, the Radeon VE featured only a single rendering pipeline, giving it a substantially lower fillrate than the GeForce 2 MX. However, the Radeon VE (US$100) had the advantage of somewhat better dual-monitor display software while matching the GeForce 2 MX on price. [11]

Members of the series include GeForce 2 MX, MX400, MX200, and MX100. The GPU was also used as an integrated graphics processor in the nForce chipset line and as a mobile graphics chip for notebooks called GeForce 2 Go.

Successor

The successor to the GeForce 2 (non-MX) line is the GeForce 3. The non-MX GeForce 2 line was reduced in price and saw the addition of the GeForce 2 Ti, in order to offer a mid-range alternative to the high-end GeForce 3 product.

Later, both the GeForce 2 and GeForce 2 MX lines were replaced with the GeForce4 MX.

Models

Model | Launch | Code name | Fab [12] | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config [a] | Fillrate (MOperations/s / MPixels/s / MTexels/s / MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type, width (bit) | Performance (GFLOPS FP32) | TDP (W)
GeForce2 MX IGP + nForce 220/420 | June 4, 2001 | NV1A (IGP) / NV11 (MX) | TSMC 180 nm | 20 [13] | 64 | FSB | 175 | 133 | 2:4:2 | 350 / 350 / 700 / 0 | Up to 32 (system RAM) | 2.128 / 4.256 | DDR, 64 / 128 | 0.700 | 3
GeForce2 MX200 | March 3, 2001 | NV11 | TSMC 180 nm | 20 [13] | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 / 350 / 700 / 0 | 32 / 64 | 1.328 | SDR, 64 | 0.700 | 1
GeForce2 MX | June 28, 2000 | NV11 | TSMC 180 nm | 20 [13] | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 / 350 / 700 / 0 | 32 / 64 | 2.656 | SDR, 128 | 0.700 | 4
GeForce2 MX400 | March 3, 2001 | NV11 | TSMC 180 nm | 20 [13] | 64 | AGP 4x, PCI | 200 | 166, 200 (SDR); 166 (DDR) | 2:4:2 | 400 / 400 / 800 / 0 | 32 / 64 | 1.328 / 3.200 / 2.656 | SDR, 64 / 128; DDR, 64 | 0.800 | 5
GeForce2 GTS | April 26, 2000 | NV15 | TSMC 180 nm | 25 [14] | 88 | AGP 4x | 200 | 166 | 4:8:4 | 800 / 800 / 1,600 / 0 | 32 / 64 | 5.312 | DDR, 128 | 1.600 | 6
GeForce2 Pro | December 5, 2000 | NV15 | TSMC 180 nm | 25 [14] | 88 | AGP 4x | 200 | 200 | 4:8:4 | 800 / 800 / 1,600 / 0 | 32 / 64 | 6.4 | DDR, 128 | 1.600 | ?
GeForce2 Ti | October 1, 2001 | NV15 | TSMC 150 nm | 25 [14] | 88 | AGP 4x | 250 | 200 | 4:8:4 | 1,000 / 1,000 / 2,000 / 0 | 32 / 64 | 6.4 | DDR, 128 | 2.000 | ?
GeForce2 Ultra | August 14, 2000 | NV15 | TSMC 180 nm | 25 [14] | 88 | AGP 4x | 250 | 230 | 4:8:4 | 1,000 / 1,000 / 2,000 / 0 | 64 | 7.36 | DDR, 128 | 2.000 | ?

GeForce2 Go mobile GPU series

Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config [a] | Fillrate (MOperations/s / MPixels/s / MTexels/s / MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type, width (bit)
GeForce2 Go 100 | February 6, 2001 | NV11M | AGP 4x | 125 | 332 | 2:0:4:2 | 250 / 250 / 500 / 0 | 8, 16 | 1.328 | DDR, 32
GeForce2 Go | November 11, 2000 | NV11M | AGP 4x | 143 | 166 (SDR), 332 (DDR) | 2:0:4:2 | 286 / 286 / 572 / 0 | 16, 32 | 2.656 | SDR, 128; DDR, 64
GeForce2 Go 200 | February 6, 2001 | NV11M | AGP 4x | 143 | 332 | 2:0:4:2 | 286 / 286 / 572 / 0 | 16, 32 | 2.656 | DDR, 64

Discontinued support

Nvidia GeForce2 Ultra

Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti and Ultra models in 2005 and then with the MX models in 2007.

Final drivers

GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:

GeForce 2 MX & MX x00 Series:

The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.

Note: Despite claims in the documentation that driver 94.24 (released on May 17, 2006) supports the GeForce 2 MX series, it does not; 94.24 actually supports only the GeForce 6 and GeForce 7 series. [15]

Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive

Competing chipsets

See also

References

  1. Ross, Alex (April 26, 2000). "NVIDIA GeForce2 GTS Guide". SharkyExtreme. Archived from the original on August 23, 2004.
  2. Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. p. 2. Retrieved July 2, 2009.
  3. Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. Retrieved June 14, 2008.
  4. Witheiler, Matthew (July 17, 2000). "ATI Radeon 64MB DDR". Anandtech. Retrieved June 14, 2008.
  5. Lal Shimpi, Anand (August 14, 2000). "NVIDIA GeForce 2 Ultra". Anandtech. Retrieved June 14, 2008.
  6. Lal Shimpi, Anand (April 25, 2000). "ATI Radeon 256 Preview (HyperZ)". Anandtech. p. 5. Retrieved June 14, 2008.
  7. "Press Release-NVIDIA". www.nvidia.com. Retrieved April 22, 2018.
  8. https://www.anandtech.com/show/721
  9. https://www.anandtech.com/show/635/13
  10. FastSite (December 27, 2000). "ATI RADEON 32MB SDR Review". X-bit labs. Archived from the original on July 25, 2008. Retrieved June 14, 2008.
  11. https://www.anandtech.com/show/721/12
  12. "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved August 30, 2024.
  13. "NVIDIA GeForce2 MX PCI Specs". TechPowerUp. Retrieved August 30, 2024.
  14. "NVIDIA NV15 GPU Specs | TechPowerUp GPU Database" . Retrieved August 30, 2024.
  15. "Driver Details". NVIDIA.