GeForce 3 series

GeForce 3 series
Release date: February 27, 2001
Codename: NV20
Architecture: Kelvin
Models
  • GeForce 3 series
  • GeForce 3 Ti series
Cards
  • Mid-range: Ti 200
  • High-end: GeForce 3 (original), Ti 500
API support
  • Direct3D 8.0 (Vertex Shader 1.1, Pixel Shader 1.1)
  • OpenGL 1.3
History
  • Predecessor: GeForce 2 (NV15)
  • Successor: GeForce 4 Ti (NV25)
Support status: Unsupported

The GeForce 3 series (NV20) is the third generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in February 2001, [1] it advanced the GeForce architecture by adding programmable pixel and vertex shaders and multisample anti-aliasing, and by improving the overall efficiency of the rendering process.

The GeForce 3 was unveiled during the Macworld Conference & Expo/Tokyo 2001 at Makuhari Messe, where it powered realtime demos of Pixar's Junior Lamp and id Software's Doom 3. Apple later announced launch rights for the card for its new line of computers.

The GeForce 3 family comprises three consumer models: the GeForce 3, the GeForce 3 Ti200, and the GeForce 3 Ti500. A separate professional version, with a feature set tailored for computer-aided design, was sold as the Quadro DCC. A derivative of the GeForce 3, known as the NV2A, is used in the Microsoft Xbox game console.

Architecture

GeForce3 Ti 200 GPU

The GeForce 3 was introduced three months after Nvidia acquired the assets of 3dfx. It was the first Direct3D 8.0-compliant 3D card: its programmable shader architecture, marketed as the nFinite FX Engine, enabled applications to execute custom visual-effects programs written in Microsoft Shader Model 1.1. It is believed that the fixed-function T&L hardware from the GeForce 2 was still included on the chip for use with Direct3D 7.0 applications, as the single vertex shader was not yet fast enough to emulate it. [2] In terms of pure pixel and texel throughput, the GeForce 3 has four pixel pipelines, each of which can sample two textures per clock. This is the same configuration as the GeForce 2, excluding the slower GeForce 2 MX line.

To take better advantage of available memory performance, the GeForce 3 has a memory subsystem dubbed Lightspeed Memory Architecture (LMA). This is composed of several mechanisms that reduce overdraw, conserve memory bandwidth by compressing the z-buffer (depth buffer) and better manage interaction with the DRAM.
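One of LMA's overdraw-reduction mechanisms, z-occlusion culling, can be sketched as follows. This is an illustrative software model, not Nvidia's hardware implementation: fragments that fail the depth test are rejected before shading, so no shading work or framebuffer traffic is spent on them.

```python
import math
from collections import defaultdict

def rasterize(fragments, depth_buffer):
    """Depth-test each fragment before shading; return how many were shaded."""
    shaded = 0
    for x, y, z, shade in fragments:
        if z >= depth_buffer[(x, y)]:  # occluded: rejected before shading,
            continue                   # saving shader work and memory bandwidth
        depth_buffer[(x, y)] = z       # nearer fragment wins the depth test
        shade(x, y)
        shaded += 1
    return shaded

# Two fragments land on the same pixel; only the nearer one is shaded.
frags = [(0, 0, 0.2, lambda x, y: None),   # near surface, drawn first
         (0, 0, 0.8, lambda x, y: None)]   # far surface, culled by the z-test
zbuf = defaultdict(lambda: math.inf)
print(rasterize(frags, zbuf))  # 1
```

In real hardware the rejection happens per tile against a compressed z-buffer, which is what makes the bandwidth saving possible.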

Other architectural changes include EMBM support [3] [4] (first introduced by Matrox in 1999) and improvements to anti-aliasing functionality. Previous GeForce chips could perform only super-sampled anti-aliasing (SSAA), a demanding process that renders the image internally at a larger size and then scales it down to the final output resolution.

GeForce 3 adds multisample anti-aliasing (MSAA) and Quincunx anti-aliasing, both of which perform significantly better than super-sampling at the expense of quality. With multi-sampling, the render output units super-sample only the Z-buffers and stencil buffers, and use that information to determine whether a pixel covers more than one polygonal object. This spares the pixel/fragment pipeline from rendering multiple fragments for pixels where a single object covers all of the sub-pixels. The method fails with texture maps that have varying transparency (e.g. a texture map representing a chain-link fence). Quincunx anti-aliasing is a blur filter that shifts the rendered image half a pixel up and half a pixel left to create sub-pixels, which are then averaged together in a diagonal cross pattern, smoothing jagged edges but also destroying some overall image detail.

Finally, the GeForce 3's texture sampling units were upgraded to support 8-tap anisotropic filtering, compared to the previous limit of 2-tap on the GeForce 2. With 8-tap anisotropic filtering enabled, distant textures are noticeably sharper.
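The diagonal-cross averaging behind Quincunx can be sketched as below. The weights (1/2 for the centre sample, 1/8 for each diagonal neighbour) are the commonly cited values, assumed here for illustration; Nvidia did not publish the exact filter.

```python
def quincunx(samples, x, y):
    """Blend a centre sample with its four diagonal neighbours (quincunx pattern)."""
    centre = samples[y][x]
    corners = (samples[y - 1][x - 1], samples[y - 1][x + 1],
               samples[y + 1][x - 1], samples[y + 1][x + 1])
    return 0.5 * centre + 0.125 * sum(corners)

# A uniform region is unchanged; edges between regions get blurred.
grid = [[4.0] * 3 for _ in range(3)]
print(quincunx(grid, 1, 1))  # 4.0
```

Because the filter always mixes in neighbouring samples, it softens jagged polygon edges and fine texture detail alike, which matches the quality trade-off described above.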

A derivative of the GeForce 3, known as the NV2A, is used in the Microsoft Xbox game console. It is clocked the same as the original GeForce 3 but features an additional vertex shader. [5] [6] [7] [8] [9]

Performance

The GeForce 3 GPU (NV20) has the same theoretical pixel and texel throughput per clock as the GeForce 2 (NV15). The GeForce 2 Ultra is clocked 25% faster than the original GeForce 3 and 43% faster than the Ti200; this means that in select cases, such as Direct3D 7 T&L benchmarks, the GeForce 2 Ultra and sometimes even the GTS can outperform the GeForce 3 and Ti200, because the newer GPUs use the same fixed-function T&L unit but are clocked lower. [10] The GeForce 2 Ultra also has considerable raw memory bandwidth available to it, matched only by the GeForce 3 Ti500. However, in anti-aliasing performance the GeForce 3 is clearly superior because of its MSAA support and its more efficient management of memory bandwidth and fillrate.
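The percentages quoted above follow directly from the core clocks given in this article:

```python
# Core clocks (MHz) as listed in this article.
gf2_ultra, gf3, ti200 = 250, 200, 175

ultra_vs_gf3 = gf2_ultra / gf3 - 1      # 0.25  -> "25% faster"
ultra_vs_ti200 = gf2_ultra / ti200 - 1  # ~0.43 -> "43% faster"
print(f"{ultra_vs_gf3:.0%} {ultra_vs_ti200:.0%}")  # 25% 43%
```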

When comparing shading capabilities with the Radeon 8500, reviewers noted superior precision on the ATI card. [11]

Product positioning

Nvidia refreshed the lineup in October 2001 with the release of the GeForce 3 Ti200 and Ti500. This coincided with ATI's releases of the Radeon 7500 and Radeon 8500. The Ti500 has higher core and memory clocks (240 MHz core / 250 MHz RAM) than the original GeForce 3 (200 MHz / 230 MHz) and generally matches the Radeon 8500. The Ti200 was the slowest and lowest-priced GeForce 3; though clocked lower (175 MHz / 200 MHz), it surpasses the Radeon 7500 in speed and feature set, although it lacks the Radeon's dual-monitor support.

The original GeForce3 and the Ti500 were released only in 64 MiB configurations, while the Ti200 was also available in 128 MiB versions.

The GeForce 4 Ti (NV25), introduced in April 2002, was a revision of the GeForce 3 architecture. [12] [13] It was very similar to the GeForce 3; the main differences were higher core and memory speeds, a revised memory controller, improved vertex and pixel shaders, hardware anti-aliasing and DVD playback. Proper dual-monitor support was also brought over from the GeForce 2 MX. With the GeForce 4 Ti 4600 as the new flagship, this was the beginning of the end for the GeForce 3 Ti 500, which was already difficult to produce because of poor yields; it was later completely replaced by the much cheaper but similarly performing GeForce 4 Ti 4200. [14] Announced at the same time was the GeForce 4 MX (NV17), which despite its name was closer in architecture and feature set to the GeForce 2 (NV11 and NV15). [15]

The GeForce 3 Ti200 was kept in production for a short while, occupying a niche between the (delayed) GeForce 4 Ti4200 and the GeForce 4 MX460: it performed on par with the DirectX 7.0-compliant MX460 while offering full DirectX 8.0 support, although it could not drive dual monitors. [16] However, ATI released the Radeon 8500LE (a lower-clocked version of the 8500), which outperformed both the Ti200 and the MX460. ATI's move in turn compelled Nvidia to roll out the Ti4200 earlier than planned, at a price similar to the MX460's, and to discontinue the Ti200 by summer 2002 because of naming confusion with the GeForce 4 MX and Ti lines. The GeForce 3 Ti200 still outperformed the Radeon 9000 (RV250), introduced around the time of the Ti200's discontinuation; unlike the 8500LE, which was simply a lower-clocked 8500, the 9000 was a major redesign intended to reduce cost and power usage. [17]

Specifications

All GeForce 3 models share the NV20 GPU (57 million transistors, 128 mm² die), an AGP 4x/PCI bus interface, a 4:1:8:4 core configuration [a], and a 128-bit DDR memory bus.

| Model | Launch | Core clock (MHz) | Memory clock (MHz) | Fillrate (MOperations/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | MVertices/s | Memory size (MB) | Bandwidth (GB/s) | GFLOPS (FP32) | TDP (W) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce3 Ti200 | October 1, 2001 | 175 | 200 | 700 | 700 | 1400 | 43.75 | 64, 128 | 6.4 | 8.75 | ? |
| GeForce3 | February 27, 2001 | 200 | 230 | 800 | 800 | 1600 | 50 | 64 | 7.36 | 10.00 | ? |
| GeForce3 Ti500 | October 1, 2001 | 240 | 250 | 960 | 960 | 1920 | 60 | 64, 128 | 8.0 | 12.00 | 29 |

[a] Core config: pixel pipelines : vertex shaders : texture mapping units : render output units.
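The fillrate and bandwidth figures follow from the clocks and the core configuration (four pixel pipelines sampling two textures each, on a 128-bit DDR memory bus); a quick check:

```python
def throughput(core_mhz, mem_mhz, pipelines=4, textures_per_pipe=2, bus_bits=128):
    """Theoretical throughput for an NV20-style configuration."""
    mpixels = core_mhz * pipelines                 # MPixels/s
    mtexels = mpixels * textures_per_pipe          # MTexels/s
    gbytes = mem_mhz * 2 * (bus_bits // 8) / 1000  # GB/s (DDR: 2 transfers/clock)
    return mpixels, mtexels, gbytes

for name, core, mem in [("Ti200", 175, 200), ("GeForce3", 200, 230), ("Ti500", 240, 250)]:
    print(name, throughput(core, mem))
# Ti200 (700, 1400, 6.4) / GeForce3 (800, 1600, 7.36) / Ti500 (960, 1920, 8.0)
```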

Discontinued support

Nvidia has ceased driver support for the GeForce 3 series.

Nvidia GeForce3 Ti 500

Final drivers

  • Windows 95/98/Me: driver version 81.98, the last release Nvidia ever made for these operating systems.

The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.

Note: Despite claims in the documentation that driver 94.24 (released on May 17, 2006) supports the GeForce 3 series, it does not; 94.24 actually supports only the GeForce 6 and GeForce 7 series. [18]


Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive
Unix Driver Archive

See also

  • GeForce 256
  • GeForce 2 series
  • GeForce 4 series
  • GeForce 6 series
  • GeForce 7 series
  • GeForce FX series
  • Graphics processing unit
  • Matrox G400
  • Matrox Parhelia
  • Radeon R100 series
  • Radeon R200 series
  • Radeon R300 series
  • Radeon X1000 series
  • Voodoo 5
  • TeraScale

References

  1. "NVIDIA GeForce3 Roundup - July 2001".
  2. "Programming at Last".
  3. Labs, iXBT. "April 2002 3Digest - NVIDIA GeForce3". iXBT Labs.
  4. "GeForce RTX 20 Series Graphics Cards and Laptops".
  5. "Inside the Xbox GPU". GameSpot. www.gamespot.com/articles/inside-the-xbox-gpu/1100-2764159/
  6. "Microsoft clarify NV2A". Eurogamer.net. March 28, 2001. Archived from the original on March 20, 2020. Retrieved October 10, 2024.
  7. Graphics Processor Specifications, IGN, 2001
  8. "Anandtech Microsoft's Xbox". Anandtech.com. Archived from the original on November 4, 2010. Retrieved October 10, 2024.
  9. Smith, Tony (February 14, 2001). "TSMC starts fabbing Nvidia Xbox chips". The Register . Retrieved October 10, 2024.
  10. "NVIDIA's GeForce3 graphics processor". techreport.com. June 26, 2001. Retrieved June 25, 2017.
  11. "ATI's Radeon 8500: Off the beaten path". techreport.com. December 31, 2001. Retrieved June 25, 2017.
  12. "A look at NVIDIA's GeForce4 chips". February 6, 2002.
  13. "ActiveWin.Com: NVIDIA GeForce 4 Ti 4600 - Review". www.activewin.com.
  14. Tom's Hardware. https://www.tomshardware.com/reviews/pc-graphics-xbox,423-19.html
  15. Lal Shimpi, Anand (February 6, 2002). "Nvidia GeForce4 - NV17 and NV25 Come to Life". AnandTech. Retrieved October 1, 2024.
  16. Tom's Hardware. https://www.tomshardware.com/reviews/pc-graphics-xbox,423-18.html
  17. Worobyew, Andrew.; Medvedev, Alexander. "Nvidia GeForce4 Ti 4400 and GeForce4 Ti 4600 (NV25) Review". Pricenfees. Archived from the original on October 12, 2018. Retrieved October 1, 2024.
  18. "Driver Details". NVIDIA.