HyperZ is the brand name for a set of depth-buffer processing techniques developed by ATI Technologies, and later Advanced Micro Devices, and implemented in their Radeon GPUs. HyperZ was announced in November 2000 [1] and was still present in the TeraScale-based Radeon HD 2000 series [2][3] as well as in current Graphics Core Next-based graphics products. [4]
On the Radeon R100-based cores (Radeon DDR through 7500), where HyperZ debuted, ATI claimed a 20% improvement in overall rendering efficiency. The company stated that with HyperZ the Radeon could be said to offer 1.5 gigatexels per second of fillrate rather than the card's theoretical rate of 1.2 gigatexels. Testing showed that HyperZ did indeed offer a tangible performance improvement, one that allowed the otherwise less powerful Radeon to keep pace with the less efficient GeForce 2 GTS. [5]
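As a rough check on those figures, a 20% gain over the theoretical rate works out to 1.2 Gtexels/s × 1.20 ≈ 1.44 Gtexels/s, broadly in line with the rounded 1.5 Gtexels/s effective figure ATI quoted.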
HyperZ consists of three mechanisms: Z compression, a lossless compression scheme for depth-buffer data that reduces memory bandwidth; fast Z clear, which clears the depth buffer by flagging blocks as empty rather than writing every value; and hierarchical Z, a coarse-grained early depth test that rejects blocks of hidden pixels before they are shaded.
With each new microarchitecture, ATI has revised and improved the technology.
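As a rough illustration of the hierarchical Z and fast Z clear ideas, the following sketch (plain C with a hypothetical tile layout and invented function names, not ATI's actual hardware logic) rejects or accepts fragments tile by tile:

```c
#include <stdbool.h>

#define TILE 8                /* hypothetical tile size; real hardware differs */
#define FAR_DEPTH 1.0f        /* depth value meaning "nothing drawn here yet"  */

/* One screen tile: full-resolution depths plus a conservative bound on the
 * farthest depth present.  Smaller values are closer to the camera. */
typedef struct {
    float depth[TILE][TILE];
    float max_depth;          /* always >= every value in depth[][]               */
    bool  cleared;            /* fast-Z-clear flag: contents implicitly FAR_DEPTH */
} ZTile;

/* Fast Z clear: clearing the whole buffer only flips one flag per tile
 * instead of writing TILE*TILE depth values. */
static void tile_fast_clear(ZTile *t)
{
    t->cleared = true;
    t->max_depth = FAR_DEPTH;
}

/* Materialize the depths the first time a cleared tile is actually touched. */
static void tile_resolve_clear(ZTile *t)
{
    if (!t->cleared)
        return;
    for (int y = 0; y < TILE; y++)
        for (int x = 0; x < TILE; x++)
            t->depth[y][x] = FAR_DEPTH;
    t->cleared = false;
}

/* Hierarchical Z: if the nearest point of an incoming primitive is still
 * behind everything already in the tile, all of its fragments in that tile
 * can be rejected with a single comparison. */
static bool tile_is_occluded(const ZTile *t, float primitive_min_depth)
{
    return primitive_min_depth >= t->max_depth;
}

/* Per-pixel depth test, used only when the coarse test cannot reject. */
static bool depth_test(ZTile *t, int x, int y, float z)
{
    tile_resolve_clear(t);
    if (z >= t->depth[y][x])
        return false;         /* hidden behind what is already stored */
    t->depth[y][x] = z;       /* visible: record the closer depth     */
    /* max_depth is left as a stale but still conservative bound; real
     * hardware refreshes it lazily rather than rescanning the tile.   */
    return true;
}
```

The benefit comes from keeping the per-tile metadata small enough to consult cheaply, so most occluded work never has to read or write the full-resolution depth buffer.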
A depth buffer, also known as a z-buffer, is a type of data buffer used in computer graphics to represent the depth of objects in 3D space from a particular perspective. The depth is stored as a height map of the scene, with each value representing a distance to the camera, 0 being the closest. The encoding scheme may be flipped, with the highest number being the value closest to the camera. Depth buffers aid in rendering a scene by ensuring that the correct polygons properly occlude other polygons. Z-buffering was first described in 1974 by Wolfgang Straßer in his PhD thesis on fast algorithms for rendering occluded objects. A similar solution to determining overlapping polygons is the painter's algorithm, which can handle non-opaque scene elements, but at the cost of efficiency and with the possibility of incorrect results.
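A minimal sketch of that per-pixel test in C, using the convention above (0.0 nearest, 1.0 farthest); the buffer layout and function name are illustrative rather than any particular API:

```c
#include <stddef.h>

/* Keep a fragment only if it is nearer than what the depth buffer already
 * holds at that pixel; otherwise it is occluded and discarded.  Depths use
 * the convention from the text: 0.0 is closest to the camera, 1.0 farthest,
 * and the buffer is cleared to 1.0 before each frame. */
void write_fragment(float *depth_buf, unsigned *color_buf, size_t width,
                    size_t x, size_t y, float z, unsigned color)
{
    size_t i = y * width + x;
    if (z < depth_buf[i]) {   /* nearer than the stored surface  */
        depth_buf[i] = z;     /* remember the new nearest depth  */
        color_buf[i] = color; /* and the color that goes with it */
    }
}
```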
The GeForce FX or "GeForce 5" series is a line of graphics processing units from the manufacturer Nvidia.
ATI Technologies Inc., commonly called ATI, was a Canadian semiconductor technology corporation based in Markham, Ontario, that specialized in the development of graphics processing units and chipsets. Founded in 1985, the company listed publicly in 1993 and was acquired by AMD in 2006. As a major fabrication-less or fabless semiconductor company, ATI conducted research and development in-house and outsourced the manufacturing and assembly of its products. With the decline and eventual bankruptcy of 3dfx in 2000, ATI and its chief rival Nvidia emerged as the two dominant players in the graphics processors industry, eventually forcing other manufacturers into niche roles.
The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.
The GeForce 3 series (NV20) is the third generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in February 2001, it advanced the GeForce architecture by adding programmable pixel and vertex shaders, multisample anti-aliasing and improved the overall efficiency of the rendering process.
The GeForce 4 series refers to the fourth generation of Nvidia's GeForce line of graphics processing units (GPUs). There are two different GeForce4 families, the high-performance Ti family, and the budget MX family. The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002; members within each family were differentiated by core and memory clock speeds. In late 2002, there was an attempt to form a fourth family, also for the laptop market, the only member of it being the GeForce4 4200 Go (NV28M) which was derived from the Ti line.
Radeon is a brand of computer products, including graphics processing units, random-access memory, RAM disk software, and solid-state drives, produced by Radeon Technologies Group, a division of AMD. The brand was launched in 2000 by ATI Technologies, which was acquired by AMD in 2006 for US$5.4 billion.
Core Image is a pixel-accurate, near-realtime, non-destructive image processing technology in Mac OS X. Implemented as part of the QuartzCore framework of Mac OS X 10.4 and later, Core Image provides a plugin-based architecture for applying filters and effects within the Quartz graphics rendering layer. The framework was later added to iOS in iOS 5.
The R420 GPU, developed by ATI Technologies, was the company's basis for its 3rd-generation DirectX 9.0/OpenGL 2.0-capable graphics cards. Used first on the Radeon X800, the R420 was produced on a 0.13 micrometer low-K photolithography process and used GDDR-3 memory. The chip was designed for AGP graphics cards.
The Voodoo 5 was the last and most powerful graphics card line that was released by 3dfx Interactive. All members of the family were based upon the VSA-100 graphics processor. Only the single-chip Voodoo 4 4500 and dual-chip Voodoo 5 5500 made it to market.
Mesa, also called Mesa3D and The Mesa 3D Graphics Library, is an open source implementation of OpenGL, Vulkan, and other graphics API specifications. Mesa translates these specifications to vendor-specific graphics hardware drivers.
The R200 is the second generation of GPUs used in Radeon graphics cards and developed by ATI Technologies. This GPU features 3D acceleration based upon Microsoft Direct3D 8.1 and OpenGL 1.3, a major improvement in features and performance compared to the preceding Radeon R100 design. The GPU also includes 2D GUI acceleration, video acceleration, and multiple display outputs. "R200" refers to the development codename of the initially released GPU of the generation. It is the basis for a variety of other succeeding products.
The R520 is a graphics processing unit (GPU) developed by ATI Technologies and produced by TSMC. It was the first GPU produced using a 90 nm photolithography process.
The R300 GPU, introduced in August 2002 and developed by ATI Technologies, is its third generation of GPU used in Radeon graphics cards. This GPU features 3D acceleration based upon Direct3D 9.0 and OpenGL 2.0, a major improvement in features and performance compared to the preceding R200 design. R300 was the first fully Direct3D 9-capable consumer graphics chip. The processors also include 2D GUI acceleration, video acceleration, and multiple display outputs.
The Radeon R100 is the first generation of Radeon graphics chips from ATI Technologies. The line features 3D acceleration based upon Direct3D 7.0 and OpenGL 1.3, with all but the entry-level versions offloading host geometry calculations to a hardware transform and lighting (T&L) engine, a major improvement in features and performance compared to the preceding Rage design. The processors also include 2D GUI acceleration, video acceleration, and multiple display outputs. "R100" refers to the development codename of the initially released GPU of the generation. It is the basis for a variety of other succeeding products.
The Matrox Parhelia-512 is a graphics processing unit (GPU) released by Matrox in 2002. It has full support for DirectX 8.1 and incorporates several DirectX 9.0 features. At the time of its release, it was best known for its ability to drive three monitors and its Coral Reef tech demo.
The Radeon R700 is the engineering codename for a graphics processing unit series developed by Advanced Micro Devices under the ATI brand name. The foundation chip, codenamed RV770, was announced and demonstrated on June 16, 2008 as part of the FireStream 9250 and Cinema 2.0 initiative launch media event, with official release of the Radeon HD 4800 series on June 25, 2008. Other variants include enthusiast-oriented RV790, mainstream product RV730, RV740 and entry-level RV710.
The All-in-Wonder was a combination graphics card/TV tuner card designed by ATI Technologies. It was introduced on November 11, 1996. ATI had previously used the Wonder trademark on other graphics cards; however, those were not full TV/graphics combo cards. ATI also made other TV-oriented cards and a remote control that used the Wonder name. The All-in-Wonder line debuted with the Rage chipset series. The cards were available in two forms, built either by third-party manufacturers or by ATI itself.