Fixed-function is a term used to describe the 3D graphics APIs and GPUs designed before the advent of shader-based 3D graphics APIs and GPU architectures, contrasting those earlier designs with their later, programmable counterparts.
Historically, fixed-function APIs consisted of a set of function entry points that mapped approximately or directly to dedicated logic, named for its purpose, in the GPUs designed to support them. As shader-based GPUs and APIs evolved, fixed-function APIs were implemented by graphics driver engineers on top of the more general-purpose shading architecture. This approach served as a transition path: it continued to provide the fixed-function API abstraction most developers were experienced with, while allowing further development and enhancement of the newer shader-based architectures.
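As a rough sketch of what that abstraction looked like in practice, the fragment below uses OpenGL 1.x fixed-function entry points to light and draw a single triangle. It assumes a current OpenGL context created by windowing code that is not shown, and the header name can vary by platform.

```cpp
#include <GL/gl.h>   // header name varies by platform; assumes a current GL 1.x context

/* Light and draw one triangle using only fixed-function state:
   transformation and lighting are configured through named entry
   points rather than programmed by the application. */
void drawLitTriangle(void) {
    const GLfloat lightDir[4] = {0.0f, 0.0f, 1.0f, 0.0f};   /* w = 0: directional light */

    /* Vertex transformation: a built-in matrix stack, not user code. */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* Lighting: a dedicated, named pipeline stage that is switched on
       and parameterised. */
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, lightDir);

    glBegin(GL_TRIANGLES);
    glNormal3f(0.0f, 0.0f, 1.0f);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glVertex3f( 0.5f, -0.5f, 0.0f);
    glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
}
```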
OpenGL, OpenGL ES and DirectX (Direct3D) are all 3D graphics APIs that went through the transition from the fixed-function programming model to the shader-based programming model.[1] The table below shows the versions at which each API made that transition:
| 3D API | Last fixed-function version | First shader version |
|---|---|---|
| OpenGL | v1.5 | v2.0 |
| OpenGL ES | v1.1 | v2.0 |
| DirectX | v7.0 | v8.0 |
Fixed-function APIs tend to present a simpler programming abstraction, with a series of well-defined and specifically named graphics pipeline stages. Shader-based APIs treat graphics data (vertices and pixels/texels) generically and allow a great deal of flexibility in how that data is processed; more sophisticated rendering techniques are possible as a result.
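For comparison, here is a sketch of the shader-based style introduced with OpenGL 2.0: the application supplies small GLSL programs for the vertex and fragment stages instead of toggling named fixed-function state. The shaders below merely reproduce the basic fixed-function transform-and-color behaviour, and the host code assumes a current OpenGL 2.0 context and an extension loader such as GLEW; error checking is omitted.

```cpp
#include <GL/glew.h>  // loads the OpenGL 2.0 entry points; assumes a current GL context

// GLSL 1.10 shaders that emulate basic fixed-function behaviour: transform
// each vertex by the built-in modelview-projection matrix and pass the
// vertex colour through to the fragment stage.
static const char* kVertexSrc =
    "#version 110\n"
    "void main() {\n"
    "    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "    gl_FrontColor = gl_Color;\n"
    "}\n";

static const char* kFragmentSrc =
    "#version 110\n"
    "void main() {\n"
    "    gl_FragColor = gl_Color;\n"
    "}\n";

static GLuint compileShader(GLenum type, const char* src) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &src, 0);
    glCompileShader(shader);            // compile-error checking omitted for brevity
    return shader;
}

// Build a program that stands in for the fixed-function vertex and fragment
// stages; activating it with glUseProgram(program) replaces those stages.
GLuint buildPassthroughProgram(void) {
    GLuint program = glCreateProgram();
    glAttachShader(program, compileShader(GL_VERTEX_SHADER, kVertexSrc));
    glAttachShader(program, compileShader(GL_FRAGMENT_SHADER, kFragmentSrc));
    glLinkProgram(program);
    return program;
}
```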
Microsoft DirectX is a collection of application programming interfaces (APIs) for handling tasks related to multimedia, especially game programming and video, on Microsoft platforms. Originally, the names of these APIs all began with "Direct", such as Direct3D, DirectDraw, DirectMusic, DirectPlay, DirectSound, and so forth. The name DirectX was coined as a shorthand term for all of these APIs and soon became the name of the collection. When Microsoft later set out to develop a gaming console, the X was used as the basis of the name Xbox to indicate that the console was based on DirectX technology. The X initial has been carried forward in the naming of APIs designed for the Xbox such as XInput and the Cross-platform Audio Creation Tool (XACT), while the DirectX pattern has been continued for Windows APIs such as Direct2D and DirectWrite.
OpenGL is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics. The API is typically used to interact with a graphics processing unit (GPU), to achieve hardware-accelerated rendering.
In computer graphics, rasterisation or rasterization is the task of taking an image described in a vector graphics format (shapes) and converting it into a raster image. The rasterized image may then be displayed on a computer display, video display or printer, or stored in a bitmap file format. Rasterization may refer to the technique of drawing 3D models, or to the conversion of 2D rendering primitives such as polygons and line segments into a rasterized format.
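Below is a minimal, self-contained sketch of the idea for a single 2D triangle, using the common edge-function test to decide which pixel centres fall inside the shape; the helper names and the character-based image are purely illustrative.

```cpp
#include <cstdio>
#include <vector>

struct Point { float x, y; };

// Signed-area test: the sign tells which side of edge a->b the point p lies on.
static float edge(const Point& a, const Point& b, const Point& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Fill every pixel whose centre lies inside the (counter-clockwise) triangle.
static void rasterizeTriangle(const Point& v0, const Point& v1, const Point& v2,
                              std::vector<char>& image, int width, int height) {
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            Point p = {x + 0.5f, y + 0.5f};              // sample at the pixel centre
            if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 && edge(v2, v0, p) >= 0)
                image[y * width + x] = '#';
        }
    }
}

int main() {
    const int width = 24, height = 12;
    std::vector<char> image(width * height, '.');
    rasterizeTriangle({2, 1}, {21, 4}, {8, 10}, image, width, height);
    for (int y = 0; y < height; ++y)
        std::printf("%.*s\n", width, &image[y * width]);  // one text row per scanline
    return 0;
}
```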
Direct3D is a graphics application programming interface (API) for Microsoft Windows. Part of DirectX, Direct3D is used to render three-dimensional graphics in applications where performance is important, such as games. Direct3D uses hardware acceleration if it is available on the graphics card, allowing for full or partial hardware acceleration of the 3D rendering pipeline. Direct3D exposes the advanced graphics capabilities of 3D graphics hardware, including Z-buffering, W-buffering, stencil buffering, spatial anti-aliasing, alpha blending, color blending, mipmapping, texture blending, clipping, culling, atmospheric effects, perspective-correct texture mapping, and programmable HLSL shaders and effects. Integration with other DirectX technologies enables Direct3D to deliver such features as video mapping, hardware 3D rendering in 2D overlay planes, and even sprites, providing the use of 2D and 3D graphics in interactive media titles.
A graphics processing unit (GPU) is a specialized electronic circuit designed to manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. GPUs are used in embedded systems, mobile phones, personal computers, workstations, and game consoles.
QuickDraw 3D, or QD3D for short, is a 3D graphics API developed by Apple Inc. starting in 1995, originally for their Macintosh computers, but delivered as a cross-platform system.
Direct3D and OpenGL are competing application programming interfaces (APIs) which can be used in applications to render 2D and 3D computer graphics. As of 2005, graphics processing units (GPUs) almost always implement one version of both of these APIs. Examples include: DirectX 9 and OpenGL 2 circa 2004; DirectX 10 and OpenGL 3 circa 2008; and most recently, DirectX 11 and OpenGL 4 circa 2011. GPUs that support more recent versions of the standards are backwards compatible with applications that use the older standards; for example, one can run older DirectX 9 games on a more recent DirectX 11-certified GPU.
In computer graphics, a shader is a computer program that calculates the appropriate levels of light, darkness, and color during the rendering of a 3D scene - a process known as shading. Shaders have evolved to perform a variety of specialized functions in computer graphics special effects and video post-processing, as well as general-purpose computing on graphics processing units.
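As a sketch of the kind of calculation a simple pixel shader performs, the plain C++ function below (the types and names are illustrative, not from any particular API) computes Lambertian diffuse shading: the surface colour is scaled by how directly the surface faces the light.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Core of a diffuse (Lambertian) shader: brightness is proportional to the
// cosine of the angle between the surface normal and the light direction,
// clamped so surfaces facing away from the light go black rather than negative.
static Vec3 shadePixel(Vec3 normal, Vec3 lightDir, Vec3 surfaceColor) {
    float diffuse = std::max(0.0f, dot(normalize(normal), normalize(lightDir)));
    return {surfaceColor.x * diffuse, surfaceColor.y * diffuse, surfaceColor.z * diffuse};
}

int main() {
    Vec3 c = shadePixel({0, 0, 1}, {0.5f, 0.5f, 1.0f}, {1.0f, 0.2f, 0.2f});
    std::printf("shaded colour: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}
```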
General-purpose computing on graphics processing units is the use of a graphics processing unit (GPU), which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the central processing unit (CPU). The use of multiple video cards in one computer, or large numbers of graphics chips, further parallelizes the already parallel nature of graphics processing.
A shading language is a graphics programming language adapted to programming shader effects. Such languages usually provide special data types, such as "vector", "matrix", "color" and "normal". Due to the variety of target markets for 3D computer graphics, different shading languages have been developed.
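For example, the short GLSL vertex shader below (held in a C++ string constant, as it would be before being handed to the graphics API; the uniform and attribute names are illustrative) shows the vector and matrix types such a language adds on top of a C-like core.

```cpp
// GLSL source illustrating shading-language data types: mat4 matrices,
// vec3/vec4 vectors, and a per-vertex normal that is normalized and lit.
static const char* kLitVertexShader =
    "#version 120\n"
    "uniform mat4 modelViewProjection;\n"
    "uniform vec3 lightDirection;\n"
    "attribute vec3 position;\n"
    "attribute vec3 normal;\n"
    "varying vec4 color;\n"
    "void main() {\n"
    "    float diffuse = max(dot(normalize(normal), normalize(lightDirection)), 0.0);\n"
    "    color       = vec4(vec3(diffuse), 1.0);\n"
    "    gl_Position = modelViewProjection * vec4(position, 1.0);\n"
    "}\n";
```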
Software rendering is the process of generating an image from a model by means of computer software. In the context of computer graphics rendering, software rendering refers to a rendering process that is not dependent upon graphics hardware ASICs, such as a graphics card. The rendering takes place entirely in the CPU. Rendering everything with the (general-purpose) CPU has the main advantage that it is not restricted to the (limited) capabilities of graphics hardware, but the disadvantage is that more transistors are needed to obtain the same speed.
CUDA is a parallel computing platform and application programming interface (API) that allows software to use certain types of graphics processing units (GPUs) for general purpose processing, an approach called general-purpose computing on GPUs (GPGPU). CUDA is a software layer that gives direct access to the GPU's virtual instruction set and parallel computational elements, for the execution of compute kernels.
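A minimal sketch of a CUDA C++ compute kernel (SAXPY, y = a*x + y): each GPU thread, identified by its block and thread indices, handles one array element, and many such threads run in parallel. Managed memory is used here only to keep the example short.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Compute kernel: each thread computes one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x = nullptr, *y = nullptr;
    cudaMallocManaged(&x, n * sizeof(float));   // managed memory keeps the sketch short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int blocks = (n + 255) / 256;
    saxpy<<<blocks, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    std::printf("y[0] = %f\n", y[0]);           // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```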
In the field of 3D computer graphics, the unified shader model refers to a form of shader hardware in a graphical processing unit (GPU) where all of the shader stages in the rendering pipeline have the same capabilities. They can all read textures and buffers, and they use instruction sets that are almost identical.
AMD FireStream was AMD's brand name for their Radeon-based product line targeting stream processing and/or GPGPU in supercomputers. Originally developed by ATI Technologies around the Radeon X1900 XTX in 2006, the product line was previously branded as both ATI FireSTREAM and AMD Stream Processor. The AMD FireStream can also be used as a floating-point co-processor for offloading CPU calculations, which is part of the Torrenza initiative. The FireStream line has been discontinued since 2012, when GPGPU workloads were entirely folded into the AMD FirePro line.
Mantle was a low-overhead rendering API targeted at 3D video games. AMD originally developed Mantle in cooperation with DICE, starting in 2013. Mantle was designed as an alternative to Direct3D and OpenGL, primarily for use on personal computers, although Mantle supports the GPUs present in the PlayStation 4 and in the Xbox One. In 2015, Mantle's public development was suspended and in 2019 completely discontinued, as DirectX 12 and the Mantle-derived Vulkan rose in popularity.
TeraScale is the codename for a family of graphics processing unit microarchitectures developed by ATI Technologies/AMD and their second microarchitecture implementing the unified shader model following Xenos. TeraScale replaced the old fixed-pipeline microarchitectures and competed directly with Nvidia's first unified shader microarchitecture named Tesla.
Metal is a low-level, low-overhead hardware-accelerated 3D graphics and compute shader API created by Apple. It debuted in iOS 8. Metal combines functions similar to OpenGL and OpenCL in one API. It is intended to improve performance by offering low-level access to the GPU hardware for apps on iOS, iPadOS, macOS, and tvOS. It can be compared to low-level APIs on other platforms such as Vulkan and DirectX 12.
In computing, a compute kernel is a routine compiled for high throughput accelerators, separate from but used by a main program. They are sometimes called compute shaders, sharing execution units with vertex shaders and pixel shaders on GPUs, but are not limited to execution on one class of device, or graphics APIs.
This is a glossary of terms relating to computer graphics.
Cg and High-Level Shader Language (HLSL) are two names given to a high-level shading language developed by Nvidia and Microsoft for programming shaders. Cg/HLSL is based on the C programming language and although they share the same core syntax, some features of C were modified and new data types were added to make Cg/HLSL more suitable for programming graphics processing units.
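As an illustration of those added data types, the sketch below holds an HLSL-style vertex shader in a C++ string (the uniform, struct, and semantic names are illustrative): float3, float4 and float4x4 are vector and matrix types layered on a C-like core, and mul() performs the vector-matrix multiply.

```cpp
// HLSL/Cg-style vertex shader source, held in a C++ string as it would be
// before compilation by the graphics runtime.
static const char* kHlslVertexShader =
    "float4x4 worldViewProj;\n"                 // matrix type added to the C-like core
    "\n"
    "struct VSOutput {\n"
    "    float4 position : POSITION;\n"         // vector types with output semantics
    "    float4 color    : COLOR0;\n"
    "};\n"
    "\n"
    "VSOutput main(float3 position : POSITION, float4 color : COLOR0) {\n"
    "    VSOutput output;\n"
    "    output.position = mul(float4(position, 1.0), worldViewProj);\n"
    "    output.color    = color;\n"
    "    return output;\n"
    "}\n";
```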