Deferred shading

[Image: Diffuse Color G-Buffer]
[Image: Z-Buffer]
[Image: Surface Normal G-Buffer]
[Image: Final compositing (to calculate the shadows shown in this image, other techniques such as shadow mapping, shadow feelers or a shadow volume must be used together with deferred shading)]

In the field of 3D computer graphics, deferred shading is a screen-space shading technique that is performed in a second rendering pass, after the vertex and pixel shaders have run. [2] It was first suggested by Michael Deering in 1988. [3]


In the first pass of a deferred shader, only the data required for shading computation is gathered. Positions, normals, and materials for each surface are rendered into the geometry buffer (G-buffer) using "render to texture". After this, a pixel shader computes the direct and indirect lighting at each pixel in screen space, using the information in the texture buffers.
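The two passes above can be sketched in plain Python. This is a minimal, hypothetical illustration, not a real GPU implementation: the function names (`geometry_pass`, `lighting_pass`), the tiny 2×2 framebuffer, and the single directional light are all assumptions made for clarity. The key property it demonstrates is that the lighting pass reads only the G-buffer, so its cost does not depend on scene complexity.

```python
import math

WIDTH, HEIGHT = 2, 2  # a tiny framebuffer for illustration

def geometry_pass(scene):
    """First pass: write per-surface attributes into the G-buffer.
    Here 'scene' is already a per-pixel grid of surface attributes."""
    gbuffer = []
    for y in range(HEIGHT):
        row = []
        for x in range(WIDTH):
            surf = scene[y][x]
            row.append({
                "albedo": surf["albedo"],   # diffuse color
                "normal": surf["normal"],   # world-space normal
                "depth":  surf["depth"],    # used to reconstruct position
            })
        gbuffer.append(row)
    return gbuffer

def lighting_pass(gbuffer, light_dir, light_color):
    """Second pass: shade each pixel using only G-buffer data."""
    mag = math.sqrt(sum(c * c for c in light_dir))
    l = tuple(c / mag for c in light_dir)  # normalized light direction
    image = []
    for row in gbuffer:
        out_row = []
        for px in row:
            # Lambertian term: max(0, N . L)
            ndotl = max(0.0, sum(a * b for a, b in zip(px["normal"], l)))
            out_row.append(tuple(a * c * ndotl
                                 for a, c in zip(px["albedo"], light_color)))
        image.append(out_row)
    return image
```

A surface facing the light head-on receives its full albedo; adding more geometry to the scene changes only the geometry pass, never the per-light cost of the lighting pass.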

Screen space directional occlusion [4] can be made part of the deferred shading pipeline to give directionality to shadows and interreflections.

Advantages

The primary advantage of deferred shading is the decoupling of scene geometry from lighting. Only one geometry pass is required, and each light is computed only for the pixels it actually affects. This makes it possible to render many lights in a scene without a significant performance hit. [5] Other claimed advantages include simpler management of complex lighting resources, easier management of other complex shader resources, and a simpler software rendering pipeline.
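The decoupling can be made concrete with a back-of-the-envelope cost comparison. All numbers here are hypothetical (resolution, overdraw factor, light count, and per-light screen coverage are assumptions for illustration): naive forward shading evaluates every light for every rasterized fragment, while deferred shading pays for the geometry once and then evaluates each light only over the pixels it covers.

```python
pixels = 1920 * 1080      # screen resolution (assumed)
overdraw = 3              # average fragments rasterized per pixel (assumed)
num_lights = 100          # dynamic lights in the scene (assumed)
light_coverage = 0.02     # fraction of the screen each small light affects

# Naive forward shading: every fragment is shaded by every light.
forward_cost = pixels * overdraw * num_lights

# Deferred shading: one geometry pass writes the G-buffer, then each
# light shades only the pixels inside its screen-space bounds.
deferred_cost = pixels * overdraw + num_lights * pixels * light_coverage

print(forward_cost / deferred_cost)  # roughly 60x fewer shading operations here
```

The exact ratio depends entirely on the assumed numbers, but the shape of the formula is the point: forward cost scales with geometry × lights, deferred cost with geometry + lights × coverage.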

Disadvantages

One key disadvantage of deferred rendering is its inability to handle transparency within the algorithm, although this problem is generic to Z-buffered scenes and tends to be handled by delaying and sorting the rendering of transparent portions of the scene. [6] Depth peeling can be used to achieve order-independent transparency in deferred rendering, but at the cost of additional batches and G-buffer size. Modern hardware, supporting DirectX 10 and later, is often capable of performing batches fast enough to maintain interactive frame rates. When order-independent transparency is desired (commonly for consumer applications), deferred shading is no less effective than forward shading using the same technique.

Another serious disadvantage is the difficulty of using multiple materials. It is possible to use many different materials, but doing so requires more data to be stored in the G-buffer, which is already quite large and consumes a large amount of memory bandwidth. [7]

One more disadvantage is that, because the lighting stage is separated from the geometry stage, hardware anti-aliasing no longer produces correct results: interpolated subsamples would yield nonsensical position, normal, and tangent attributes. One common technique to overcome this limitation is applying edge detection to the final image and then blurring over the edges. [8] More advanced post-process edge-smoothing techniques have since been developed, such as MLAA [9] [10] (used in Killzone 3 and Dragon Age II, among others), FXAA [11] (used in Crysis 2, FEAR 3, Duke Nukem Forever), SRAA, [12] DLAA [13] (used in Star Wars: The Force Unleashed II), and post-process MSAA (used in Crysis 2 as its default anti-aliasing solution). Although it is not an edge-smoothing technique, temporal anti-aliasing (used in Halo: Reach and Unreal Engine) can also help give edges a smoother appearance. [14] DirectX 10 introduced features allowing shaders to access individual samples in multisampled render targets (and depth buffers in version 10.1), giving users of this API access to hardware anti-aliasing in deferred shading. These features also allow them to correctly apply HDR luminance mapping to anti-aliased edges, whereas in earlier versions of the API any benefit of anti-aliasing might have been lost.
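The edge-detect-and-blur workaround described above can be sketched as follows. This is a simplified, hypothetical version operating on a grayscale image and a depth buffer stored as nested Python lists; the discontinuity threshold and the 3×3 box-blur kernel are illustrative choices, not values from any particular engine.

```python
def find_depth_edges(depth, threshold=0.1):
    """Flag pixels that sit on a depth discontinuity (a geometric edge)."""
    h, w = len(depth), len(depth[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # compare against right and bottom neighbors
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    edges[y][x] = edges[ny][nx] = True
    return edges

def blur_edges(image, edges):
    """Apply a 3x3 box blur only to flagged edge pixels of the final image."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not edges[y][x]:
                continue  # interior pixels are left untouched
            total, count = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

Detecting edges in the depth (or normal) buffer rather than the color image is what makes this cheap in a deferred pipeline: the G-buffer already holds the data needed to find silhouettes.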

Deferred lighting

Deferred lighting (also known as light pre-pass) is a modification of deferred shading. [15] The technique uses three passes instead of deferred shading's two. In the first pass over the scene geometry, only the normals and the specular spread factor are written to the color buffer. The screen-space "deferred" pass then accumulates diffuse and specular lighting data separately, so a final pass must be made over the scene geometry to output the fully shaded image. The apparent advantage of deferred lighting is a dramatic reduction in the size of the G-buffer. The obvious cost is the need to render the scene geometry twice instead of once. An additional cost is that the deferred pass in deferred lighting must output diffuse and specular irradiance separately, whereas the deferred pass in deferred shading need only output a single combined radiance value.

Because it reduces the size of the G-buffer, this technique can partially overcome one serious disadvantage of deferred shading: support for multiple materials. Another problem that can be mitigated is MSAA; deferred lighting can be used with MSAA on DirectX 9 hardware.[ citation needed ]

Deferred lighting in commercial games

Use of the technique has increased in video games because of the control it gives over large numbers of dynamic lights and the reduced complexity of the required shader instructions. Some examples of games using deferred lighting are:

Deferred shading in commercial games

In comparison to deferred lighting, this technique is not very popular[ citation needed ] due to its high memory size and bandwidth requirements, especially on seventh-generation consoles, where graphics memory size and bandwidth are limited and often bottlenecks.

Game engines featuring deferred shading or rendering techniques

History

The idea of deferred shading was originally introduced by Michael Deering and his colleagues in a paper [3] published in 1988 titled The triangle processor and normal vector shader: a VLSI system for high performance graphics. Although the paper never uses the word "deferred", a key concept is introduced: each pixel is shaded only once, after depth resolution. Deferred shading as we know it today, using G-buffers, was introduced in a 1990 paper by Saito and Takahashi, [56] although they, too, do not use the word "deferred". The first deferred-shaded video game was Shrek, an Xbox launch title shipped in 2001. [57] Around 2004, implementations on commodity graphics hardware started to appear. [58] The technique later gained popularity for applications such as video games, finally becoming mainstream around 2008 to 2010. [59]


References

  1. Hargreaves, Shawn; Harris, Mark (2004). "6800 Leagues Under the Sea: Deferred Shading" (PDF). Nvidia . Archived (PDF) from the original on November 22, 2009. Retrieved January 6, 2021.
  2. "Forward Rendering vs. Deferred Rendering".
  3. Deering, Michael; Stephanie Winner; Bic Schediwy; Chris Duffy; Neil Hunt (1988). "The triangle processor and normal vector shader: A VLSI system for high performance graphics". ACM SIGGRAPH Computer Graphics. 22 (4): 21–30. doi:10.1145/378456.378468.
  4. O'Donnell, Yuriy (July 18, 2011). "Deferred Screen Space Directional Occlusion". kayru.org. Archived from the original on October 22, 2012.
  5. Kayi, Celal Cansin. "Deferred Rendering in XNA 4" (PDF). Linnaeus University . Archived (PDF) from the original on August 13, 2013. Retrieved January 6, 2021.
  6. "SDK 9.51 – Featured Code Samples". Nvidia . January 17, 2007. Archived from the original on March 8, 2005. Retrieved March 28, 2007.
  7. Engel, Wolfgang (March 16, 2008). "Light Pre-Pass Renderer". Diary of a Graphics Programmer. Archived from the original on April 7, 2008. Retrieved January 6, 2021.
  8. "Deferred shading tutorial" (PDF). Pontifical Catholic University of Rio de Janeiro. Archived from the original (PDF) on March 6, 2009. Retrieved February 14, 2008.
  9. "MLAA: Efficiently Moving Antialiasing from the GPU to the CPU" (PDF). Intel . Retrieved December 2, 2018.
  10. "Morphological antialiasing and topological reconstruction" (PDF). Gustave Eiffel University . Archived (PDF) from the original on April 3, 2012. Retrieved January 6, 2021.
  11. "Archived copy" (PDF). Archived from the original (PDF) on November 25, 2011. Retrieved November 7, 2011.{{cite web}}: CS1 maint: archived copy as title (link)
  12. Chajdas, Matthäus G.; McGuire, Morgan; Luebke, David (February 1, 2011). "Subpixel Reconstruction Antialiasing". Nvidia . Archived from the original on January 27, 2011. Retrieved January 6, 2021.
  13. Andreev, Dmitry (2011). "Anti-Aliasing from a Different Perspective". and.intercon.ru. Archived from the original on April 4, 2011. Retrieved January 6, 2021.
  14. Andreev, Dmitry (March 4, 2011). "Anti-Aliasing from a Different Perspective (GDC 2011 Extended Slides)". and.intercon.ru. Archived from the original on April 5, 2011. Retrieved January 6, 2021.
  15. "Real-Time Rendering · Deferred lighting approaches". realtimerendering.com.
  16. "Assassin's Creed III: The Redesigned Anvil Engine". www.GameInformer.com. Archived from the original on March 30, 2012.
  17. "BioShock Infinite development is PS3 focused and uses Uncharted 2 tech". blorge.com. Archived from the original on October 3, 2011.
  18. Chetan Jags (July 18, 2023). "BlackMesa XenEngine: Part 4 – Lighting & Shadows". chetanjags.wordpress.com. Retrieved September 18, 2023.
  19. "Tech Interview: Crackdown 2". Eurogamer.net. June 26, 2010.
  20. guest11b095. "A Bit More Deferred Cry Engine3". slideshare.net.
  21. "Dead Space by Electronic Arts". NVIDIA. Retrieved February 14, 2008.
  22. "Face-Off: Dead Space 2". Eurogamer . Retrieved February 1, 2010.
  23. "Face-Off: Dead Space 3". Eurogamer . Retrieved February 18, 2013.
  24. "Google Translate". google.com.
  25. "GregaMan, Manage Blog". capcom-unity.com.
  26. "Normals". Imgur.
  27. "Tech Interview: Halo: Reach". Eurogamer.net. December 11, 2010.
  28. 1 2 "Tech Analysis: Metal Gear Solid 5's FOX Engine". Eurogamer.net. April 5, 2013.
  29. "Archived copy" (PDF). Archived from the original (PDF) on September 15, 2011. Retrieved July 12, 2011.{{cite web}}: CS1 maint: archived copy as title (link)
  30. "The Making of Shift 2 Unleashed Article • Page 2 • Eurogamer.net". Eurogamer.net. May 14, 2011.
  31. "StarCraft II Effects & techniques" (PDF). AMD. Retrieved July 9, 2012.
  32. "CGSociety Maintenance". cgsociety.org. Archived from the original on April 2, 2015. Retrieved July 12, 2011.
  33. "Deferred Rendering « PlatinumGames Inc". platinumgames.com. Archived from the original on November 27, 2010.
  34. "Ghost of Tsushima Analysis: A PS4 Graphics Powerhouse". gamingbolt.com.
  35. Silard Šimon. "Frictional Games interview". playsomnia.com.
  36. DICE. "SPU-Based Deferred Shading in BATTLEFIELD 3 for Playstation 3". slideshare.net.
  37. "Valve Developer Wiki - Dota 2" . Retrieved April 10, 2012.
  38. "Archived copy" (PDF). Archived from the original (PDF) on July 11, 2011. Retrieved July 12, 2011.{{cite web}}: CS1 maint: archived copy as title (link)
  39. Miner Wars 2081
  40. "Tech Interview: Metro 2033 Interview • Page 2 • Eurogamer.net". Eurogamer.net. February 25, 2010.
  41. "History - Electric Sheep Games" . Retrieved April 14, 2011.
  42. Shishkovtsov, Oles (March 7, 2005). "GPU Gems 2: Chapter 9. Deferred Shading in S.T.A.L.K.E.R". Nvidia . Retrieved February 2, 2011.
  43. "Deferred shading in Tabula Rasa". NVIDIA. Archived from the original on February 3, 2009. Retrieved February 14, 2008.
  44. "Steam Users' Forums - View Single Post - Taking the Physx load off the CPU..." steampowered.com.
  45. "Steam Users' Forums - View Single Post - Trine 2 rendering information - anti-aliasing, overheating, stereo, input lag, etc". steampowered.com.
  46. "CryENGINE 3 Specifications". Crytek GmbH. Archived from the original on March 27, 2009. Retrieved March 27, 2009.
  47. "Lighting you up in Battlefield 3". DICE. March 3, 2011. Archived from the original (PDF) on August 25, 2011. Retrieved September 15, 2011.
  48. "GameStart – Feature List". Archived from the original on December 2, 2011.
  49. "Infinity Development Journal – Deferred Lighting". I-Novae Studios. April 3, 2009. Archived from the original on January 26, 2013. Retrieved January 26, 2011.
  50. "BUILD: Deferred rendering". February 26, 2009. Retrieved April 8, 2015.
  51. "Torque 3D Development - Advanced Lighting (deferred lighting hybrid)". March 3, 2009. Retrieved July 2, 2015.
  52. Vosburgh, Ethan (September 9, 2010). "Unity 3 Feature Preview – Deferred Rendering". Unity Technologies. Retrieved January 26, 2011.
  53. "Unreal Engine 4 - Rendering Overview". Epic Games. Retrieved June 6, 2015.
  54. "Vision Engine 8.2 Brings 3D Technologies Cross-Platform". October 10, 2011. Archived from the original on November 16, 2012. Retrieved April 8, 2015.
  55. "The Graphics Technology of Fallout 4". Bethesda Softworks . November 4, 2015. Retrieved April 24, 2020.
  56. Saito, Takafumi; Tokiichiro Takahashi (1990). "Comprehensible rendering of 3-D shapes". ACM SIGGRAPH Computer Graphics. 24 (4): 197–206. doi:10.1145/97880.97901.
  57. Geldreich, Rich. "GDC 2004 Presentation on Deferred Lighting and Shading". Archived from the original on March 11, 2014. Retrieved August 24, 2013.
  58. "Deferred Shading" (PDF). NVIDIA. Retrieved March 28, 2007.
  59. Klint, Josh. "Deferred Rendering in Leadwerks Engine" (PDF). Leadwerks. Archived from the original (PDF) on December 9, 2008.