Pre-rendering is the process in which video footage is not rendered in real-time by the hardware that outputs or plays back the video. Instead, the video is a recording of footage that was previously rendered on different equipment (typically equipment more powerful than the hardware used for playback). Pre-rendered assets (typically movies) may also be outsourced by the developer to an outside production company. Such assets usually have a level of complexity too great for the target platform to render in real-time.
The term pre-rendered refers to anything that is not rendered in real-time. This includes content that could have been run in real-time with more effort on the part of the developer (e.g. video that covers a large number of a game's environments without pausing to load, or video of a game in an early state of development that is rendered in slow-motion and then played back at regular speed). This term is generally not used to refer to video captures of real-time rendered graphics despite the fact that video is technically pre-rendered by its nature. The term is also not used to refer to hand drawn assets or photographed assets (these assets not being computer rendered in the first place).
The advantage of pre-rendering is the ability to use graphic models that are more complex and computationally intensive than those that can be rendered in real-time, since multiple computers can be used over extended periods of time to render the end results. For instance, a comparison can be drawn between the rail shooters Maximum Force (which used pre-rendered 3D levels but 2D sprites for enemies) and Virtua Cop (which used 3D polygons): Maximum Force looked more realistic due to the limitations of Virtua Cop's 3D engine, but Virtua Cop had actual depth (it could portray enemies near and far, along with body-specific and multiple hits), in contrast to the limits of the 2D sprite enemies in Maximum Force.
The disadvantage of pre-rendering, in the case of video game graphics, is a generally lower level of interactivity, if any, with the player. Another drawback of pre-rendered assets is that they cannot change during gameplay. A game with pre-rendered backgrounds is forced to use fixed camera angles, and a game with pre-rendered video generally cannot reflect any changes the game's characters might have undergone during gameplay (such as wounds or customized clothing) without storing an alternate version of the video, which is generally not feasible due to the large amount of space required to store pre-rendered assets of high quality. However, in some advanced implementations, such as in Final Fantasy VIII, real-time assets were composited with pre-rendered video, allowing dynamic backgrounds and changing camera angles. Another problem is that a game with pre-rendered lighting cannot easily change the state of the lighting in a convincing manner.
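The compositing idea described above can be sketched as a simple per-pixel merge: the pre-rendered background supplies every pixel that the real-time layer leaves transparent. This is a minimal illustrative sketch, not Square's actual implementation; the function name and the single-value "pixels" are assumptions for brevity.

```python
def composite(background, overlay):
    """Draw real-time elements (non-None overlay pixels) over a
    pre-rendered background frame, leaving the background intact."""
    return [
        [o if o is not None else b for b, o in zip(brow, orow)]
        for brow, orow in zip(background, overlay)
    ]

# 4x4 pre-rendered "background" (one brightness value per pixel)
background = [[10] * 4 for _ in range(4)]

# real-time layer: a 2x2 "character" sprite; None = transparent
overlay = [[None] * 4 for _ in range(4)]
for y in (1, 2):
    for x in (1, 2):
        overlay[y][x] = 200

frame = composite(background, overlay)
```

Because the background is a fixed image, the real-time layer must be drawn from the same camera angle the background was rendered from, which is why such games use fixed cameras.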
As technology continued to advance in the mid-2000s, video game graphics achieved the photorealism that was previously limited to pre-rendering, as seen in the growth of machinima.
Pre-rendered graphics are used primarily as cut scenes in modern video games, where they are also known as full motion video. In the late 1990s and early 2000s, when most 3D game engines relied on pre-calculated lightmaps and fixed texture mapping, developers often turned to pre-rendered graphics, which offered a much higher level of realism. However, this has lost favor since the mid-2000s, as advances in consumer PC and video game graphics have enabled the use of the game's own engine to render these cinematics. For instance, the id Tech 4 engine used in Doom 3 allowed bump mapping and dynamic per-pixel lighting, previously only found in pre-rendered videos.
The first video game to use pre-rendering was the 1982 arcade game Xevious. The Sharp X68000 enhanced remake of Ys I: Ancient Ys Vanished, released in 1991, used 3D pre-rendered graphics for the boss sprites, though this ended up creating what is considered "a bizarre contrast" with the game's mostly 2D graphics. One of the first games to extensively use pre-rendered graphics along with full motion video was The 7th Guest. Released in 1992 as one of the first PC games exclusively on CD-ROM, the game was hugely popular, although reviews from critics were mixed. The game featured pre-rendered video sequences at a resolution of 640x320 at 15 frames per second, a feat previously thought impossible on personal computers. Shortly after, the release of Myst in 1993 made the use of pre-rendered graphics and CD-ROMs even more popular; most of the rendered work of Myst became the basis for the remake realMyst: Interactive 3D Edition, with its free-roaming real-time 3D graphics. The most graphically advanced use of entirely pre-rendered graphics in games is often claimed to be Myst IV: Revelation, released in 2004.
The use of pre-rendered backgrounds and movies was also made popular by the Resident Evil and Final Fantasy franchises on the original PlayStation, both of which use pre-rendered backgrounds and movies extensively to deliver a visual presentation far beyond what the console could achieve with real-time 3D. These games include real-time elements (characters, items, etc.) in addition to pre-rendered backgrounds to provide interactivity. Often, a game using pre-rendered backgrounds can devote additional processing power to the remaining interactive elements, resulting in a level of detail greater than the norm for the host platform. In some cases, however, the visual quality of the interactive elements still lags far behind the pre-rendered backgrounds.
Games such as Warcraft III: Reign of Chaos have used both types of cutscenes; pre-rendered for the beginning and end of a campaign, and the in-game engine for level briefings and character dialogue during a mission.
Some games also use 16-bit pre-rendered skyboxes, such as Half-Life (GoldSrc version only), Re-Volt, Quake II, and others.
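A pre-rendered skybox of this kind is typically stored as six images forming a cube around the camera; at runtime the engine only has to pick which face a view direction hits. Below is a minimal sketch of that face-selection step under the usual convention (largest-magnitude axis wins); it is illustrative, not any particular engine's code.

```python
def skybox_face(direction):
    """Select which face of a pre-rendered cube-map skybox a view
    direction samples: the axis with the largest absolute component
    determines the face, and its sign picks the positive or negative one."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"

# Looking straight up samples the top face of the sky cube.
face_up = skybox_face((0.0, 1.0, 0.0))
```

Because the six faces are rendered offline, the sky can be arbitrarily detailed at no runtime cost beyond a texture lookup, but it cannot react to in-game events.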
CG movies such as Toy Story, Shrek and Final Fantasy: The Spirits Within are entirely pre-rendered.
Another increasingly common pre-rendering method is the generation of texture sets for 3D games, which are often used with complex real-time algorithms to simulate extraordinarily high levels of detail. While making Doom 3 , id Software used pre-rendered models as the basis for generating normal, specular and diffuse lighting maps that simulate the detail of the original model in real-time.
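id's actual pipeline baked maps by casting rays from the low-detail surface to the high-detail model; a common simplified stand-in for that idea, sketched below, derives a tangent-space normal map from a height field by finite differences. The function name and the height-field input are assumptions for illustration, not Doom 3's method.

```python
import math

def height_to_normals(height):
    """Approximate a tangent-space normal map from a height field by
    central differences: pre-computed offline, the resulting normals
    let a real-time engine fake fine surface detail on flat geometry."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # slope in x and y, clamped at the texture edges
            dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
            dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
            nx, ny, nz = -dx, -dy, 2.0
            length = math.sqrt(nx * nx + ny * ny + nz * nz)
            row.append((nx / length, ny / length, nz / length))
        normals.append(row)
    return normals

# A perfectly flat surface yields normals pointing straight out (0, 0, 1).
flat = [[0.0] * 3 for _ in range(3)]
nmap = height_to_normals(flat)
```

The expensive part (modelling or rendering the detailed source) happens once during production; at runtime the engine only samples the baked texture.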
Pre-rendered lighting is a technique that is losing popularity. Processor-intensive ray tracing algorithms can be used during a game's production to generate light textures, which are simply applied on top of the usual hand drawn textures.
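Applying such a pre-computed light texture is itself cheap: each texel of the base texture is modulated by the corresponding lightmap value. A minimal sketch (illustrative names, single-float lightmap texels assumed):

```python
def apply_lightmap(albedo, lightmap):
    """Modulate a hand-drawn base texture by a pre-computed light
    texture: final = albedo * light, per texel and per channel."""
    return [
        [tuple(round(c * l) for c in texel) for texel, l in zip(arow, lrow)]
        for arow, lrow in zip(albedo, lightmap)
    ]

# 2x2 base texture (RGB) and a lightmap with one fully lit
# and one half-lit texel per row
albedo = [[(200, 100, 50), (200, 100, 50)]] * 2
lightmap = [[1.0, 0.5]] * 2

lit = apply_lightmap(albedo, lightmap)
```

Since the lightmap is baked offline, this gives global-illumination-quality shading at the cost of one multiply per texel, but the lighting cannot respond to moving lights.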