A front projection effect is an in-camera visual effects process in film production for combining foreground performance with pre-filmed background footage. In contrast to rear projection, which projects footage onto a screen from behind the performers, front projection projects the pre-filmed material over the performers and onto a highly reflective background surface.
The background image is projected onto both the performer and a highly reflective background screen, so that the image bounces off the screen and back into the lens of the camera. This requires a screen made of a retroreflective material such as Scotchlite, a product of the 3M company that is also used to make screens for movie theaters. The material is covered with millions of tiny glass beads affixed to the surface of the cloth; these beads reflect light back towards the direction it came from far more efficiently than any ordinary diffuse surface.
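Why the projected plate shows up brightly on the screen yet stays invisible on the stage-lit performer comes down to this difference in efficiency. The following sketch illustrates the idea with made-up numbers; the retroreflective gain and reflectance values are illustrative assumptions, not measured or manufacturer figures.

```python
# Illustrative sketch (not measured data): a matte surface scatters projector
# light into a hemisphere, while a retroreflective screen concentrates it back
# toward the source, so a camera sitting on the projector axis sees the screen
# far brighter than the performer. The numbers below are assumed for illustration.

RETRO_GAIN = 500.0      # assumed on-axis gain of the beaded screen vs. a matte white surface
SKIN_REFLECTANCE = 0.5  # assumed diffuse reflectance of the actor and costume

projected_luminance = 1.0  # arbitrary units of projector light reaching both surfaces

on_screen = projected_luminance * RETRO_GAIN
on_actor = projected_luminance * SKIN_REFLECTANCE

print(f"screen / actor brightness ratio: {on_screen / on_actor:.0f} to 1")
# With stage lighting on the performer far exceeding the faint projector spill,
# the projected plate is effectively invisible on the actor.
```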
The actor (or subject) performs in front of the reflective screen with a movie camera pointing straight at them. Just in front of the camera is a beam splitter (a partially reflective mirror) angled at 45 degrees. A projector positioned at 90 degrees to the camera projects the background image onto the mirror, which reflects it onto both the performer and the highly reflective screen; the image is too faint to register on the stage-lit actor but shows up clearly on the screen. In this way, the actor becomes their own matte. The combined image passes through the mirror and is recorded by the camera. The technique is shown and explained in the "making-of" documentary for the 1972 sci-fi film Silent Running . [1]
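Conceptually, the in-camera combination amounts to a per-pixel choice between the stage-lit performer and the retroreflected plate. The sketch below models that choice with hypothetical arrays; it is an illustration of the principle, not part of any actual production workflow.

```python
# Minimal sketch of "the actor becomes their own matte": wherever the performer
# occludes the screen, the camera records the stage-lit performer; everywhere
# else it records the retroreflected background plate. Arrays and values are
# hypothetical stand-ins for real footage.
import numpy as np

H, W = 4, 6
background_plate = np.full((H, W), 0.8)    # projected plate bounced back by the screen
foreground = np.full((H, W), 0.3)          # performer under stage lighting
actor_mask = np.zeros((H, W), dtype=bool)  # True where the performer blocks the screen
actor_mask[1:3, 2:4] = True

# The optical combination happens in-camera; this is the equivalent arithmetic.
composite = np.where(actor_mask, foreground, background_plate)
print(composite)
```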
Front projection was invented by Will Jenkins. [2] For this he was granted U.S. patent 2,727,427 , issued on December 20, 1955, for an "Apparatus for Production of Light Effects in Composite Photography", and U.S. patent 2,727,429 , issued the same day, for an "Apparatus for Production of Composite Photographic Effects."
The technique was first experimented with in 1949, shortly after the invention of Scotchlite, and had appeared in feature films by 1963, when the Japanese film Matango used it extensively for its yacht scenes. [3] Another early use was in 1966, during the filming of 2001: A Space Odyssey : the actors in ape suits were filmed on a stage at Elstree Studios and combined with footage of Africa (the effect is betrayed by the leopard's eyes, which glow as they reflect the projected light back towards the camera). Dennis Muren used a very similar approach for his 1967 debut film Equinox , although his technique did not employ Scotchlite. Two British films released in 1969, On Her Majesty's Secret Service and The Assassination Bureau , used the technique, as did the 1968 films Barbarella [4] and Where Eagles Dare .
Front projection was chosen as the main method for shooting Christopher Reeve's flying scenes in Superman . The filmmakers still faced the problem of making Reeve appear to fly towards the camera. Effects specialist Zoran Perisic patented a refinement of front projection that placed a zoom lens on both the movie camera and the projector, synchronized to zoom in and out simultaneously and to the same degree. As the projection lens zooms in, it projects a smaller image on the screen; because the camera lens zooms in at the same time and by the same amount, the projected image (the background plate) appears unchanged as seen through the camera. The subject in front of the screen, however, appears to move closer to the camera; thus Superman flies towards the viewer. The technique is analogous to the better-known dolly zoom effect.
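Under a simple pinhole-camera approximation, the geometry can be sketched as follows; the focal lengths, plate size, and distances are illustrative assumptions, not figures from the production.

```python
# Back-of-the-envelope sketch of the "Zoptic" idea under a pinhole/thin-lens
# approximation (all values are made-up illustrative numbers). The background
# size on film scales with f_camera / f_projector, so it stays constant when
# both lenses zoom by the same factor; the subject size scales with f_camera
# alone, so the subject seems to fly toward the camera against a static plate.

def image_sizes(f_camera_mm, f_projector_mm, plate_mm=24.0,
                subject_m=1.8, subject_distance_m=6.0):
    background_on_film = plate_mm * f_camera_mm / f_projector_mm
    subject_on_film = subject_m * 1000.0 * f_camera_mm / (subject_distance_m * 1000.0)
    return background_on_film, subject_on_film

for zoom in (1.0, 1.5, 2.0):  # both lenses zoomed by the same factor
    bg, fg = image_sizes(f_camera_mm=50.0 * zoom, f_projector_mm=50.0 * zoom)
    print(f"zoom x{zoom}: background {bg:.1f} mm (constant), subject {fg:.1f} mm (grows)")
```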
Perisic called this technique "Zoptic". The process was also used in two of the Superman sequels (though not in the fourth film, due to budget constraints), as well as in Return to Oz , Radio Flyer , High Road to China , Deal of the Century , Megaforce , Thief of Baghdad , the television series The Greatest American Hero , and Perisic's own films as director, Sky Bandits (also known as Gunbus) and The Phoenix and the Magic Carpet . [5]
Introvision is a front projection composite photography system that uses a pair of perpendicular retroreflective screens to combine two projected scenes with a scene staged live before the camera, all in a single shot.
It allows foreground, midground, and background elements to be combined in-camera, for example by sandwiching the live stage action (the actors) between a projected foreground element and a projected background. [6]
In its simplest form, images from a projector are directed at a beam splitter oriented at forty-five degrees. Two retroreflective screens are used, one to return the reflected image and one to return the pass-through image. Set between the beam splitter and the retroreflective screens are mattes with cut-outs that allow the projected image to strike each screen only in selected areas. This combination, as seen by the camera, gives the appearance of images behind the actors (the reflected image) and in front of the actors (the pass-through image). The camera sees the pass-through image off the reverse side of the beam splitter and the reflected image through the beam splitter, and combines the two, eliminating the need for compositing in post-production. To compensate for the large difference in distance from the camera to the two screens, an additional lens is used in the pass-through image path.
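The layer ordering this produces can be sketched as a simple priority rule, as below; the mattes and values are hypothetical and stand in for the real optical elements.

```python
# Sketch of the layer ordering Introvision builds in-camera (hypothetical masks,
# not a description of the actual optics): the pass-through image acts as a
# foreground element, the live stage action sits behind it, and the reflected
# image fills in as the background.
import numpy as np

H, W = 4, 6
background = np.full((H, W), 0.9)        # reflected image (rear element)
foreground_plate = np.full((H, W), 0.1)  # pass-through image (front element)
live_action = np.full((H, W), 0.5)       # actors on the stage

foreground_matte = np.zeros((H, W), dtype=bool)  # cut-out admitting the front element
foreground_matte[:, :2] = True
actor_matte = np.zeros((H, W), dtype=bool)       # where the actors block the rear screen
actor_matte[1:3, 3:5] = True

# Priority as seen by the camera: foreground plate > live action > background.
frame = np.where(foreground_matte, foreground_plate,
                 np.where(actor_matte, live_action, background))
print(frame)
```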
A more complicated setup involves two cameras, two projectors, and multiple beam splitters, light traps, filters, and aperture-control systems, and provides the opportunity to use different content for the foreground and background.
Introvision was first used in 1980–81 during the filming of the science-fiction movie Outland to combine star Sean Connery and other performers with models of the Io mining colony. [7] It was also used in the telefilm Inside the Third Reich to place actors portraying Adolf Hitler and Albert Speer in the long-destroyed Reichstag, [8] as well as in Under Siege , Army of Darkness and The Fugitive , where it appeared to place Harrison Ford on top of a bus as it was rammed by a train. Adventures in Babysitting employed Introvision to place children in multiple situations of peril, such as hanging from the rafters and scaling the "Smurfit-Stone Building" in Chicago, and Stand by Me used it during the train sequence. [9] Most movie companies brought small units to the Introvision sound stages near Poinsettia and Santa Monica Boulevard in Hollywood, California. Scenes were often shot near the end of the production schedule so that the "live" plates could already have been filmed on location.
Compared to rear projection, the front projection process used less studio space and generally produced sharper, more saturated images, as the background plate was not being viewed through a translucent projection screen. The process also had several advantages over bluescreen matte photography, which could suffer from clipping, mismatched mattes, film shrinkage, black or blue haloing, garbage-matte artifacts, and image degradation or excessive grain. It could be less time-consuming, and therefore less expensive, than optically separating and combining the background and foreground images with an optical printer. It also allowed the director and director of photography to view the combined sequence live, so such shots could be filmed more like a regular sequence, and the performers could be directed to time their actions to movement in the projected images.
However, advances in digital compositing and the increasing use of digital cameras have made digital compositing the method of choice. The last major blockbuster to make extensive use of front projection was the 1993 Sylvester Stallone action thriller Cliffhanger .[ citation needed ] More recently, the film Oblivion made extensive use of front projection (though not onto a retroreflective screen) to display various sky backgrounds in the home set, and Spectre used the technique for its snow-mountain hospital and glass-building interiors. The advantages of the in-camera effect were a reduced need for digital effects and green screen, interactive lighting on the reflective set, and a real background for the actors to play against.
70 mm film is a wide high-resolution film gauge for motion picture photography, with a negative area nearly 3.5 times as large as the standard 35 mm motion picture film format. As used in cameras, the film is 65 mm (2.6 in) wide. For projection, the original 65 mm film is printed on 70 mm (2.8 in) film. The additional 5 mm contains the four magnetic stripes, holding six tracks of stereophonic sound. Although later 70 mm prints use digital sound encoding, the vast majority of existing and surviving 70 mm prints pre-date this technology.
Chroma key compositing, or chroma keying, is a visual-effects and post-production technique for compositing (layering) two or more images or video streams together based on colour hues. The technique has been used in many fields to remove a background from the subject of a photo or video – particularly the newscasting, motion picture, and video game industries. A colour range in the foreground footage is made transparent, allowing separately filmed background footage or a static image to be inserted into the scene. The chroma keying technique is commonly used in video production and post-production. This technique is also referred to as colour keying, colour-separation overlay, or by various terms for specific colour-related variants such as green screen or blue screen; chroma keying can be done with backgrounds of any colour that are uniform and distinct, but green and blue backgrounds are more commonly used because they differ most distinctly in hue from any human skin colour. No part of the subject being filmed or photographed may duplicate the colour used as the backing, or the part may be erroneously identified as part of the backing.
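As a rough illustration of the keying step described above, the sketch below treats pixels near an assumed key colour as backing and replaces them with a background plate; real keyers use far more sophisticated colour-difference, spill-suppression, and edge handling.

```python
# Minimal chroma-key sketch (illustrative only): pixels close to the key colour
# are treated as backing and replaced with the background plate.
import numpy as np

key_colour = np.array([0.0, 1.0, 0.0])  # pure green backing (assumed)
tolerance = 0.3                          # assumed RGB distance threshold

H, W = 4, 6
foreground = np.tile(key_colour, (H, W, 1)).astype(float)  # start as all backing
foreground[1:3, 2:4] = [0.8, 0.6, 0.5]                     # a "subject" patch
background = np.full((H, W, 3), 0.2)

distance = np.linalg.norm(foreground - key_colour, axis=-1)
is_backing = distance < tolerance
composite = np.where(is_backing[..., None], background, foreground)
print(composite[..., 0])  # red channel of the composited frame
```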
Special effects are illusions or visual tricks used in the theatre, film, television, video game, amusement park and simulator industries to simulate the imagined events in a story or virtual world. It is sometimes abbreviated as SFX, but this may also refer to sound effects.
Cinematography is the art of motion picture photography.
A slide projector is an optical device for projecting enlarged images of photographic slides onto a screen. Many projectors have mechanical arrangements to show a series of slides loaded into a special tray sequentially.
Visual effects is the process by which imagery is created or manipulated outside the context of a live-action shot in filmmaking and video production. The integration of live-action footage with other live-action footage or CGI elements to create realistic imagery is called VFX.
An LCD projector is a type of video projector for displaying video, images or computer data on a screen or other flat surface. It is a modern equivalent of the slide projector or overhead projector. To display images, LCD projectors typically send light from a metal-halide lamp through a prism or series of dichroic filters that separates light to three polysilicon panels – one each for the red, green and blue components of the video signal. As polarized light passes through the panels, individual pixels can be opened to allow light to pass or closed to block the light. The combination of open and closed pixels can produce a wide range of colors and shades in the projected image.
A movie projector is an opto-mechanical device for displaying motion picture film by projecting it onto a screen. Most of the optical and mechanical elements, except for the illumination and sound devices, are present in movie cameras. Modern movie projectors are specially built video projectors.
Compositing is the process or technique of combining visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene. Live-action shooting for compositing is variously called "chroma key", "blue screen", "green screen" and other names. Today, most compositing is achieved through digital image manipulation. Pre-digital compositing techniques, however, go back as far as the trick films of Georges Méliès in the late 19th century, and some are still in use.
Mattes are used in photography and special effects filmmaking to combine two or more image elements into a single, final image. Usually, mattes are used to combine a foreground image with a background image. In this case, the matte is the background painting. In film and stage, mattes can be physically huge sections of painted canvas, portraying large scenic expanses of landscapes.
This article contains a list of cinematic techniques that are divided into categories and briefly described.
Rear projection is one of many in-camera effects cinematic techniques in film production for combining foreground performances with pre-filmed backgrounds. It was widely used for many years in driving scenes, or to show other forms of "distant" background motion.
The sodium vapor process is a photochemical film technique for combining actors and background footage. It originated in the British film industry in the late 1950s and was used extensively by Walt Disney Productions in the 1960s and 1970s as an alternative to the more common bluescreen process. Wadsworth E. Pohl is credited with the invention or development of both of these processes, and received an Academy Award in 1965 for the sodium vapor process as used in the film Mary Poppins.
In cinematography, bipacking, or a bipack, is the process of loading two reels of film into a camera, so that they both pass through the camera gate together. It was used both for in-camera effects and as an early subtractive colour process.
Anamorphic format is the cinematography technique of shooting a widescreen picture on standard 35 mm film or other visual recording media with a non-widescreen native aspect ratio. It also refers to the projection format in which a distorted image is "stretched" by an anamorphic projection lens to recreate the original aspect ratio on the viewing screen.
Introvision was a variation on the front-projection process that allowed film makers to view a finished composite of live action and plate photography through the camera's viewfinder, on set and in real time. During its heyday, beginning with the 1981 feature film Outland, Introvision offered the novelty of compositing visual effects in-camera, eliminating the need to wait for photo-chemical compositing to determine whether an effects shot was successful.
A holographic screen is a two-dimensional display technology that uses coated glass media for the projection surface of a video projector. "Holographic" refers not to a stereoscopic effect, but to the coating that bundles light using formed microlenses. The lens design and attributes match the holographic area. The lenses may appear similar to the Fresnel lenses used in overhead projectors. The resulting effect is that of a free-space display, because the image carrier appears very transparent. Additionally, the beam manipulation by the lenses can be used to make the image appear to be floating in front of or behind the glass, rather than directly on it. However, this display is only two-dimensional and not true three-dimensional. It is unclear if such a technology will be able to provide acceptable three-dimensional images in the future.
Multi-image is the now largely obsolete practice and business of using 35mm slides (diapositives) projected by single or multiple slide projectors onto one or more screens in synchronization with an audio voice-over or music track. Multi-image productions are also known as multi-image slide presentations, slide shows and diaporamas and are a specific form of multimedia or audio-visual production.
The history of film technology traces the development of techniques for the recording, construction and presentation of motion pictures. When the film medium came about in the 19th century, there already was a centuries old tradition of screening moving images through shadow play and the magic lantern that were very popular with audiences in many parts of the world. Especially the magic lantern influenced much of the projection technology, exhibition practices and cultural implementation of film. Between 1825 and 1840, the relevant technologies of stroboscopic animation, photography and stereoscopy were introduced. For much of the rest of the century, many engineers and inventors tried to combine all these new technologies and the much older technique of projection to create a complete illusion or a complete documentation of reality. Colour photography was usually included in these ambitions and the introduction of the phonograph in 1877 seemed to promise the addition of synchronized sound recordings. Between 1887 and 1894, the first successful short cinematographic presentations were established. The biggest popular breakthrough of the technology came in 1895 with the first projected movies that lasted longer than 10 seconds. During the first years after this breakthrough, most motion pictures lasted about 50 seconds, lacked synchronized sound and natural colour, and were mainly exhibited as novelty attractions. In the first decades of the 20th century, movies grew much longer and the medium quickly developed into one of the most important tools of communication and entertainment. The breakthrough of synchronized sound occurred at the end of the 1920s and that of full color motion picture film in the 1930s. By the start of the 21st century, physical film stock was being replaced with digital film technologies at both ends of the production chain by digital image sensors and projectors.
The Williams process or Williams doublematting process is a matte creation technique patented by the American cinematographer Frank D. Williams in 1918. Unlike prior matte techniques, it allowed for the integration of the actors' movements with previously shot backgrounds.