Hyperlapse or moving time-lapse (also stop-motion time-lapse, walklapse, spacelapse) is a technique in time-lapse photography for creating motion shots. In its simplest form, a hyperlapse is achieved by moving the camera a short distance between each shot. The first film using the hyperlapse technique dates to 1995.
Regular time-lapse involves taking photos at a regular interval with a camera mounted on a tripod, or using a motorized dolly and/or pan-and-tilt head to add limited motion to the shot. Hyperlapse relies on the time-lapse principle but adds camera movement over much longer distances. [1] The technique also permits long exposures, which create motion blur. The resulting image sequence is stabilized in post-production. The camera can also be mounted on a hand-held gimbal to achieve smooth motion while walking.
A "walking hyperlapse" is a special hyperlapse technique that requires a person in the frame to walk at a specified cadence. When played back, the person appears to walk at normal speed while everything else appears to move quickly through the scene. For example, a hyperlapse recorded at 1 frame per second while a person walks at 124 beats per minute will capture a frame on roughly every other step. When the hyperlapse is played back at 24 frames per second, the person will appear to be walking at normal speed. [2]
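The timing arithmetic above can be sketched as follows. This is an illustrative calculation only, not taken from any specific production workflow; the function names are invented for the example.

```python
# Walking-hyperlapse timing sketch: relate walking cadence (beats per
# minute), capture rate, and playback rate. Illustrative values only.

def steps_per_captured_frame(cadence_bpm: float, capture_fps: float) -> float:
    """How many steps the subject takes between consecutive captured frames."""
    steps_per_second = cadence_bpm / 60.0
    return steps_per_second / capture_fps

def apparent_speedup(capture_fps: float, playback_fps: float) -> float:
    """Overall time-lapse speed-up factor for the rest of the scene."""
    return playback_fps / capture_fps

# The example from the text: 124 bpm cadence, captured at 1 fps.
steps = steps_per_captured_frame(124, 1.0)   # ~2.07 steps per frame,
                                             # i.e. roughly every other step
speedup = apparent_speedup(1.0, 24.0)        # scene plays back 24x faster
print(round(steps, 2), speedup)
```

Because each captured frame lands at nearly the same point in the walk cycle, the subject's gait looks continuous on playback while the surroundings rush past at the speed-up factor.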
The first film using the hyperlapse technique seems to have been Pacer, shot on Super 8 film in Montreal in 1995 by Guy Roland, after experiments during the 1980s and 1990s. [3] It has been suggested that the term "hyper-lapse" itself was first used in 2011 by American filmmaker Dan Eckert, [4] and lastingly established by Shahab Gabriel Behzumi's Berlin Hyperlapse in 2012. [5]
Films made from images derived from Google Street View and Google Maps have also been called hyperlapse videos. [6] [7] Software that can help produce hyperlapse-style videos include Hyperlapse from Instagram and a similarly named program from Microsoft. [8] [9]
Unmanned aerial vehicles have been used to create aerial hyperlapses since at least 2015. [10]
Subgenres of hyperlapse are flowmotion and hyperzoom. Flowmotion was developed in the 2010s by British filmmaker Rob Whitworth. It combines hyperlapse, time-lapse and regular film shots to create the suggestion of a story proceeding in one long, almost uninterrupted take. [11] [12] Hyperzoom was developed by Geoff Tompkinson and uses film and post-production techniques to create a seamless flight through diverse locations. [13]
Film editing is both a creative and a technical part of the post-production process of filmmaking. The term is derived from the traditional process of working with film, which increasingly involves the use of digital technology. Assembling a video composition typically requires a collection of distinct shots; film editing is the act of selecting, adjusting, and arranging that footage into something new.
Video is an electronic medium for the recording, copying, playback, broadcasting, and display of moving visual media. Video was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) systems, which, in turn, were replaced by flat-panel displays of several types.
Chroma key compositing, or chroma keying, is a visual-effects and post-production technique for compositing (layering) two or more images or video streams together based on colour hues. The technique has been used in many fields to remove a background from the subject of a photo or video – particularly the newscasting, motion picture, and video game industries. A colour range in the foreground footage is made transparent, allowing separately filmed background footage or a static image to be inserted into the scene. The chroma keying technique is commonly used in video production and post-production. This technique is also referred to as colour keying, colour-separation overlay, or by various terms for specific colour-related variants such as green screen or blue screen; chroma keying can be done with backgrounds of any colour that are uniform and distinct, but green and blue backgrounds are more commonly used because they differ most distinctly in hue from any human skin colour. No part of the subject being filmed or photographed may duplicate the colour used as the backing, or the part may be erroneously identified as part of the backing.
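The colour-range test described above can be sketched as a naive keyer. This is a minimal illustration, not how production keyers work (those operate in hue/chroma space and feather the matte edges); the function and variable names are invented for the example.

```python
# Naive chroma key sketch: any foreground pixel within `tol` (Euclidean
# RGB distance) of the backing colour `key` is replaced by the
# corresponding background pixel. Frames are nested lists of RGB tuples.

def chroma_key(fg, bg, key=(0, 255, 0), tol=80):
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return [[bp if dist(fp, key) < tol else fp
             for fp, bp in zip(frow, brow)]
            for frow, brow in zip(fg, bg)]

# Tiny 2x2 demo: two pixels are close to pure green and get keyed out.
fg = [[(0, 255, 0), (200, 10, 10)],
      [(10, 10, 200), (0, 250, 5)]]
bg = [[(128, 128, 128), (128, 128, 128)],
      [(128, 128, 128), (128, 128, 128)]]
print(chroma_key(fg, bg))
```

The demo also shows why the subject must not wear the backing colour: any foreground pixel inside the tolerance sphere around the key colour is erroneously replaced.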
Slow motion is an effect in film-making whereby time appears to be slowed down. It was invented by the Austrian priest August Musger in the early 20th century. It can be accomplished through the use of high-speed cameras, with the resulting footage played back at a normal rate such as 30 fps, or in post-production through the use of software.
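The capture-versus-playback relationship behind camera-based slow motion reduces to a simple ratio; the sketch below is illustrative, with an invented function name.

```python
# Slow-motion factor: footage captured at a high frame rate and played
# back at a normal rate appears slowed by the ratio of the two rates.

def slow_motion_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the action appears on playback."""
    return capture_fps / playback_fps

# e.g. a 240 fps capture played back at 30 fps appears 8x slower:
# one real second of action lasts 8 seconds on screen.
print(slow_motion_factor(240, 30))
```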
In video technology, 24p refers to a video format that operates at 24 frames per second frame rate with progressive scanning. Originally, 24p was used in the non-linear editing of film-originated material. Today, 24p formats are being increasingly used for aesthetic reasons in image acquisition, delivering film-like motion characteristics. Some vendors advertise 24p products as a cheaper alternative to film acquisition.
Time-lapse photography is a technique in which the frequency at which film frames are captured is much lower than the frequency used to view the sequence. When played at normal speed, time appears to be moving faster and thus lapsing. For example, an image of a scene may be captured at 1 frame per second but then played back at 30 frames per second; the result is an apparent 30 times speed increase. Similarly, film can also be played at a much lower rate than at which it was captured, which slows down an otherwise fast action, as in slow motion or high-speed photography.
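The speed-up arithmetic above, and the practical question of how often to shoot for a clip of a given length, can be sketched as follows. The second example (a two-hour event as a 20-second clip) is an invented illustration, not from the source.

```python
# Time-lapse arithmetic: apparent speed-up, and the capture interval
# needed to fit a real-time event into a clip of a chosen length.

def speed_factor(capture_fps: float, playback_fps: float) -> float:
    """Apparent speed increase when playback is faster than capture."""
    return playback_fps / capture_fps

def capture_interval(event_seconds: float, clip_seconds: float,
                     playback_fps: float = 30.0) -> float:
    """Seconds between shots so `event_seconds` of real time plays back
    as a clip lasting `clip_seconds` at `playback_fps`."""
    frames_needed = clip_seconds * playback_fps
    return event_seconds / frames_needed

# The example from the text: capture at 1 fps, play back at 30 fps.
print(speed_factor(1.0, 30.0))           # 30x apparent speed increase

# Hypothetical example: a 2-hour event as a 20-second clip at 30 fps
# requires one shot every 12 seconds.
print(capture_interval(2 * 3600, 20))
```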
Mattes are used in photography and special effects filmmaking to combine two or more image elements into a single, final image. Usually, mattes are used to combine a foreground image with a background image. In this case, the matte is the background painting. In film and stage, mattes can be physically huge sections of painted canvas, portraying large scenic expanses of landscapes.
Virtual cinematography is the set of cinematographic techniques performed in a computer graphics environment. It includes a wide variety of subjects like photographing real objects, often with stereo or multi-camera setup, for the purpose of recreating them as three-dimensional objects and algorithms for the automated creation of real and simulated camera angles. Virtual cinematography can be used to shoot scenes from otherwise impossible camera angles, create the photography of animated films, and manipulate the appearance of computer-generated effects.
Cutting on action or matching on action refers to film editing and video editing techniques where the editor cuts from one shot to another view that matches the first shot's action.
Previsualization is the visualizing of scenes or sequences in a movie before filming. It is a concept used in other creative arts, including animation, performing arts, video game design, and still photography. Previsualization typically describes techniques like storyboarding, which uses hand-drawn or digitally-assisted sketches to plan or conceptualize movie scenes.
Motion interpolation or motion-compensated frame interpolation (MCFI) is a form of video processing in which intermediate film, video or animation frames are generated between existing ones by means of interpolation, in an attempt to make animation more fluid, to compensate for display motion blur, and for fake slow motion effects.
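The idea of synthesising a frame between two existing ones can be illustrated with simple per-pixel blending. Note this is a deliberately naive sketch: real motion-compensated interpolation estimates motion vectors and shifts pixels along them, whereas plain blending produces ghosting on moving objects. The function name is invented for the example.

```python
# Naive frame interpolation by per-pixel linear blending of two
# grayscale frames (nested lists of intensities). Parameter t selects
# the temporal position of the synthesised frame (0 = frame A, 1 = frame B).

def interpolate_frame(frame_a, frame_b, t=0.5):
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

a = [[0, 100], [50, 200]]
b = [[100, 100], [150, 0]]
print(interpolate_frame(a, b))   # midpoint frame between a and b
```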
In photography and cinematography, headroom or head room is a concept of aesthetic composition that addresses the relative vertical position of the subject within the frame of the image. Headroom refers specifically to the distance between the top of the subject's head and the top of the frame, but the term is sometimes used instead of lead room, nose room or 'looking room' to include the sense of space on both sides of the image. The amount of headroom that is considered aesthetically pleasing is a dynamic quantity; it changes relative to how much of the frame is filled by the subject. Rather than pointing and shooting, one must compose the image to be pleasing. Too much room between a subject's head and the top of frame results in dead space.
Rob Whitworth is a British photographer and urban filmmaker, based in Norwich, known for flow-motion works shot throughout Asia. His works are mainly based on time-lapse and have received nine million online views and multiple awards.
In motion picture technology—either film or video—high frame rate (HFR) refers to higher frame rates than typical prior practice.
Hyperlapse is a mobile app created by Instagram that enables users to produce hyperlapse and time-lapse videos. It was released on August 26, 2014. The app enables users to record up to 45 minutes of footage in a single take, which can subsequently be accelerated to create a hyperlapse cinematographic effect. Whereas time-lapses are normally produced by stitching together stills from traditional cameras, the app uses an image stabilization algorithm that steadies the appearance of video by eliminating jitter. Unlike Instagram, the app offers no filters. Instead, the only post-production option available to users is the modification of playback speed, which can range from 1x to 40x normal playback speed.
Pixel Camera, formerly Google Camera, is a camera phone application developed by Google for the Android operating system. Development for the application began in 2011 at the Google X research incubator led by Marc Levoy, which was developing image fusion technology for Google Glass. It was publicly released for Android 4.4+ on Google Play on April 16, 2014. It was initially supported on all devices running Android 4.4 KitKat and higher, but became officially supported only on Google Pixel devices in the following years. The app was renamed Pixel Camera in October 2023, with the launch of the Pixel 8 and Pixel 8 Pro.
16K resolution is a display resolution with approximately 16,000 pixels horizontally. The most commonly discussed 16K resolution is 15360 × 8640, which doubles the pixel count of 8K UHD in each dimension, for a total of four times as many pixels. This resolution has 132.7 megapixels, 16 times as many pixels as 4K resolution and 64 times as many pixels as 1080p resolution.
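The pixel counts quoted above follow directly from the stated dimensions; the short sketch below just verifies that arithmetic (8K UHD is 7680 × 4320, 4K here means 3840 × 2160, 1080p is 1920 × 1080).

```python
# Verify the 16K pixel arithmetic quoted in the text.

def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

total_16k = 15360 * 8640
print(megapixels(15360, 8640))           # 132.7104 -> ~132.7 MP
print(total_16k // (7680 * 4320))        # 4x the pixels of 8K UHD
print(total_16k // (3840 * 2160))        # 16x the pixels of 4K
print(total_16k // (1920 * 1080))        # 64x the pixels of 1080p
```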
This glossary of motion picture terms is a list of definitions of terms and concepts related to motion pictures, filmmaking, cinematography, and the film industry in general.
The Canon EOS 6D Mark II is a 26.2-megapixel full-frame digital single-lens reflex camera announced by Canon on June 29, 2017.
Timelapse of the Entire Universe is a 2018 short epic animated pseudo-documentary web film created by American astronomy-themed musician and filmmaker John D. Boswell. Inspired by the Cosmic Calendar, the 10-minute film is a hyperlapse of the universe from its beginning to present-day humanity, based on current scientific knowledge, with every second representing 22 million years, so that all of human history occupies only a brief final moment. The film was originally released on Boswell's YouTube channel Melodysheep on March 7, but it was taken down due to a copyright infringement regarding Morgan Freeman's voice. A revised version was uploaded 3 days later, on March 10, 2018. A year and 10 days later, a follow-up, Timelapse of the Future, was released.