Motion interpolation or motion-compensated frame interpolation (MCFI) is a form of video processing in which intermediate film, video or animation frames are generated between existing ones by means of interpolation, in an attempt to make animation more fluid, to compensate for display motion blur, and to produce fake slow-motion effects.
Motion interpolation is a common, optional feature of various modern display devices such as HDTVs and video players, aimed at increasing perceived framerate or alleviating display motion blur, a common problem on LCD flat-panel displays.
A display's framerate is not always equivalent to that of the content being displayed. In other words, the fact that a display is capable of, or operating at, a high framerate does not mean that it can or must perform motion interpolation. For example, a TV running at 120 Hz and displaying 24 FPS content will simply display each content frame for five of the 120 display refreshes per second. This has no effect on the picture other than eliminating the need for 3:2 pulldown, and thus film judder, as a matter of course (since 120 is evenly divisible by 24). Eliminating judder results in motion that is less "jumpy" and which matches that of a theater projector. Motion interpolation can be used to reduce judder, but it is not required in order to do so.[1]
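The refresh arithmetic above can be sketched in a few lines of Python. The `repeat_pattern` helper is purely illustrative (not any standard API): it distributes a second's worth of display refreshes across a second's worth of content frames, which reproduces both the uniform 5:5 hold on 120 Hz and the uneven 3:2 cadence on 60 Hz.

```python
# Sketch: how 24 fps content maps onto 60 Hz and 120 Hz displays.
# repeat_pattern is an illustrative helper, not a standard API.

def repeat_pattern(content_fps: int, display_hz: int) -> list[int]:
    """Return, for one second of content, how many display refreshes
    each content frame occupies. Leftover refreshes are spread as
    evenly as possible; whenever any exist, the cadence is uneven
    (judder), as with 3:2 pulldown of 24 fps content on 60 Hz."""
    base = display_hz // content_fps
    extra = display_hz % content_fps
    pattern = []
    for i in range(content_fps):
        # Frame i gets one extra refresh when the running share of
        # `extra` crosses an integer boundary (Bresenham-style spread).
        bump = (i + 1) * extra // content_fps - i * extra // content_fps
        pattern.append(base + bump)
    return pattern

print(repeat_pattern(24, 120))  # 24 fives: each frame held 5 refreshes, no judder
print(repeat_pattern(24, 60))   # alternating 2s and 3s: the 3:2 pulldown cadence
```

Note that no interpolation happens in either case: frames are only repeated, which is why a high refresh rate alone does not imply motion interpolation.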
The advertised framerate of a specific display may refer either to the maximum number of content frames that can be displayed per second, or to the number of times the display is refreshed in some way, irrespective of content. In the latter case, the actual presence or strength of any motion interpolation option may vary. In addition, the ability of a display to show content at a specific framerate does not mean that the display is capable of accepting content at that rate; most consumer displays above 60 Hz do not accept a higher-frequency signal, but rather use the extra frame capability to eliminate judder, reduce ghosting, or create interpolated frames.
As an example, a TV advertised as "240 Hz" may mean one of two things: either the display can accept and show 240 distinct content frames per second, or it merely refreshes itself 240 times per second while accepting input at a lower rate, using the extra refreshes for interpolated frames, judder elimination, or blur reduction.
Motion interpolation features are included with several video player applications.
Some video editing software and plugins offer motion interpolation effects to enhance digitally slowed video. FFmpeg is a free software non-interactive tool with such functionality. Adobe After Effects has this in a feature called "Pixel Motion". AI software company Topaz Labs produces Video AI, a video upscaling application with motion interpolation. The effects plugin "Twixtor" is available for most major video editing suites, and offers similar functionality.
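In FFmpeg, this functionality is exposed through the `minterpolate` video filter. A typical invocation might look like the following; the filenames and target framerate are placeholders, and other filter options exist for tuning the estimation:

```shell
# Raise the framerate to 60 fps using motion-compensated interpolation
# (mi_mode=mci). input.mp4 and output.mp4 are placeholder filenames.
ffmpeg -i input.mp4 -vf "minterpolate=fps=60:mi_mode=mci" output.mp4
```

Simpler `mi_mode` settings (`dup`, `blend`) repeat or cross-fade frames instead of estimating motion, trading quality for speed.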
Motion interpolation on certain brands of TVs is sometimes accompanied by visual anomalies in the picture, described by CNET's David Carnoy as a "little tear or glitch" appearing for a fraction of a second. He adds that the effect is most noticeable when the technology suddenly kicks in during a fast camera pan.[1] Television and display manufacturers refer to this phenomenon as a type of digital artifact. As the associated technology has improved, such artifacts appear less frequently on modern consumer TVs, though they have yet to be eliminated; artifacts tend to occur more often when the gap between frames is larger.[citation needed]
As a byproduct of the perceived increase in frame rate, motion interpolation may introduce a "video" (versus "film") look. This look is commonly referred to as the "soap opera effect" (SOE), in reference to the distinctive appearance of most broadcast television soap operas and pre-2000s multicam sitcoms, which were typically shot on less expensive 60i video rather than film.[8] Many complain that the soap opera effect ruins the theatrical look of cinematic works, making it appear as if the viewer is either on set or watching a behind-the-scenes featurette.[9] Almost all manufacturers provide ways to disable the feature, but because methods and terminology differ, the UHD Alliance proposed that all televisions have a "Filmmaker Mode" button on remote controls to disable motion smoothing.[10]
Motion interpolation so annoys filmmakers that Tom Cruise and Christopher McQuarrie released a public service announcement in 2018 describing the effect and how to disable it.[11] Some sports viewers appreciate motion interpolation,[10] as it can reduce motion blur produced by camera pans and shaky cameras, and thus potentially yield better clarity of such images. It may also be used to increase the apparent framerate of video games for a more realistic feel, although the added display lag may be an undesired side effect.[12] This "video look" is created deliberately by the VidFIRE technique to restore archive television programs that survive only as film telerecordings, such as early seasons of the TV series Doctor Who.[13] The main differences between an artificially (interpolated) and a natively (in-camera) high framerate are that in-camera footage is not subject to any of the aforementioned artifacts, contains more accurate (or "true to life") image data, and requires more storage space and bandwidth, since all of its frames must be captured and stored rather than produced in real time.[citation needed]
NTSC is the first American standard for analog television, published and adopted in 1941. In 1961, it was assigned the designation System M. It is also known as EIA standard 170.
Frame rate, most commonly expressed in frames per second or FPS, is typically the frequency (rate) at which consecutive images (frames) are captured or displayed. This definition applies to film and video cameras, computer animation, and motion capture systems. In these contexts, frame rate may be used interchangeably with frame frequency and refresh rate, which are expressed in hertz. Additionally, in the context of computer graphics performance, FPS is the rate at which a system, particularly a GPU, is able to generate frames, and refresh rate is the frequency at which a display shows completed frames. In electronic camera specifications frame rate refers to the maximum possible rate frames could be captured, but in practice, other settings may reduce the actual frequency to a lower number than the frame rate.
Motion compensation in computing is an algorithmic technique used to predict a frame in a video given the previous and/or future frames by accounting for motion of the camera and/or objects in the video. It is employed in the encoding of video data for video compression, for example in the generation of MPEG-2 files. Motion compensation describes a picture in terms of the transformation of a reference picture to the current picture. The reference picture may be previous in time or even from the future. When images can be accurately synthesized from previously transmitted/stored images, the compression efficiency can be improved.
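The block-matching idea behind motion compensation can be sketched compactly. The example below is a deliberately simplified, illustrative version operating on 1-D "frames" (real codecs use 2-D blocks and much larger search ranges); `predict_frame` and its parameters are names invented for this sketch.

```python
# Minimal sketch of block-based motion compensation on 1-D "frames".
# Real codecs work on 2-D pixel blocks; names here are illustrative.

def predict_frame(reference: list[int], current: list[int],
                  block: int = 4, search: int = 2) -> list[int]:
    """For each block of `current`, find the best-matching block in
    `reference` within +/- `search` samples (minimum sum of absolute
    differences, SAD) and copy it into the prediction. An encoder
    would then store only the offsets and the small residual."""
    n = len(current)
    prediction = [0] * n
    for start in range(0, n, block):
        best_offset, best_sad = 0, float("inf")
        for offset in range(-search, search + 1):
            src = start + offset
            if src < 0 or src + block > n:
                continue  # candidate block would fall outside the frame
            sad = sum(abs(reference[src + i] - current[start + i])
                      for i in range(block))
            if sad < best_sad:
                best_sad, best_offset = sad, offset
        prediction[start:start + block] = \
            reference[start + best_offset:start + best_offset + block]
    return prediction

# A ramp that shifts right by one sample between frames:
ref = [0, 1, 2, 3, 4, 5, 6, 7]
cur = [0, 0, 1, 2, 3, 4, 5, 6]   # same content, moved by +1
print(predict_frame(ref, cur))
```

The second block of `cur` is predicted exactly from `ref` at offset -1, so only a small residual for the first block would need to be stored, which is where the compression gain comes from.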
Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception to the viewer, and reduces flicker by taking advantage of the characteristics of the human visual system.
Telecine is the process of transferring film into video and is performed in a color suite. The term is also used to refer to the equipment used in this post-production process.
Slow motion is an effect in film-making whereby time appears to be slowed down. It was invented by the Austrian priest August Musger in the early 20th century. This can be accomplished through the use of high-speed cameras and then playing the footage produced by such cameras at a normal rate like 30 fps, or in post production through the use of software.
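The arithmetic of camera-based (overcranked) slow motion is simple; the rates below are example values, not figures from the article:

```python
# Overcranking: shooting at a high capture rate and playing back at a
# normal rate slows apparent motion by the ratio of the two rates.

def slowdown_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower motion appears on playback."""
    return capture_fps / playback_fps

print(slowdown_factor(240, 30))  # 8.0: one real second lasts 8 seconds on screen
```

Software slow motion instead starts from normal-rate footage and must interpolate the missing in-between frames, which is where motion interpolation comes in.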
The refresh rate, also known as vertical refresh rate or vertical scan rate in terminology originating with cathode-ray tubes (CRTs), is the number of times per second that a raster-based display device displays a new image. This is independent of frame rate, which describes how many images are stored or generated every second by the device driving the display. On CRT displays, higher refresh rates produce less flickering, thereby reducing eye strain. In other technologies, such as liquid-crystal displays, the refresh rate affects only how often the image can potentially be updated.
In video technology, 24p refers to a video format that operates at 24 frames per second frame rate with progressive scanning. Originally, 24p was used in the non-linear editing of film-originated material. Today, 24p formats are being increasingly used for aesthetic reasons in image acquisition, delivering film-like motion characteristics. Some vendors advertise 24p products as a cheaper alternative to film acquisition.
Deinterlacing is the process of converting interlaced video into a non-interlaced or progressive form. Interlaced video signals are commonly found in analog television, VHS, Laserdisc, digital television (HDTV) when in the 1080i format, some DVD titles, and a smaller number of Blu-ray discs.
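The simplest deinterlacing method, "weave", just interleaves the two fields back into one progressive frame. A minimal sketch, with lists of rows standing in for scan lines and top-field-first order assumed (the function name is illustrative):

```python
# Sketch: "weave" deinterlacing. Each field holds every other scan
# line of the frame; weaving restores the full-height progressive frame.

def weave(top_field: list[list[int]],
          bottom_field: list[list[int]]) -> list[list[int]]:
    """Interleave the rows of two fields into one progressive frame:
    even output rows come from the top field, odd rows from the bottom."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

print(weave([[1, 1], [3, 3]], [[2, 2], [4, 4]]))
```

Weave is only artifact-free for static scenes; because the two fields are captured at different instants, moving content produces the familiar "combing", which motion-adaptive deinterlacers exist to suppress.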
Flicker is a visible change in brightness between cycles displayed on video displays. It applies to the refresh interval on cathode-ray tube (CRT) televisions and computer monitors, as well as plasma computer displays and televisions.
576i is a standard-definition digital video mode, originally used for digitizing 625 line analogue television in most countries of the world where the utility frequency for electric power distribution is 50 Hz. Because of its close association with the legacy colour encoding systems, it is often referred to as PAL, PAL/SECAM or SECAM when compared to its 60 Hz NTSC-colour-encoded counterpart, 480i.
Progressive segmented Frame is a scheme designed to acquire, store, modify, and distribute progressive scan video using interlaced equipment.
Flicker-free is a term given to video displays, primarily cathode-ray tubes, operating at a high refresh rate to reduce or eliminate the perception of screen flicker. For televisions, this involves operating at a 100 Hz or 120 Hz field rate to eliminate flicker, compared to standard televisions that operate at 50 Hz or 60 Hz (NTSC), most simply done by displaying each field twice, rather than once. For computer displays, this is usually a refresh rate of 70–90 Hz, sometimes 100 Hz or higher. This should not be confused with motion interpolation, though the two techniques may be combined.
This article discusses moving image capture, transmission and presentation from today's technical and creative points of view, concentrating on aspects of frame rates.
High-motion is the characteristic of video or film footage displayed at a sufficiently high frame rate that moving images do not blur or strobe even when tracked closely by the eye. The most common forms of high motion are NTSC and PAL video at their native display rates. Movie film does not portray high motion even when shown on television monitors.
Pixel Plus is a proprietary digital filter image processing technology developed by Philips, which claims that it enhances the display of analogue broadcast signals on its TVs.
Television standards conversion is the process of changing a television transmission or recording from one video system to another. Converting video between different numbers of lines, frame rates, and color models is a complex technical problem. However, the international exchange of television programming makes standards conversion necessary so that video may be viewed in another nation with a differing standard. Typically, video is fed into a video standards converter, which produces a copy according to a different video standard. One of the most common conversions is between the NTSC and PAL standards.
Display motion blur, also called HDTV blur and LCD motion blur, refers to several visual artifacts that are frequently found on modern consumer high-definition television sets and flat-panel displays for computers.
In motion picture technology—either film or video—high frame rate (HFR) refers to higher frame rates than typical prior practice.