Film frame

In filmmaking, video production, animation, and related fields, a frame is one of the many still images which compose the complete moving picture. The term is derived from the fact that, from the beginning of modern filmmaking toward the end of the 19th century, and in many places still up to the present, the single images have been recorded on a strip of photographic film; each image on such a strip looks rather like a framed picture when examined individually.

The term may also be used more generally as a noun or verb to refer to the edges of the image as seen in a camera viewfinder or projected on a screen. Thus, the camera operator can be said to keep a car in frame by panning with it as it speeds past.

Overview

When the moving picture is displayed, each frame is flashed on a screen for a short time (nowadays, usually 1/24, 1/25 or 1/30 of a second) and then immediately replaced by the next one. Persistence of vision blends the frames together, producing the illusion of a moving image.

The frame is also sometimes used as a unit of time, so that a momentary event might be said to last six frames, the actual duration of which depends on the frame rate of the system, which varies according to the video or film standard in use. In North America and Japan, 30 frames per second (fps) is the broadcast standard, with 24 frames/s now common in production for high-definition video shot to look like film. In much of the rest of the world, 25 frames/s is standard.
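The dependence of such a duration on the system's frame rate is easy to make concrete; a minimal sketch (the frames_to_ms helper is illustrative, not a standard API):

```python
# A frame count becomes a duration only once a frame rate is fixed.
def frames_to_ms(frames: int, fps: float) -> float:
    """Duration of `frames` frames in milliseconds at `fps` frames/s."""
    return frames / fps * 1000.0

# The same six-frame event under three common standards:
for fps in (24, 25, 30):
    print(f"{fps} fps: {frames_to_ms(6, fps):.0f} ms")
# Six frames last 250 ms at 24 fps, 240 ms at 25 fps, 200 ms at 30 fps.
```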

In systems historically based on NTSC standards, for reasons originally related to the addition of color to the monochrome NTSC TV system, the exact frame rate is actually (3579545 / 227.5) / 525 = 29.97002616 fps. [note 1] This leads to many synchronization problems which are unknown outside the NTSC world, and also brings about hacks such as drop-frame timecode.
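The quoted figure, and the hour-long drift that motivates drop-frame timecode, can be checked with a few lines of arithmetic (a sketch; the variable names are illustrative):

```python
# Derive the exact NTSC frame rate from the colour subcarrier:
# 3,579,545 Hz "burst" frequency, 227.5 subcarrier cycles per scan
# line, and 525 lines per frame.
subcarrier_hz = 3_579_545
line_rate_hz = subcarrier_hz / 227.5   # horizontal line rate
frame_rate = line_rate_hz / 525        # ~29.97002616 fps

# An hour of nominal "30 fps" timecode labels 108,000 frames, but at
# ~29.97 fps only ~107,892 frames actually elapse; drop-frame timecode
# exists to absorb this roughly 108-frame-per-hour discrepancy.
frames_labelled = 30 * 60 * 60
frames_elapsed = frame_rate * 60 * 60
print(f"{frame_rate:.8f}", round(frames_labelled - frames_elapsed))
```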

In film projection, 24 fps is the norm, except in some special venue systems, such as IMAX, Showscan and Iwerks 70, in which 30, 48 or even 60 frames/s have been used. Silent films and 8 mm amateur movies used 16 or 18 frames/s.

Physical film frames

In a strip of movie film, individual frames are separated by frame lines. Normally, 24 frames are needed for one second of film. In ordinary filming, the frames are photographed automatically, one after the other, in a movie camera. In special effects or animation filming, the frames are often shot one at a time.

The size of a film frame varies, depending on the still film format or the motion picture film format. In the smallest amateur format, 8 mm motion picture film, it is only about 4.8 by 3.5 mm, while an IMAX frame is as large as 69.6 by 48.5 mm. The larger the frame size is in relation to the size of the projection screen, the sharper the image will appear.

The size of the film frame of motion picture film also depends on the location of the holes, the size of the holes, the shape of the holes, and the location and type of sound stripe.

The most common film format, 35 mm, has a frame size of 36 by 24 mm when used in a still 35 mm camera, where the film moves horizontally, but the frame size varies when used for motion pictures, where the film moves vertically (with the exception of VistaVision and Technirama, in which the film moves horizontally). With a 4-perf pulldown, there are exactly 16 frames in one foot of 35 mm film, so film frames are sometimes counted in terms of "feet and frames". The maximum frame size is 18 by 24 mm (silent/full aperture), but this is significantly reduced by the application of sound track(s). A system called KeyKode is often used to identify specific physical film frames in a production.
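Because the 16-frames-per-foot relationship is exact, converting a frame count into "feet and frames" is simple integer arithmetic; a small sketch (the feet_and_frames helper and its output format are illustrative, not an industry API):

```python
# 35 mm film with a 4-perf pulldown has exactly 16 frames per foot.
FRAMES_PER_FOOT = 16

def feet_and_frames(frame_count: int) -> str:
    """Express a 35 mm frame count as 'feet+frames' footage notation."""
    feet, frames = divmod(frame_count, FRAMES_PER_FOOT)
    return f"{feet}+{frames:02d}"

# At 24 fps, ten seconds of film is 240 frames, i.e. exactly 15 feet:
print(feet_and_frames(240))   # 15+00
print(feet_and_frames(1000))  # 62+08
```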

Video frames

Historically, video frames were represented as analog waveforms in which varying voltages represented the intensity of light in an analog raster scan across the screen. Analog blanking intervals separated video frames in the same way that frame lines did in film. For historical reasons, most systems used an interlaced scan system in which the frame typically consisted of two video fields sampled over two slightly different periods of time. This meant that a single video frame was usually not a good still picture of the scene, unless the scene being shot was completely still.

With the dominance of digital technology, modern video systems now represent the video frame as a rectangular raster of pixels, either in an RGB color space or a color space such as YCbCr, and the analog waveform survives only in legacy I/O devices.

Standards for the digital video frame raster include Rec. 601 for standard-definition television and Rec. 709 for high-definition television.

Video frames are typically identified using SMPTE time code.

Line and resolution

The frame is composed of picture elements, much as a chessboard is composed of squares. Each horizontal set of picture elements is known as a line. The picture elements in a line are transmitted as sine signals, where a pair of dots, one dark and one light, can be represented by a single sine cycle. The product of the number of lines and the maximum number of sine signals per line is known as the total resolution of the frame. The higher the resolution, the more faithful the displayed image is to the original. But higher resolution introduces technical problems and extra cost, so system designs must strike a compromise between satisfactory image quality and affordable price.

Viewing distance

The key parameter to determine the lowest resolution still satisfactory to viewers is the viewing distance, i.e. the distance between the eyes and the monitor. The total resolution is inversely proportional to the square of the distance. If d is the distance, r is the required minimum resolution and k is the proportionality constant, which depends on the size of the monitor:

r = k / d²

Since the number of lines is approximately proportional to the resolution per line, the total resolution is roughly proportional to the square of the number of lines, and the above relation can also be written as

n = k₂ · h / d

where n is the number of lines, h is the height of the monitor and k₂ is a constant. That means that the required number of lines is proportional to the height of the monitor and inversely proportional to the viewing distance.
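That proportionality can be illustrated numerically; in the sketch below the constant K2 is purely hypothetical, chosen only to make the scaling visible, and is not a standard broadcast figure:

```python
# Illustrative only: required line count n scales as h/d.
# K2 is a made-up constant, not a value from any standard.
K2 = 1200.0

def required_lines(height_m: float, distance_m: float) -> int:
    """Lines needed for a screen of given height viewed from a distance."""
    return round(K2 * height_m / distance_m)

# Halving the viewing distance doubles the required line count:
print(required_lines(0.5, 2.0))  # 300
print(required_lines(0.5, 1.0))  # 600
```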

Moving picture

In moving pictures (TV), the number of frames scanned per second is known as the frame rate. The higher the frame rate, the better the sense of motion. But increasing the frame rate also introduces technical difficulties, so the frame rate is fixed at 25 (System B/G) or 29.97 (System M). To increase the sense of motion, it is customary to scan the very same frame in two consecutive phases. In each phase only half of the lines are scanned: only the odd-numbered lines in the first phase and only the even-numbered lines in the second. Each scan is known as a field, so the field rate is twice the frame rate.

Example (System B)

In System B the number of lines is 625 and the frame rate is 25. The maximum video bandwidth is 5 MHz. [1] The maximum number of sine signals the system is theoretically capable of transmitting per second equals the bandwidth: 5 MHz corresponds to 5 000 000 sine signals per second.

Since the frame rate is 25, the maximum number of sine signals per frame is 5 000 000 / 25 = 200 000. Dividing this number by the 625 lines gives the maximum number of sine signals in a line, which is 320. (In practice, about 19% of each line is devoted to auxiliary services, so the maximum number of useful sine signals is about 260.)
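The System B arithmetic above can be restated as a short sketch (the variable names are illustrative):

```python
# System B parameters as given in the text.
bandwidth_hz = 5_000_000   # maximum video bandwidth
frame_rate = 25            # frames per second
lines_per_frame = 625

sines_per_frame = bandwidth_hz // frame_rate          # 200,000 per frame
sines_per_line = sines_per_frame // lines_per_frame   # 320 per line

# About 19% of each line carries sync/auxiliary services, not picture,
# leaving roughly 260 useful sine signals per line.
useful_sines_per_line = round(sines_per_line * (1 - 0.19))

print(sines_per_frame, sines_per_line, useful_sines_per_line)
```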

Still frame

A badly chosen still can give a misleading impression: a frame taken near the start of a video may imply that the content concerns the letter W, while a better-chosen frame from the same video correctly implies an interview.

A still frame is a single static image taken from a film or video, which are kinetic (moving) images. Still frames are also called freeze frames, video prompts, previews or, misleadingly, thumbnails, keyframes, poster frames, [2] [3] or screen shots/grabs/captures/dumps. Freeze frames are widely used on video platforms and in video galleries to show viewers a preview or a teaser. By default, many video platforms display a frame from the midpoint of the video; some platforms offer the option to choose a different frame. [4] [5]

Video and film artists sometimes use still frames within the video/film to achieve special effects, like freeze-frame shots or still motion. [6]

Investigations

In criminal investigations it has become common practice to publish still frames from surveillance videos in order to identify suspects and to find further witnesses. [7] Footage of the John F. Kennedy assassination has often been discussed frame by frame in support of various interpretations. [8] In medical diagnostics it is very useful to examine still frames of magnetic resonance imaging videos. [9]

Fourth wall usage

Some humor in animation is based on the fourth-wall aspect of the film frame itself, with some animation showing characters leaving what is assumed to be the edge of the film, or the film malfunctioning; the latter device is often used in live-action films as well. This hearkens back to some early cartoons, in which characters were aware that they were in a cartoon, specifically that they could look at the credits and react to something that is not part of the story as presented.

Notes

  1. In actual practice, the master oscillator is 14.31818 MHz, which is divided by 4 to give the 3.579545 MHz color "burst" frequency. The master frequency is also divided by 455 to give the 31468.5275 Hz "equalizing pulse" frequency, which is divided by 2 to give the 15734.26 Hz horizontal line rate. The "equalizing pulse" frequency is divided by 525 to give the 59.9401 Hz "vertical drive" frequency, and this is further divided by 2 to give the 29.9700 Hz vertical frame rate. "Equalizing pulses" perform two essential functions: 1) their use during the vertical retrace interval allows the vertical sync to be separated more effectively from the horizontal sync, as these, along with the video itself, are an example of "in band" signaling, and 2) by alternately including or excluding one "equalizing pulse", the half-line offset necessary for interlaced video can be accommodated.
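The divider chain in the note can be checked numerically; a minimal sketch reproducing each stated frequency from the master oscillator:

```python
# The NTSC frequency-divider chain described in the note,
# starting from the 14.31818 MHz master oscillator.
master_hz = 14_318_180

burst_hz = master_hz / 4          # 3,579,545 Hz colour "burst"
eq_pulse_hz = master_hz / 455     # ~31,468.53 Hz "equalizing pulse"
line_hz = eq_pulse_hz / 2         # ~15,734.26 Hz horizontal line rate
vdrive_hz = eq_pulse_hz / 525     # ~59.9401 Hz "vertical drive"
frame_hz = vdrive_hz / 2          # ~29.9700 Hz frame rate

print(f"{frame_hz:.4f}")
```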

References

  1. Reference Data for Radio Engineers, ITT Howard W.Sams Co., New York, 1977, section 30
  2. Microsoft: Add a poster frame to your video, retrieved 29 June 2014
  3. Indezine: Poster Frames for Videos in PowerPoint 2010 for Windows, retrieved 29 June 2014
  4. Vimeo: How do I change the thumbnail of my video?, retrieved 29 June 2014
  5. MyVideo: Editing my video, retrieved 29 June 2014
  6. Willie Witte: SCREENGRAB, retrieved 29 June 2014
  7. Wistv: Assaults, shooting in Five Points under investigation, retrieved 29 June 2014
  8. "The Other Shooter: The Saddest and Most Expensive 26 Seconds of Amateur Film Ever Made | Motherboard". motherboard.vice.com. Archived from the original on 30 November 2012. Retrieved 11 January 2022.
  9. Lister Hill National Center for Biomedical Communications: A classic diagnosis with a new ‘spin’, retrieved 29 June 2014