Video is an electronic medium for the recording, copying, playback, broadcasting, and display of moving visual media. Video was first developed for mechanical television systems, which were quickly replaced by cathode ray tube (CRT) systems, which were in turn replaced by flat panel displays of several types.
Video systems vary in display resolution, aspect ratio, refresh rate, color capabilities and other qualities. Analog and digital variants exist and can be carried on a variety of media, including radio broadcast, magnetic tape, optical discs, computer files, and network streaming.
Video technology was first developed for mechanical television systems, which were quickly replaced by cathode ray tube (CRT) television systems, but several new technologies for video display devices have since been invented. Video was originally exclusively a live technology. Charles Ginsburg led an Ampex research team that developed one of the first practical video tape recorders (VTR). In 1951 the first video tape recorder captured live images from television cameras by converting the camera's electrical impulses and saving the information onto magnetic video tape.
Video recorders were sold for US$50,000 in 1956, and videotapes cost US$300 per one-hour reel. However, prices gradually dropped over the years; in 1971, Sony began selling videocassette recorder (VCR) decks and tapes into the consumer market.
The use of digital techniques in video created digital video. It could not initially compete with analog video, due to early digital uncompressed video requiring impractically high bitrates. Practical digital video was made possible with discrete cosine transform (DCT) coding, a lossy compression process developed in the early 1970s. DCT coding was adapted into motion-compensated DCT video compression in the late 1980s, starting with H.261, the first practical digital video coding standard.
Digital video was later capable of higher quality and, eventually, much lower cost than earlier analog technology. After the invention of the DVD in 1997, and later the Blu-ray Disc in 2006, sales of videotape and recording equipment plummeted. Advances in computer technology allow even inexpensive personal computers and smartphones to capture, store, edit and transmit digital video, further reducing the cost of video production and allowing program-makers and broadcasters to move to tapeless production. The advent of digital broadcasting and the subsequent digital television transition is in the process of relegating analog video to the status of a legacy technology in most parts of the world. As of 2015, with the increasing use of high-resolution video cameras with improved dynamic range and color gamuts, and high-dynamic-range digital intermediate data formats with improved color depth, modern digital video technology is converging with digital film technology.
Frame rate, the number of still pictures per unit of time of video, ranges from six or eight frames per second (frame/s) for old mechanical cameras to 120 or more frames per second for new professional cameras. PAL standards (Europe, Asia, Australia, etc.) and SECAM (France, Russia, parts of Africa, etc.) specify 25 frame/s, while NTSC standards (USA, Canada, Japan, etc.) specify 29.97 frame/s. Film is shot at the slower frame rate of 24 frames per second, which slightly complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve a comfortable illusion of a moving image is about sixteen frames per second.
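The NTSC rate of 29.97 frame/s is exactly 30000/1001, a ratio chosen for compatibility with the color subcarrier. As an illustrative sketch (not part of any broadcast standard), the frame rates above and the resulting per-frame durations can be computed with exact rational arithmetic:

```python
from fractions import Fraction

# Common video frame rates; NTSC's "29.97" is exactly 30000/1001.
rates = {
    "film": Fraction(24),
    "PAL/SECAM": Fraction(25),
    "NTSC": Fraction(30000, 1001),
}

for name, rate in rates.items():
    ms = 1000 / rate  # duration of one frame in milliseconds
    print(f"{name}: {float(rate):.3f} frame/s, {float(ms):.3f} ms per frame")
```

Using `Fraction` rather than floating point keeps the NTSC rate exact, which matters when computing timestamps over long durations.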
Video can be interlaced or progressive. In progressive scan systems, each refresh period updates all scan lines in each frame in sequence. When displaying a natively progressive broadcast or recorded signal, the result is optimum spatial resolution of both the stationary and moving parts of the image. Interlacing was invented as a way to reduce flicker in early mechanical and CRT video displays without increasing the number of complete frames per second. Interlacing retains detail while requiring lower bandwidth compared to progressive scanning.
In interlaced video, the horizontal scan lines of each complete frame are treated as if numbered consecutively, and captured as two fields: an odd field (upper field) consisting of the odd-numbered lines and an even field (lower field) consisting of the even-numbered lines. Analog display devices reproduce each frame, effectively doubling the frame rate as far as perceptible overall flicker is concerned. When the image capture device acquires the fields one at a time, rather than dividing up a complete frame after it is captured, the frame rate for motion is effectively doubled as well, resulting in smoother, more lifelike reproduction of rapidly moving parts of the image when viewed on an interlaced CRT display.
NTSC, PAL and SECAM are interlaced formats. Abbreviated video resolution specifications often include an i to indicate interlacing. For example, PAL video format is often described as 576i50, where 576 indicates the total number of horizontal scan lines, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second.
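The field decomposition described above can be sketched in a few lines. This is a minimal illustration (not any codec's implementation) that treats a frame as a list of scan lines, numbered from 1, and separates them into the odd (upper) and even (lower) fields:

```python
# Split a frame (a list of scan lines) into two interlaced fields.
# With lines numbered from 1, the odd (upper) field holds lines 1, 3, 5, ...
# and the even (lower) field holds lines 2, 4, 6, ...
def split_fields(frame_lines):
    odd_field = frame_lines[0::2]   # odd-numbered lines (1-based)
    even_field = frame_lines[1::2]  # even-numbered lines
    return odd_field, even_field

frame = [f"line {n}" for n in range(1, 11)]  # a toy 10-line frame
odd, even = split_fields(frame)
```

In a format such as 576i50, each of the 50 fields per second carries half the lines of a 576-line frame, so the full frame rate is 25 per second.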
When displaying a natively interlaced signal on a progressive scan device, simple line doubling degrades overall spatial resolution, and artifacts such as flickering or "comb" effects appear in moving parts of the image unless special signal processing eliminates them. A procedure known as deinterlacing can optimize the display of an interlaced video signal from an analog, DVD or satellite source on a progressive scan device such as an LCD television, digital video projector or plasma panel. Deinterlacing cannot, however, produce video quality that is equivalent to true progressive scan source material.
Aspect ratio describes the proportional relationship between the width and height of video screens and video picture elements. All popular video formats are rectangular, and so can be described by a ratio between width and height. The ratio width to height for a traditional television screen is 4:3, or about 1.33:1. High definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack (also known as the Academy ratio) is 1.375:1.
Pixels on computer monitors are usually square, but pixels used in digital video often have non-square aspect ratios, such as those used in the PAL and NTSC variants of the CCIR 601 digital video standard, and the corresponding anamorphic widescreen formats. The 720 by 480 pixel raster uses thin pixels on a 4:3 aspect ratio display and fat pixels on a 16:9 display.
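The "thin" and "fat" pixels mentioned above follow from simple geometry: the pixel aspect ratio is the display aspect ratio divided by the storage aspect ratio of the raster. The sketch below shows this calculation for a 720 by 480 raster (the precise ratios defined by broadcast standards differ slightly because not all 720 pixels are active picture):

```python
from fractions import Fraction

def pixel_aspect_ratio(width, height, display_aspect):
    # PAR = display aspect ratio / storage aspect ratio of the raster
    storage_aspect = Fraction(width, height)
    return display_aspect / storage_aspect

# The same 720x480 raster shown on 4:3 and 16:9 displays:
thin = pixel_aspect_ratio(720, 480, Fraction(4, 3))   # 8/9: narrower than square
fat = pixel_aspect_ratio(720, 480, Fraction(16, 9))   # 32/27: wider than square
```

A ratio below 1 means each stored pixel is displayed narrower than it is tall, and a ratio above 1 means wider.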
The popularity of viewing video on mobile phones has led to the growth of vertical video. Mary Meeker, a partner at Silicon Valley venture capital firm Kleiner Perkins Caufield & Byers, highlighted the growth of vertical video viewing in her 2015 Internet Trends Report – growing from 5% of video viewing in 2010 to 29% in 2015. Vertical video ads like Snapchat’s are watched in their entirety nine times more frequently than landscape video ads.
The color model describes the video color representation and maps encoded color values to the visible colors reproduced by the system. There are several such representations in common use: typically YIQ is used in NTSC television, YUV is used in PAL television, YDbDr is used by SECAM television and YCbCr is used for digital video.
The number of distinct colors a pixel can represent depends on color depth expressed in the number of bits per pixel. A common way to reduce the amount of data required in digital video is by chroma subsampling (e.g., 4:4:4, 4:2:2, etc.). Because the human eye is less sensitive to details in color than brightness, the luminance data for all pixels is maintained, while the chrominance data is averaged for a number of pixels in a block and that same value is used for all of them. For example, this results in a 50% reduction in chrominance data using 2 pixel blocks (4:2:2) or 75% using 4 pixel blocks (4:2:0). This process does not reduce the number of possible color values that can be displayed, but it reduces the number of distinct points at which the color changes.
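The savings quoted above can be checked by counting samples in the J:a:b sampling region (J pixels wide, two rows tall): there are 2J luma samples, while Cb and Cr each contribute a + b samples. The following sketch (an illustration, not a codec routine) computes the average storage per pixel at 8 bits per sample:

```python
# Average bytes per pixel for common Y'CbCr subsampling schemes,
# assuming 8 bits (1 byte) per sample.
def avg_bytes_per_pixel(j, a, b):
    # Sampling region is j pixels wide and 2 rows tall.
    luma = 2 * j            # one luma sample per pixel
    chroma = 2 * (a + b)    # Cb and Cr each sampled a+b times per region
    return (luma + chroma) / (2 * j)

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(scheme, avg_bytes_per_pixel(*scheme))
```

Relative to 4:4:4 (3 bytes per pixel), 4:2:2 needs 2 bytes per pixel (half the chroma data) and 4:2:0 needs 1.5 (a quarter of the chroma data), matching the 50% and 75% chrominance reductions described above.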
Video quality can be measured with formal metrics like peak signal-to-noise ratio (PSNR) or through subjective video quality assessment using expert observation. Many subjective video quality methods are described in the ITU-R recommendation BT.500. One of the standardized methods is the Double Stimulus Impairment Scale (DSIS). In DSIS, each expert views an unimpaired reference video followed by an impaired version of the same video. The expert then rates the impaired video using a scale ranging from "impairments are imperceptible" to "impairments are very annoying".
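PSNR is defined from the mean squared error between a reference and an impaired image, expressed in decibels relative to the maximum possible pixel value. A minimal sketch over flat lists of 8-bit pixel values (real implementations operate on full frames, often per color plane):

```python
import math

def psnr(reference, impaired, max_value=255):
    # Mean squared error between two equal-length pixel sequences
    mse = sum((r - i) ** 2 for r, i in zip(reference, impaired)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images: no error to measure
    return 10 * math.log10(max_value ** 2 / mse)

ref = [52, 55, 61, 66, 70, 61, 64, 73]  # toy reference pixels
bad = [54, 55, 60, 66, 69, 62, 64, 70]  # slightly impaired copy
print(f"{psnr(ref, bad):.2f} dB")
```

Higher values indicate smaller differences from the reference; identical inputs give an infinite PSNR.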
Uncompressed video delivers maximum quality, but with a very high data rate. A variety of methods are used to compress video streams, with the most effective ones using a group of pictures (GOP) to reduce spatial and temporal redundancy. Broadly speaking, spatial redundancy is reduced by registering differences between parts of a single frame; this task is known as intraframe compression and is closely related to image compression. Likewise, temporal redundancy can be reduced by registering differences between frames; this task is known as interframe compression, including motion compensation and other techniques. The most common modern compression standards are MPEG-2, used for DVD, Blu-ray and satellite television, and MPEG-4, used for AVCHD, Mobile phones (3GP) and Internet.
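The idea behind interframe compression can be illustrated with simple frame differencing: encode each frame as its difference from the previous one, so that unchanged regions become runs of zeros that compress well. This is a deliberately simplified sketch; real codecs such as MPEG-2 use motion compensation and transform coding rather than raw pixel deltas:

```python
# Temporal redundancy: represent a frame as the difference from its predecessor.
def encode_delta(prev_frame, frame):
    return [cur - old for old, cur in zip(prev_frame, frame)]

def decode_delta(prev_frame, delta):
    return [old + d for old, d in zip(prev_frame, delta)]

prev = [10, 10, 10, 200, 200, 10]
cur = [10, 10, 10, 201, 200, 10]   # only one pixel changed
delta = encode_delta(prev, cur)     # mostly zeros, so it compresses well
restored = decode_delta(prev, delta)
```

Intraframe compression, by contrast, exploits redundancy within a single frame, much like still-image compression.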
Stereoscopic video for 3D film and other applications can be displayed using several different methods.
Different layers of video transmission and storage each provide their own set of formats to choose from.
For transmission, there is a physical connector and signal protocol (see List of video connectors). A given physical link can carry certain display standards that specify a particular refresh rate, display resolution, and color space.
Many analog and digital recording formats are in use, and digital video clips can also be stored on a computer file system as files, which have their own formats. In addition to the physical format used by the data storage device or transmission medium, the stream of ones and zeros that is sent must be in a particular digital video coding format, of which a number are available (see List of video coding formats).
Analog video is a video signal represented by one or more analog signals. Analog color video signals include luminance (Y) and chrominance (C). When combined into one channel, as is the case with NTSC, PAL and SECAM, among others, the result is called composite video. Analog video may also be carried in separate channels, as in two-channel S-Video (YC) and multi-channel component video formats.
Analog video is used in both consumer and professional television production applications.
Various digital video signal formats have been adopted, including serial digital interface (SDI), Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI) and DisplayPort.
Video can be transmitted or transported in a variety of ways, including wireless terrestrial television as an analog or digital signal, and coaxial cable in a closed-circuit system as an analog signal. Broadcast or studio cameras use a single or dual coaxial cable system using serial digital interface (SDI). See List of video connectors for information about physical connectors and related signal standards.
Video may be transported over networks and other shared digital communications links using, for instance, MPEG transport stream, SMPTE 2022 and SMPTE 2110.
Digital television broadcasts use the MPEG-2 and other video coding formats; major transmission standards include ATSC, DVB, ISDB and DTMB.
Analog television broadcast standards include NTSC, PAL and SECAM.
An analog video format consists of more information than the visible content of the frame. Preceding and following the image are lines and pixels containing metadata and synchronization information. This surrounding margin is known as a blanking interval or blanking region; the horizontal and vertical front porch and back porch are the building blocks of the blanking interval.
Computer display standards specify a combination of aspect ratio, display size, display resolution, color depth, and refresh rate. A list of common resolutions is available.
Early television was almost exclusively a live medium, with some programs recorded to film for distribution or historical purposes using Kinescope. The analog video tape recorder was commercially introduced in 1951. The formats that followed, in approximate chronological order, were sold to and used by broadcasters, video producers or consumers, or were important historically (VERA).
Digital video tape recorders offered improved quality compared to analog recorders.
Optical storage mediums offered an alternative, especially in consumer applications, to bulky tape formats.
Digital video is an electronic representation of moving visual images (video) in the form of encoded digital data. This is in contrast to analog video, which represents moving visual images with analog signals. Digital video comprises a series of digital images displayed in rapid succession.
MPEG-2 is a standard for "the generic coding of moving pictures and associated audio information". It describes a combination of lossy video compression and lossy audio data compression methods, which permit storage and transmission of movies using currently available storage media and transmission bandwidth. While MPEG-2 is not as efficient as newer standards such as H.264/AVC and H.265/HEVC, backwards compatibility with existing hardware and software means it is still widely used, for example in over-the-air digital television broadcasting and in the DVD-Video standard.
NTSC, named after the National Television System Committee, is the analog television color system that was introduced in North America in 1954 and stayed in use until digital conversion. It was one of three major analog color television standards, the others being PAL and SECAM.
Phase Alternating Line (PAL) is a colour encoding system for analogue television used in broadcast television systems in most countries broadcasting at 625-line / 50 field per second (576i). It was one of three major analogue colour television standards, the others being NTSC and SECAM.
Standard-definition television is a television system which uses a resolution that is not considered to be either high or enhanced definition. SDTV and high-definition television (HDTV) are the two categories of display formats for digital television (DTV) transmissions. "Standard" refers to the fact that it was the prevailing specification for broadcast television in the mid- to late-20th century.
SECAM, also written SÉCAM, is an analog color television system first used in France. It was one of three major color television standards, the others being PAL and NTSC.
VHS is a standard for consumer-level analog video recording on tape cassettes. Developed by Victor Company of Japan (JVC) in the early 1970s, it was released in Japan on September 9, 1976, and in the United States on August 23, 1977.
Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception to the viewer, and reduces flicker by taking advantage of the phi phenomenon.
Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system's lower acuity for color differences than for luminance.
Advanced Television Systems Committee (ATSC) standards are an American set of standards for digital television transmission over terrestrial, cable and satellite networks. It is largely a replacement for the analog NTSC standard and, like that standard, is used mostly in the United States, Mexico and Canada. Japan and several former NTSC users have not used ATSC during their digital television transition, because they adopted their own system called ISDB.
Broadcast television systems are the encoding or formatting standards for the transmission and reception of terrestrial television signals. There were three main analog television systems in use around the world until the late 2010s (expected): NTSC, PAL, and SECAM. Now in digital terrestrial television (DTT), there are four main systems in use around the world: ATSC, DVB, ISDB and DTMB.
CIF, also known as FCIF, is a standardized format for the picture resolution, frame rate, color space, and color subsampling of digital video sequences used in video teleconferencing systems. It was first defined in the H.261 standard in 1988.
1080i is an abbreviation referring to a combination of frame resolution and scan type, used in high-definition television (HDTV) and high-definition video. The number "1080" refers to the number of horizontal lines on the screen. The "i" is an abbreviation for "interlaced"; this indicates that only the odd lines, then the even lines of each frame are drawn alternately, so that only half the number of actual image frames are used to produce video. A related display resolution is 1080p, which also has 1080 lines of resolution; the "p" refers to progressive scan, which indicates that the lines of resolution for each frame are "drawn" on the screen in sequence.
480i is a shorthand name for the video mode used for standard-definition analog or digital television in the Caribbean, Myanmar, Japan, South Korea, Taiwan, Philippines, Laos, Western Sahara, and most of the Americas. The 480 identifies a vertical resolution of 480 lines, and the i identifies it as an interlaced resolution. The field rate, which is 60 Hz, is sometimes included when identifying the video mode, i.e. 480i60; another notation, endorsed by both the International Telecommunication Union in BT.601 and SMPTE in SMPTE 259M, includes the frame rate, as in 480i/30. The other common standard, used in the other parts of the world, is 576i.
576i is a standard-definition video mode originally used for terrestrial television in most countries of the world where the utility frequency for electric power distribution is 50 Hz. Because of its close association with the colour encoding system, it is often referred to as simply PAL, PAL/SECAM or SECAM when compared to its 60 Hz NTSC-colour-encoded counterpart, 480i. In digital applications it is usually referred to as "576i"; in analogue contexts it is often called "625 lines", and the aspect ratio is usually 4:3 in analogue transmission and 16:9 in digital transmission.
Progressive segmented Frame is a scheme designed to acquire, store, modify, and distribute progressive scan video using interlaced equipment.
Analog high-definition television was an analog video broadcast television system developed in the 1930s to replace early experimental systems with as few as 12 lines. On 2 November 1936 the BBC began transmitting the world's first public regular analog high-definition television service from the Victorian Alexandra Palace in north London, which therefore claims to be the birthplace of television broadcasting as we know it today. John Logie Baird, Philo T. Farnsworth, and Vladimir Zworykin had each developed competing TV systems, but resolution was not the issue that separated their substantially different technologies; rather, it was patent interference lawsuits and deployment issues given the tumultuous financial climate of the late 1920s and 1930s.
MUSE was an analog high-definition television system using dot-interlacing and digital video compression to deliver 1125-line high-definition video signals to the home. Japan had the earliest working HDTV system, named Hi-Vision, with design efforts going back to 1979. The country began broadcasting wideband analog HDTV signals in 1989 using 1035 active lines interlaced in the standard 2:1 ratio (1035i) with 1125 lines total. By the time of its commercial launch in 1991, digital HDTV was already under development in the United States. Hi-Vision continued broadcasting in analog until 2007.
Television standards conversion is the process of changing a television transmission or recording from one television system to another. The most common is from NTSC to PAL or the other way around. This is done so television programs in one nation may be viewed in a nation with a different standard. The video is fed through a video standards converter, which makes a copy in a different video system.
High-definition television (HDTV) describes a television system providing an image of substantially higher resolution than the previous generation of technologies. The term has been used since 1936, but in modern times refers to the generation following standard-definition television (SDTV). It is the current standard video format used in most broadcasts: terrestrial broadcast television, cable television, satellite television and Blu-ray Discs, with the SD and HD versions of a channel differing only in picture quality. Prior to this, SD versions of channels transmitted widescreen content as 4:3 video with letterboxing, making the image appear smaller.