Uncompressed video

Uncompressed video is digital video that either has never been compressed or was generated by decompressing previously compressed digital video. It is commonly used by video cameras, video monitors, video recording devices (including general-purpose computers), and in video processors that perform functions such as image resizing, image rotation, deinterlacing, and text and graphics overlay. It is conveyed over various types of baseband digital video interfaces, such as HDMI, DVI, DisplayPort and SDI. Standards also exist for the carriage of uncompressed video over computer networks.

Some HD video cameras output uncompressed video, whereas others compress the video using a lossy compression method such as MPEG or H.264. In any lossy compression process, some of the video information is removed, which creates compression artifacts and reduces the quality of the resulting decompressed video. When editing video, it is preferable to work with video that has never been compressed (or was losslessly compressed), as this maintains the best possible quality, with compression performed after editing is complete. [1]

Uncompressed video should not be confused with raw video. Raw video represents largely unprocessed data (e.g. without demosaicing) captured by an imaging device.

Recording

A standalone video recorder is a device that receives uncompressed video and stores it in either uncompressed or compressed form. These devices typically have a video output that can be used to monitor or play back recorded video. When playing back compressed video, the device decompresses the video before outputting it. Such devices may also have a communication interface, such as Ethernet or USB, which can be used to exchange video files with an external computer and, in some cases, to control the recorder from an external computer as well.

Recording to a computer is a relatively inexpensive alternative to a standalone digital video recorder, but the computer and its video storage device (e.g., solid-state drive, RAID) must be fast enough to keep up with the high video data rate, which may involve HD video, multiple video sources, or both. Due to the extreme computational and storage system performance demands of real-time video capture, other unnecessary program activity (e.g., background processes, virus scanners) and asynchronous hardware interfaces (e.g., computer networks) may be disabled, and the priority of the real-time recording process may be raised, to avoid disruption of the recording process.
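For a sense of the storage performance such a system must sustain, the required write rate follows directly from the video parameters. A minimal sketch (the function name is illustrative, and 24 bits per pixel is assumed; real capture pipelines may use chroma subsampling or 10-bit depth instead):

```python
# Rough storage estimate for recording uncompressed video to a computer.
# Assumes 24 bits per pixel (8-bit RGB without subsampling).

def sustained_write_mb_per_s(width, height, fps, bits_per_pixel=24):
    """Sustained disk write rate, in megabytes per second."""
    return width * height * fps * bits_per_pixel / 8 / 1e6

rate = sustained_write_mb_per_s(1920, 1080, 60)  # 1080p at 60 frames/s
print(f"{rate:.0f} MB/s sustained")              # ~373 MB/s
print(f"{rate * 3600 / 1000:.0f} GB per hour")   # ~1344 GB per hour
```

At roughly 373 MB/s for a single 1080p60 source, a recorder fills over a terabyte of storage per hour, which is why fast SSDs or RAID arrays are typically required.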

Capture hardware with HDMI, DVI, and HD-SDI inputs is available for PCI Express (in some cases multi-channel), ExpressCard, USB 3.0, [2] and Thunderbolt [3] [4] [5] interfaces, including devices that support 2160p (4K resolution). [6] [7]

Software for recording uncompressed video is often supplied with suitable capture hardware, or is available for free, e.g. Ingex. [8]

Network transmission

SMPTE 2022 and 2110 are standards for professional digital video over IP networks. SMPTE 2022 includes provisions for both compressed and uncompressed video formats. SMPTE 2110 carries uncompressed video, audio, and ancillary data as separate streams.

Wireless interfaces such as Wireless LAN (WLAN, Wi-Fi), WiDi, and Wireless Home Digital Interface can be used to transmit uncompressed standard definition (SD) video but not HD video because the HD bit rates would exceed the network bandwidth. HD can be transmitted using higher-speed interfaces such as WirelessHD and WiGig. In all cases, when video is conveyed over a network, communication disruptions or diminished bandwidth can corrupt the video or prevent its transmission.

Data rates

Uncompressed video has a constant bitrate that is based on pixel representation, image resolution, and frame rate:

data rate = color depth [note 1] × vertical resolution [note 2] × horizontal resolution × refresh frequency

For example, 1080p video (1920 × 1080 pixels, progressive scan) at 60 frames per second with 24-bit color has a data rate of 24 × 1080 × 1920 × 60 ≈ 2.99 Gbit/s.

The actual data rate may be higher because some transmission media for uncompressed video require defined blanking intervals, which effectively add unused pixels around the visible image.
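The formula above can be sketched in code; the interlaced halving described in the notes is included as a flag. This is an illustrative sketch that ignores blanking intervals:

```python
def uncompressed_bitrate(width, height, fps, color_depth=24, interlaced=False):
    """Data rate in bits per second:
    color depth x vertical resolution x horizontal resolution x refresh rate.
    Interlaced formats carry only half the lines per field period."""
    lines = height // 2 if interlaced else height
    return color_depth * lines * width * fps

# 1080p at 60 frames/s, 24-bit color: ~2.99 Gbit/s
print(uncompressed_bitrate(1920, 1080, 60) / 1e9)                   # 2.985984
# 1080i at 60 fields/s: half the progressive rate
print(uncompressed_bitrate(1920, 1080, 60, interlaced=True) / 1e9)  # 1.492992
```

The result is the payload rate only; as noted above, interfaces that require blanking intervals carry additional non-image data on top of this.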

Notes

  1. Color depth can usually be calculated as 3 × the per-channel depth. For example, if each color channel is represented with values from 0 to 255 (8 bits), the total color depth is 3 × 8 = 24 bits.
  2. Interlaced video formats transmit every other line (half the picture content) per field period. Two fields are required for a full frame, so the vertical resolution is halved in this calculation.

Related Research Articles

Digital video: Digital electronic representation of moving visual images

Digital video is an electronic representation of moving visual images (video) in the form of encoded digital data. This is in contrast to analog video, which represents moving visual images in the form of analog signals. Digital video comprises a series of digital images displayed in rapid succession, usually at 24, 30, or 60 frames per second. Digital video has many advantages such as easy copying, multicasting, sharing and storage.

In telecommunications and computing, bit rate is the number of bits that are conveyed or processed per unit of time.

Serial digital interface: Family of digital video interfaces

Serial digital interface (SDI) is a family of digital video interfaces first standardized by SMPTE in 1989. For example, ITU-R BT.656 and SMPTE 259M define digital video interfaces used for broadcast-grade video. A related standard, known as high-definition serial digital interface (HD-SDI), is standardized in SMPTE 292M; this provides a nominal data rate of 1.485 Gbit/s.

HDMI: Proprietary interface for transmitting digital audio and video data

High-Definition Multimedia Interface (HDMI) is a proprietary audio/video interface for transmitting uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source device, such as a display controller, to a compatible computer monitor, video projector, digital television, or digital audio device. HDMI is a digital replacement for analog video standards.

HDCAM: Magnetic tape-based videocassette format for HD video

HDCAM is a high-definition video digital recording videocassette version of Digital Betacam introduced in 1997 that uses an 8-bit discrete cosine transform (DCT) compressed 3:1:1 recording, in 1080i-compatible down-sampled resolution of 1440×1080, and adding 24p and 23.976 progressive segmented frame (PsF) modes to later models. The HDCAM codec uses rectangular pixels and as such the recorded 1440×1080 content is upsampled to 1920×1080 on playback. The recorded video bit rate is 144 Mbit/s. Audio is also similar, with four channels of AES3 20-bit, 48 kHz digital audio. Like Betacam, HDCAM tapes were produced in small and large cassette sizes; the small cassette uses the same form factor as the original Betamax. The main competitor to HDCAM was the DVCPRO HD format offered by Panasonic, which uses a similar compression scheme and bit rates ranging from 40 Mbit/s to 100 Mbit/s depending on frame rate.

1080p: Video mode

1080p is a set of HDTV high-definition video modes characterized by 1,920 pixels displayed across the screen horizontally and 1,080 pixels down the screen vertically; the p stands for progressive scan, i.e. non-interlaced. The term usually assumes a widescreen aspect ratio of 16:9, implying a resolution of 2.1 megapixels. It is often marketed as Full HD or FHD, to contrast 1080p with 720p resolution screens. Although 1080p is sometimes informally referred to as 2K, these terms reflect two distinct technical standards, with differences including resolution and aspect ratio.

Digital Cinema Initiatives, LLC (DCI) is a consortium of major motion picture studios, formed to establish specifications for a common systems architecture for digital cinema systems.

DisplayPort: Digital display interface

DisplayPort (DP) is a digital display interface developed by a consortium of PC and chip manufacturers and standardized by the Video Electronics Standards Association (VESA). It is primarily used to connect a video source to a display device such as a computer monitor. It can also carry audio, USB, and other forms of data.

Asynchronous serial interface: Standardised transport interface for the broadcast industry

Asynchronous Serial Interface, or ASI, is a method of carrying an MPEG Transport Stream (MPEG-TS) over 75-ohm copper coaxial cable or optical fiber. It is popular in the television industry as a means of transporting broadcast programs from the studio to the final transmission equipment before it reaches viewers sitting at home.

WirelessHD, also known as UltraGig, is a proprietary standard owned by Silicon Image for wireless transmission of high-definition video content for consumer electronics products. The consortium currently has over 40 adopters; key members behind the specification include Broadcom, Intel, LG, Panasonic, NEC, Samsung, SiBEAM, Sony, Philips and Toshiba. The founders intend the technology to be used for consumer electronics devices, PCs, and portable devices.

SMPTE 424M is a standard published by SMPTE which expands upon SMPTE 259M, SMPTE 344M, and SMPTE 292M allowing for bit-rates of 2.970 Gbit/s and 2.970/1.001 Gbit/s over a single-link coaxial cable. These bit-rates are sufficient for 1080p video at 50 or 60 frames per second. The initial 424M standard was published in 2006, with a revision published in 2012. This standard is part of a family of standards that define a serial digital interface (SDI); it is commonly known as 3G-SDI.

SxS: Memory card format

SxS (S-by-S) is a flash memory card standard created by Sony and SanDisk, compliant with the ExpressCard standard. According to SanDisk and Sony, the cards have a transfer rate of 800 Mbit/s and a burst transfer rate of up to 2.5 Gbit/s over the ExpressCard's PCI Express interface. Sony uses these cards as the storage medium for its XDCAM EX line of professional video cameras.

Mobile High-Definition Link (MHL) is an industry standard for a mobile audio/video interface that allows the connection of smartphones, tablets, and other portable consumer electronics devices to high-definition televisions (HDTVs), audio receivers, and projectors. The standard was designed to share existing mobile device connectors, such as Micro-USB, and avoid the need to add video connectors on devices with limited space for them.

Ingex is an open-source (GPL) suite of software for the digital capture of audio and video data, without the need for traditional audio or video tape or cassettes. Serial digital interface (SDI) capture is supported, as well as real-time transcoding. Portions of the software suite also act as a network file server for media files, as well as archiving to LTO-3 data tape. Audio and video media files can also be stored on USB hard drives or Network Attached Storage. The software is heavily used by the BBC, and was developed by the BBC Research Laboratory.

4K resolution: Video or display resolutions with a width of around 4,000 pixels

4K resolution refers to a horizontal display resolution of approximately 4,000 pixels. Digital television and digital cinematography commonly use several different 4K resolutions. In television and consumer media, 3840 × 2160 is the dominant 4K standard, whereas the movie projection industry uses 4096 × 2160.

A Digital Cinema Package (DCP) is a collection of digital files used to store and convey digital cinema (DC) audio, image, and data streams.

HDBaseT: Point-to-point media connection over category cable

HDBaseT is a consumer electronic (CE) and commercial connectivity standard for transmission of uncompressed ultra-high-definition video, digital audio, DC power, Ethernet, USB 2.0, and other control communication over a single category cable up to 100 m (328 ft) in length, terminated using the same 8P8C modular connectors as used in Ethernet networks. HDBaseT technology is promoted and advanced by the HDBaseT Alliance.

XAVC is a recording format that was introduced by Sony on October 30, 2012. XAVC is a format that will be licensed to companies that want to make XAVC products.

Apple ProRes is a high quality, "visually lossless" lossy video compression format developed by Apple Inc. for use in post-production that supports video resolution up to 8K. It is the successor of the Apple Intermediate Codec and was introduced in 2007 with Final Cut Studio 2. Much like the H.26x and MPEG standards, the ProRes family of codecs use compression algorithms based on the discrete cosine transform (DCT). ProRes is widely used as a final format delivery method for HD broadcast files in commercials, features, Blu-ray and streaming.

Ultra-high-definition television: Television formats beyond HDTV

Ultra-high-definition television today includes 4K UHD and 8K UHD, which are two digital video formats with an aspect ratio of 16:9. These were first proposed by NHK Science & Technology Research Laboratories and later defined and approved by the International Telecommunication Union (ITU).

References