Serial digital interface

SDI
Serial digital interface uses BNC connectors (75 ohm)
Year started: 1989
Organization: SMPTE (The Society of Motion Picture and Television Engineers)

Serial digital interface (SDI) is a family of digital video interfaces first standardized by SMPTE (The Society of Motion Picture and Television Engineers) in 1989. [1] [2] For example, ITU-R BT.656 and SMPTE 259M define digital video interfaces used for broadcast-grade video. A related standard, known as high-definition serial digital interface (HD-SDI), is standardized in SMPTE 292M; this provides a nominal data rate of 1.485 Gbit/s. [3]


Additional SDI standards have been introduced to support increasing video resolutions (HD, UHD and beyond), frame rates, stereoscopic (3D) video, [4] [5] and color depth. [6] Dual link HD-SDI consists of a pair of SMPTE 292M links, standardized by SMPTE 372M in 1998; [2] this provides a nominal 2.970 Gbit/s interface used in applications (such as digital cinema or HDTV 1080P) that require greater fidelity and resolution than standard HDTV can provide. 3G-SDI (standardized in SMPTE 424M) consists of a single 2.970 Gbit/s serial link that allows replacing dual link HD-SDI. 6G-SDI and 12G-SDI standards were published on March 19, 2015. [7] [8]

These standards are used for transmission of uncompressed, unencrypted digital video signals (optionally including embedded audio and time code) within television facilities; they can also be used for packetized data. SDI is used to connect together different pieces of equipment such as recorders, monitors, PCs and vision mixers. Coaxial variants of the specification vary in reach but are typically limited to runs of less than 300 meters (980 ft). Fiber optic variants, such as SMPTE 297M, allow for long-distance transmission limited only by maximum fiber length or repeaters. SDI and HD-SDI are usually available only in professional video equipment, because various licensing agreements restrict the use of unencrypted digital interfaces, such as SDI, in consumer equipment. Several professional and HD-video-capable DSLR cameras, and all consumer cameras capable of outputting uncompressed video, instead use the HDMI interface, often called clean HDMI. Various mod kits exist for DVD players and other devices which allow a user to add a serial digital interface to them.[ citation needed ]

Electrical interface

The various serial digital interface standards all use (one or more) coaxial cables with BNC connectors, with a nominal impedance of 75 ohms. This is the same type of cable used in analog video setups, which potentially makes for easier upgrades (though higher quality cables may be necessary for long runs at the higher bitrates). The specified signal amplitude at the source is 800 mV (±10%) peak-to-peak; far lower voltages may be measured at the receiver owing to attenuation. Using equalization at the receiver, it is possible to send 270 Mbit/s SDI over 300 meters (980 ft) without use of repeaters, but shorter lengths are preferred. The HD bitrates have a shorter maximum run length, typically 100 meters (330 ft). [9] [10]

Uncompressed digital component signals are transmitted. Data is encoded in NRZI format, and a linear feedback shift register is used to scramble the data to reduce the likelihood that long strings of zeroes or ones will be present on the interface. The interface is self-synchronizing and self-clocking. Framing is done by detection of a special synchronization pattern, which appears in the unscrambled serial data as a sequence of ten ones followed by twenty zeroes (twenty ones followed by forty zeroes in HD); this bit pattern is not legal anywhere else within the data payload.
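
The scrambling and NRZI stages can be sketched in a few lines. This is an illustrative model only, assuming the common self-synchronizing generator polynomials G1(x) = x^9 + x^4 + 1 for the scrambler and G2(x) = x + 1 for the NRZI stage; it is not a validated implementation of the standard:

```python
def sdi_scramble(bits):
    """Self-synchronizing scrambler (taps at x^4 and x^9), then NRZI:
    a scrambled '1' toggles the line level, a '0' leaves it unchanged."""
    state = [0] * 9          # previous scrambler output bits
    level = 0                # current NRZI line level
    out = []
    for b in bits:
        s = b ^ state[3] ^ state[8]   # G1(x) = x^9 + x^4 + 1
        state = [s] + state[:-1]
        level ^= s                    # G2(x) = x + 1 (NRZI)
        out.append(level)
    return out

def sdi_descramble(line_bits):
    """Inverse of the above: recover NRZ bits from transitions, then
    reverse the self-synchronizing scrambler."""
    state = [0] * 9
    prev = 0
    out = []
    for l in line_bits:
        s = l ^ prev                  # a transition means a scrambled '1'
        prev = l
        out.append(s ^ state[3] ^ state[8])
        state = [s] + state[:-1]
    return out
```

Because the scrambler is self-synchronizing, the descrambler needs no shared state with the transmitter; it recovers after at most nine bits regardless of its starting state.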

Standards

Standard        Name               Introduced                 Bitrates                                             Example video formats
SMPTE 259M      SD-SDI             1989 [2]                   270 Mbit/s, 360 Mbit/s, 143 Mbit/s, and 177 Mbit/s   480i, 576i
SMPTE 344M      ED-SDI             2000 [11]                  540 Mbit/s                                           480p, 576p
SMPTE 292M      HD-SDI             1998 [2]                   1.485 Gbit/s, and 1.485/1.001 Gbit/s                 720p, 1080i
SMPTE 372M      Dual Link HD-SDI   2002 [2]                   2.970 Gbit/s, and 2.970/1.001 Gbit/s                 1080p60
SMPTE 424M      3G-SDI             2006 [2]                   2.970 Gbit/s, and 2.970/1.001 Gbit/s                 1080p60
SMPTE ST 2081   6G-SDI             2015 [7]                   6 Gbit/s                                             1080p120, 2160p30
SMPTE ST 2082   12G-SDI            2015 [8]                   12 Gbit/s                                            2160p60
SMPTE ST 2083   24G-SDI            In development [12] [13]   24 Gbit/s                                            2160p120, 4320p30

Bit rates

Several bit rates are used by serial digital video signals; the most common are listed in the table above.

Other interfaces

SMPTE 297-2006 defines an optical fiber system for transmitting bit-serial digital signals. It is intended for transmitting SMPTE ST 259 signals (143 through 360 Mbit/s), SMPTE ST 344 signals (540 Mbit/s), SMPTE ST 292-1/-2 signals (1.485 Gbit/s and 1.485/1.001 Gbit/s) and SMPTE ST 424 signals (2.970 Gbit/s and 2.970/1.001 Gbit/s). In addition to the optical specification, ST 297 also mandates laser safety testing and requires that all optical interfaces be labelled to indicate safety compliance, application and interoperability. [14]

An 8-bit parallel digital interface is defined by ITU-R Rec. 601; this is obsolete (however, many clauses in the various standards accommodate the possibility of an 8-bit interface).

Data format

In SD and ED applications, the serial data format is defined to be 10 bits wide, whereas in HD applications, it is 20 bits wide, divided into two parallel 10-bit datastreams (known as Y and C). The SD datastream is arranged like this:

Cb Y Cr Y' Cb Y Cr Y'

whereas the HD datastreams are arranged like this:

Y
Y Y' Y Y' Y Y' Y Y'
C
Cb Cr Cb Cr Cb Cr Cb Cr

For all serial digital interfaces (excluding the obsolete composite encodings), the native color encoding is 4:2:2 YCbCr format. The luminance channel (Y) is encoded at full bandwidth (13.5 MHz in 270 Mbit/s SD, ~75 MHz in HD), and the two chrominance channels (Cb and Cr) are subsampled horizontally and encoded at half bandwidth (6.75 MHz or 37.5 MHz). The Y, Cr, and Cb samples are co-sited (acquired at the same instant in time), and the Y' sample is acquired halfway between two adjacent Y samples.

In the above, Y refers to luminance samples, and C to chrominance samples. Cr and Cb further refer to the red and blue "color difference" channels; see Component Video for more information. This section only discusses the native color encoding of SDI; other color encodings are possible by treating the interface as a generic 10-bit data channel. The use of other colorimetry encodings, and the conversion to and from RGB colorspace, is discussed below.

Video payload (as well as ancillary data payload) may use any 10-bit word in the range 4 to 1,019 (0x004 to 0x3FB) inclusive; the values 0–3 and 1,020–1,023 (0x3FC–0x3FF) are reserved and may not appear anywhere in the payload. These reserved words have two purposes: they are used both for synchronization packets and for ancillary data headers.

Synchronization packets

A synchronization packet (commonly known as the timing reference signal or TRS) occurs immediately before the first active sample on every line, and immediately after the last active sample (and before the start of the horizontal blanking region). The synchronization packet consists of four 10-bit words; the first three words are always the same: 0x3FF, 0x000, 0x000. The fourth consists of 3 flag bits, along with an error-correcting code. As a result, there are 8 different synchronization packets possible.
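
The construction of the fourth word can be sketched as follows. The parity equations shown are the conventional TRS protection scheme (an assumption here, though they reproduce well-known SAV/EAV values such as 0x200 and 0x274):

```python
def xyz_word(f, v, h):
    """Build the fourth (XYZ) word of a TRS packet from the F, V, H flags.
    Bit 9 is always 1 and bits 1-0 are always 0; P3..P0 are parity bits
    that allow a receiver to correct a single-bit error in the flags."""
    p3 = v ^ h
    p2 = f ^ h
    p1 = f ^ v
    p0 = f ^ v ^ h
    return ((1 << 9) | (f << 8) | (v << 7) | (h << 6)
            | (p3 << 5) | (p2 << 4) | (p1 << 3) | (p0 << 2))

# The 8 possible packets: [0x3FF, 0x000, 0x000, xyz_word(f, v, h)]
# for each combination of the three flag bits.
```

For example, `xyz_word(0, 0, 0)` yields 0x200 (SAV in the active picture) and `xyz_word(0, 0, 1)` yields 0x274 (the corresponding EAV).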

In the HD-SDI and dual link interfaces, synchronization packets must occur simultaneously in both the Y and C datastreams. (Some delay between the two cables in a dual link interface is permissible; equipment which supports dual link is expected to buffer the leading link in order to allow the other link to catch up). In SD-SDI and enhanced definition interfaces, there is only one datastream, and thus only one synchronization packet at a time. Other than the issue of how many packets appear, their format is the same in all versions of the serial-digital interface.

The flag bits found in the fourth word (commonly known as the XYZ word) are known as H, F, and V. The H bit indicates the start of horizontal blank; synchronization packets immediately preceding the horizontal blanking region must have H set to one. Such packets are commonly referred to as End of Active Video, or EAV packets. Likewise, the packet appearing immediately before the start of the active video has H set to 0; this is the Start of Active Video or SAV packet.

Likewise, the V bit is used to indicate the start of the vertical blanking region; an EAV packet with V=1 indicates the following line (lines are deemed to start at EAV) is part of the vertical interval, an EAV packet with V=0 indicates the following line is part of the active picture.

The F bit is used in interlaced and segmented-frame formats to indicate whether the line comes from the first or second field (or segment). In progressive scan formats, the F bit is always set to zero.

Line counter & CRC

In the high definition serial digital interface (and in dual-link HD), additional check words are provided to increase the robustness of the interface. In these formats, the four samples immediately following the EAV packets (but not the SAV packets) contain a cyclic redundancy check field, and a line count indicator. The CRC field provides a CRC of the preceding line (CRCs are computed independently for the Y and C streams), and can be used to detect bit errors in the interface. The line count field indicates the line number of the current line.
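
As a sketch, a CRC using the generator polynomial x^18 + x^5 + x^4 + 1 employed by the HD interface might be computed like this. The bit ordering (LSB first) and initial value (zero) are assumed conventions, so this illustrates the principle rather than reproducing the standard's exact procedure:

```python
def crc18_line(words):
    """CRC-18 with generator x^18 + x^5 + x^4 + 1 over a sequence of
    10-bit words, each processed LSB first, initial value 0 (assumed)."""
    POLY_REFLECTED = 0x23000        # x^18 + x^5 + x^4 + 1, bit-reversed
    crc = 0
    for w in words:
        for i in range(10):         # 10 bits per word, LSB first
            fb = (crc ^ (w >> i)) & 1
            crc >>= 1
            if fb:
                crc ^= POLY_REFLECTED
    return crc
```

A receiver runs the same computation over the received line (independently for the Y and C streams) and compares the result with the transmitted CRC field; any mismatch indicates a bit error on the link.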

The CRC and line counts are not provided in the SD and ED interfaces. Instead, a special ancillary data packet known as an EDH packet may be optionally used to provide a CRC check on the data.

Line and sample numbering

Each sample within a given datastream is assigned a unique line and sample number. In all formats, the first sample immediately following the SAV packet is assigned sample number 0; the next sample is sample 1; all the way up to the XYZ word in the following SAV packet. In SD interfaces, where there is only one datastream, the 0th sample is a Cb sample; the 1st sample a Y sample, the 2nd sample a Cr sample, and the third sample is the Y' sample; the pattern repeats from there. In HD interfaces, each datastream has its own sample numbering—so the 0th sample of the Y datastream is the Y sample, the next sample the Y' sample, etc. Likewise, the first sample in the C datastream is Cb, followed by Cr, followed by Cb again.
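
The numbering rules above reduce to simple modulo arithmetic; a small illustrative helper (the function names are ours, not from the standards):

```python
def sd_component(sample_number):
    """Component carried by a given SD-SDI sample number (single stream):
    the pattern Cb, Y, Cr, Y' repeats every four samples."""
    return ("Cb", "Y", "Cr", "Y'")[sample_number % 4]

def hd_component(stream, sample_number):
    """Component carried in the HD 'Y' or 'C' datastream at a sample number."""
    if stream == "Y":
        return ("Y", "Y'")[sample_number % 2]
    return ("Cb", "Cr")[sample_number % 2]
```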

Lines are numbered sequentially, starting from 1, up to the number of lines per frame of the indicated format (typically 525, 625, 750, or 1125 (Sony HDVS)). Determination of line 1 is somewhat arbitrary; however it is unambiguously specified by the relevant standards. In 525-line systems, the first line of vertical blank is line 1, whereas in other interlaced systems (625 and 1125-line), the first line after the F bit transitions to zero is line 1.

Note that lines are deemed to start at EAV, whereas sample zero is the sample following SAV. This produces the somewhat confusing result that the first sample in a given line of 1080i video is sample number 1920 (the first EAV sample in that format); the numbering then wraps through the blanking interval, and the line ends at sample 1919 (the last active sample in that format). Note that this behavior differs somewhat from analog video interfaces, where the line transition is deemed to occur at the sync pulse, which occurs roughly halfway through the horizontal blanking region.

Link numbering is only an issue in multi-link interfaces. The first link (the primary link), is assigned a link number of 1, subsequent links are assigned increasing link numbers; so the second (secondary) link in a dual-link system is link 2. The link number of a given interface is indicated by a VPID packet located in the vertical ancillary data space.

Note that the data layout in dual link is designed so that the primary link can be fed into a single-link interface, and still produce usable (though somewhat degraded) video. The secondary link generally contains things like additional LSBs (in 12-bit formats), non-cosited samples in 4:4:4 sampled video (so that the primary link is still valid 4:2:2), and alpha or data channels. If the second link of a 1080P dual link configuration is absent, the first link still contains a valid 1080i signal.

In the case of 1080p60, 59.94, or 50 Hz video over a dual link; each link contains a valid 1080i signal at the same field rate. The first link contains the 1st, 3rd, and 5th lines of odd fields and the 2nd, 4th, 6th, etc. lines of even fields, and the second link contains the even lines on the odd fields, and the odd lines on the even fields. When the two links are combined, the result is a progressive-scan picture at the higher frame rate.
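
The line distribution described above can be expressed as a small helper. This is a sketch derived directly from that description, using 1-based frame line numbers; it is not a verified implementation of the SMPTE 372M mapping:

```python
def split_dual_link(frame_lines):
    """Distribute a progressive frame's lines across two links, each of
    which then carries a valid interlaced picture. Per the description,
    link 1 carries the 1st, 3rd, 5th... lines of odd fields and the
    2nd, 4th, 6th... lines of even fields; link 2 carries the rest.
    Returns two lists of (frame_line_number, line) tuples."""
    link1, link2 = [], []
    for idx, line in enumerate(frame_lines):
        n = idx + 1                   # 1-based frame line number
        # Frame lines 1, 5, 9... and 4, 8, 12... land on link 1.
        if n % 4 in (1, 0):
            link1.append((n, line))
        else:
            link2.append((n, line))
    return link1, link2
```

Recombining the two links in ascending frame-line order reconstructs the full progressive frame.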

Ancillary data

Like SMPTE 259M, SMPTE 292M supports the SMPTE 291M standard for ancillary data. Ancillary data is provided as a standardized transport for non-video payload within a serial digital signal; it is used for things such as embedded audio, closed captions, timecode, and other sorts of metadata. Ancillary data is indicated by a 3-word packet header consisting of 0x000, 0x3FF, 0x3FF (the opposite of the synchronization packet header), followed by a two-word identification code, a data count word (indicating 0–255 words of payload), the actual payload, and a one-word checksum. Other than in their use in the header, the codes prohibited to video payload are also prohibited to ancillary data payload.
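
A minimal sketch of assembling such a packet follows. The parity rule (bit 8 = even parity over bits 0–7, bit 9 = its inverse) and the 9-bit checksum are the usual SMPTE 291M conventions; note that real user data words may use all 10 bits, and are parity-protected here only for simplicity:

```python
def anc_packet(did, sdid, payload):
    """Assemble an ancillary data packet (illustrative sketch).
    payload is a list of 8-bit values; returns a list of 10-bit words."""
    def protect(b):
        # Bit 8 carries even parity of bits 0-7; bit 9 is its inverse.
        p = bin(b & 0xFF).count("1") & 1
        return (b & 0xFF) | (p << 8) | ((p ^ 1) << 9)

    words = [protect(did), protect(sdid), protect(len(payload))]
    words += [protect(b) for b in payload]
    s = sum(words) & 0x1FF                       # 9-bit sum, DID..last UDW
    checksum = s | ((((s >> 8) & 1) ^ 1) << 9)   # bit 9 = NOT bit 8
    return [0x000, 0x3FF, 0x3FF] + words + [checksum]
```

A receiver recomputes the 9-bit sum over the same span and compares it against the checksum word to detect corruption of the packet.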

Specific applications of ancillary data include embedded audio, EDH, VPID and SDTI.

In dual link applications, ancillary data is mostly found on the primary link; the secondary link is used for ancillary data only if there is no room on the primary link. One exception to this rule is the VPID packet; both links must have a valid VPID packet present.

Embedded audio

Both the HD and SD serial interfaces provide for 16 channels of embedded audio, though the two interfaces use different audio encapsulation methods: SD uses the SMPTE 272M standard, whereas HD uses the SMPTE 299M standard. In either case, an SDI signal may contain up to sixteen audio channels (eight pairs) embedded along with the video. Typically, 48 kHz, 24-bit (20-bit in SD, but extendable to 24-bit) PCM audio is stored, in a manner directly compatible with the AES3 digital audio interface. The audio is placed in the (horizontal) blanking periods, when the SDI signal carries nothing useful, since the receiver generates its own blanking signals from the TRS.

In dual-link applications, 32 channels of audio are available, as each link may carry 16 channels.

SMPTE ST 299-2:2010 extends the 3G SDI interface to be able to transmit 32 audio channels (16 pairs) on a single link.

EDH

As the standard definition interface carries no checksum, CRC, or other data integrity check, an EDH (Error Detection and Handling) packet may optionally be placed in the vertical interval of the video signal. This packet includes CRC values for both the active picture and the entire field (excluding those lines at which switching may occur, and which should contain no useful data); equipment can compute its own CRC and compare it with the received CRC in order to detect errors.

EDH is typically only used with the standard definition interface; the presence of CRC words in the HD interface make EDH packets unnecessary.

VPID

VPID (or video payload identifier) packets are increasingly used to describe the video format. In early versions of the serial digital interface, it was always possible to uniquely determine the video format by counting the number of lines and samples between H and V transitions in the TRS. With the introduction of dual link interfaces, and segmented-frame standards, this is no longer possible; thus the VPID standard (defined by SMPTE 352M) provides a way to uniquely and unambiguously identify the format of the video payload.

Video payload and blanking

The active portion of the video signal is defined to be those samples which follow an SAV packet, and precede the next EAV packet; where the corresponding EAV and SAV packets have the V bit set to zero. It is in the active portion that the actual image information is stored.

Color encoding

Several color encodings are possible in the serial digital interface. The default (and most common) case is 10-bit linearly sampled video data encoded as 4:2:2 YCbCr. (YCbCr is a digital representation of the YPbPr colorspace.) Samples of video are stored as described above. Data words correspond to signal levels of the respective video components, as follows:

  • Luma (Y): code word 64 (0x040) corresponds to 0 mV (black), and 940 (0x3AC) corresponds to 700 mV (peak white).
  • Chroma (Cb and Cr): code word 512 (0x200) corresponds to 0 mV, with 64 (0x040) and 960 (0x3C0) corresponding to −350 mV and +350 mV respectively.

Note that the scaling of the luma and chroma channels is not identical. The minimum and maximum of these ranges represent the preferred signal limits, though the video payload may venture outside them (provided that the reserved code words 0–3 and 1020–1023 are never used for video payload). In addition, the corresponding analog signal may have excursions further outside this range.
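
The two scalings can be expressed as simple quantization helpers, assuming the conventional 10-bit narrow-range scale factors (876 code steps for the 700 mV luma excursion, 896 steps for the ±350 mV chroma excursion):

```python
def quantize_luma10(y):
    """Map a normalized luma value (0.0 = 0 mV black, 1.0 = 700 mV white)
    to a 10-bit code word in the preferred range 64..940."""
    return 64 + round(876 * y)

def quantize_chroma10(c):
    """Map a normalized colour-difference value (-0.5..+0.5, 0.0 = 0 mV)
    to a 10-bit code word in the preferred range 64..960, centred on 512."""
    return 512 + round(896 * c)
```

The different offsets (64 vs 512) and spans (876 vs 896) are exactly the non-identical scaling the text refers to.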

Colorimetry

As YPbPr (and YCbCr) are both derived from the RGB colorspace, a means of converting is required. There are three colorimetries typically used with digital video:

  • SD and ED applications typically use a colorimetry matrix specified in ITU-R Rec. 601.
  • Most HD, dual link, and 3 Gbit/s applications use a different matrix, specified in ITU-R Rec. 709.
  • The 1035-line HD standards specified by SMPTE 260M (primarily used in Japan and now largely considered obsolete), used a colorimetry matrix specified by SMPTE 240M. This colorimetry is nowadays rarely used, as the 1035-line formats have been superseded by 1080-line formats.
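
The practical difference between these matrices lies mainly in the luma coefficients. A minimal sketch, showing coefficients only and ignoring gamma and chroma scaling:

```python
# Luma coefficients (Kr, Kg, Kb) for the two common colorimetries.
REC601 = (0.299, 0.587, 0.114)      # ITU-R Rec. 601 (SD/ED)
REC709 = (0.2126, 0.7152, 0.0722)   # ITU-R Rec. 709 (HD and above)

def luma(r, g, b, coeffs):
    """Weighted sum of normalized R'G'B' components giving luma Y'."""
    kr, kg, kb = coeffs
    return kr * r + kg * g + kb * b
```

Decoding Rec. 709 material with a Rec. 601 matrix (or vice versa) produces visible hue errors, which is why the VPID and format signalling described elsewhere in this article matter.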

Other color encodings

The dual-link and 3 Gbit/s interfaces additionally support other color encodings besides 4:2:2 YCbCr, namely:

  • 4:2:2 and 4:4:4 YCbCr, with an optional alpha (used for linear keying, a.k.a. alpha compositing) or data (used for non-video payload) channel
  • 4:4:4 RGB, also with an optional alpha or data channel
  • 4:2:2 YCbCr, 4:4:4 YCbCr, and 4:4:4 RGB, with 12 bits of color information per sample, rather than 10. Note that the interface itself is still 10 bit; the additional 2 bits per channel are multiplexed into an additional 10-bit channel on the second link.

If an RGB encoding is used, the three primaries are all encoded in the same fashion as the Y channel; a value of 64 (0x040) corresponds to 0 mV, and 940 (0x3AC) corresponds to 700 mV.

12-bit applications are scaled in a similar fashion to their 10-bit counterparts; the additional two bits are considered to be LSBs.

Vertical and horizontal blanking regions

For portions of the vertical and horizontal blanking regions which are not used for ancillary data, it is recommended that the luma samples be assigned the code word 64 (0x040), and the chroma samples be assigned 512 (0x200), both of which correspond to 0 mV. It is permissible to encode analog vertical interval information (such as vertical interval timecode or vertical interval test signals) without breaking the interface, but such usage is nonstandard (ancillary data is the preferred means for transmitting metadata). Conversion of analog sync and burst signals into digital form, however, is not recommended; neither is necessary in the digital interface.

Different picture formats have different requirements for digital blanking; for example, all so-called 1080-line HD formats have 1080 active lines but 1125 total lines, with the remainder being vertical blanking. [1]

Supported video formats

The various versions of the serial digital interface support numerous video formats.

In addition to the regular serial digital interface described here, there are several other interfaces which are similar to, or contained within, a serial digital interface.

SDTI

There is an expanded specification called SDTI (Serial Data Transport Interface), which allows compressed (i.e. DV, MPEG and others) video streams to be transported over an SDI line. This allows for multiple video streams in one cable or faster-than-realtime (2x, 4x,...) video transmission. A related standard, known as HD-SDTI, provides similar capability over an SMPTE 292M interface.

The SDTI interface is specified by SMPTE 305M. The HD-SDTI interface is specified by SMPTE 348M.

ASI

The asynchronous serial interface (ASI) specification describes how to transport an MPEG transport stream (MPEG-TS), containing multiple MPEG video streams, over 75-ohm copper coaxial cable or multimode optical fiber. ASI is a popular way to transport broadcast programs from the studio to the final transmission equipment before they reach viewers sitting at home.

The ASI standard is part of the Digital Video Broadcast (DVB) standard.

SMPTE 349M

The standard SMPTE 349M: Transport of Alternate Source Image Formats through SMPTE 292M, specifies a means to encapsulate non-standard and lower-bitrate video formats within an HD-SDI interface. This standard allows, for example, several independent standard definition video signals to be multiplexed onto an HD-SDI interface, and transmitted down one wire. This standard doesn't merely adjust EAV and SAV timing to meet the requirements of the lower-bitrate formats; instead, it provides a means by which an entire SDI format (including synchronization words, ancillary data, and video payload) can be encapsulated and transmitted as ordinary data payload within a 292M stream.

High-Definition Multimedia Interface (HDMI)

HDMI to SDI converter

The HDMI interface is a compact audio/video interface for transferring uncompressed video data and compressed/uncompressed digital audio data from an HDMI-compliant device to a compatible computer monitor, video projector, digital television, or digital audio device. It is mainly used in the consumer area, but is increasingly found on professional devices that output uncompressed video over HDMI, often called clean HDMI.

G.703

The G.703 standard is another high-speed digital interface, originally designed for telephony.

HDcctv

The HDcctv standard embodies the adaptation of SDI for video surveillance applications, not to be confused with TDI, a similar but different format for video surveillance cameras.

CoaXPress

The CoaXPress standard is another high-speed digital interface, originally designed for industrial camera interfaces. The data rates for CoaXPress go up to 12.5 Gbit/s over a single coaxial cable. A 41 Mbit/s uplink channel and power over coax are also included in the standard.


References

  1. 1 2 Charles A. Poynton (2003). Digital Video and HDTV. Morgan Kaufmann. ISBN   978-1-55860-792-7.
  2. 1 2 3 4 5 6 John Hudson (2013). "3Gb/s SDI for Transport of 1080p50/60, 3D, UHDTV1 / 4k and Beyond" (PDF).
  3. Francis Rumsey, John Watkinson (2004). Digital interface handbook. ISBN   9780240519098.
  4. "AJA Introduces Hi5-3D Mini-Converter For Stereo 3D Monitoring by Scott Gentry - ProVideo Coalition".
  5. http://pro.jvc.com/pro/attributes/HDTV/manual/IF_2D3D1MANUAL_061810.pdf
  6. "Signalling and Transport of HDR and Wide Colour Gamut Video over 3G-SDI Interfaces". May 23, 2020 via tech.ebu.ch.
  7. 1 2 ST 2081-10:2015 - 2160-Line and 1080-Line Source Image and Ancillary Data Mapping for Single-Link 6G-SDI. IEEE. 2015-03-19. doi:10.5594/SMPTE.ST2081-10.2015.
  8. 1 2 ST 2082-10:2015 - 2160-line Source Image and Ancillary Data Mapping for 12G-SDI. IEEE. 2015-03-19. doi:10.5594/SMPTE.ST2082-10.2015.
  9. Advice on the use of 3 Gbit/s HD-SDI interfaces (PDF). European Broadcasting Union. July 2011. Retrieved 20 July 2015.
  10. "Recommended Transmission Distance at Serial Digital Data Rates" (PDF). Belden. Belden. Archived from the original (PDF) on 2015-02-26. Retrieved 20 July 2015.
  11. "Transport of alternate source formats through Recommendation ITU-R BT.1120" (PDF). International Telecommunication Union. Retrieved February 27, 2019.
  12. [ dead link ]
  13. "March 2014 Standards Quarterly Report (page 28)" (PDF). SMPTE. SMPTE. Retrieved 19 September 2014.
  14. SMPTE (2013). "3Gb/s SDI for Transport of 1080p50/60, 3D, UHDTV1 / 4k and Beyond" (PDF).
