Ancillary data

Ancillary data is data that has been added to other data and is carried over the same transport. Common examples are cover art images for media files or streams, or digital data added to radio or television broadcasts.

Television

Ancillary data (commonly abbreviated as ANC data), in the context of television systems, refers to a means by which non-video information (such as audio, other forms of essence, and metadata) may be embedded within the serial digital interface. Ancillary data is standardized by SMPTE as SMPTE 291M: Ancillary Data Packet and Space Formatting.

Ancillary data can be located in non-picture portions of horizontal scan lines, known as Horizontal ANCillary data (HANC). Ancillary data can also be located in non-picture regions of the video frame, known as Vertical ANCillary data (VANC).

Technical details

Location

Ancillary data packets may be located anywhere within a serial digital data stream, with the following exceptions:

  • They should not be located in the lines identified as a switch point (which may be lost when switching sources).
  • They should not be located in the active picture area.
  • They may not cross the TRS (timing reference signal) packets.

Ancillary data packets are commonly divided into two types, depending on where they are located—specific packet types are often constrained to be in one location or another.

  • Ancillary packets located in the horizontal blanking region (after EAV but before SAV), regardless of line, are known as horizontal ancillary data, or HANC. HANC is commonly used for higher-bandwidth data, and/or for things that need to be synchronized to a particular line; the most common type of HANC is embedded audio.
  • Ancillary packets located in the vertical blanking region, and after SAV but before EAV, are known as vertical ancillary data, or VANC. VANC is commonly used for low-bandwidth data, or for things that need to be updated only once per field or frame. Closed caption data and VPID are generally stored as VANC.

Note that ANC packets which lie in the data space that is in both the horizontal and vertical blanking intervals are considered to be HANC, not VANC.
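The classification rule above can be sketched as a small helper (an illustrative function of my own devising, not from any SMPTE standard):

```python
def classify_anc(in_horizontal_blanking: bool, in_vertical_blanking: bool) -> str:
    """Classify an ANC packet's location per the rule above: anything in
    the horizontal blanking region is HANC, even on vertical-blanking
    lines; only packets in the active portion of vertical-blanking
    lines are VANC."""
    if in_horizontal_blanking:
        return "HANC"
    if in_vertical_blanking:
        return "VANC"
    raise ValueError("ANC packets may not appear in the active picture area")
```

Note that the horizontal-blanking test comes first, which is exactly what makes a packet in the intersection of both intervals HANC rather than VANC.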

VANC packets should be inserted in this manner:

  • (SMPTE 334M section 3): VANC data packets can appear anywhere between the SAV and EAV TRS packets in any line from the second line after the line specified for switching to the last line preceding active video, inclusive. Given the spec for switch points (see RP168 figure 2), the first allowed lines are 12 and 275 (for 525-line/59.94 Hz systems) or 8 and 321 (for 625-line/50 Hz systems). This conflicts with SMPTE 125M, and does not address requirements for carrying DVITC (Digital Vertical Interval TimeCode) and video index packets.
  • (SMPTE 125M section 3.6.2): VANC should appear only in lines 1-13, 15-19, 264-276, and 278-282, with lines 14 and 277 reserved for DVITC and video index data. This conflicts with SMPTE 334M, and does not address 625-line/50 Hz systems.
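The SMPTE 334M reading above can be captured in a small lookup (a hypothetical helper; the assignment of the first number to field 1 and the second to field 2 is my assumption):

```python
def first_allowed_vanc_line(field: int, system: str) -> int:
    """First line on which VANC is permitted under the SMPTE 334M
    interpretation above: two lines after the RP 168 switching line.
    'system' is '525' (59.94 Hz) or '625' (50 Hz); 'field' is 1 or 2."""
    table = {
        ("525", 1): 12, ("525", 2): 275,
        ("625", 1): 8,  ("625", 2): 321,
    }
    return table[(system, field)]
```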

Packet format

All ANC packets must start with a start sequence; for component interfaces (the only kind of serial digital interface in widespread use today), the start sequence is 0x000 0x3FF 0x3FF. This sequence is otherwise illegal in the serial digital interface. (In the obsolete composite versions of SDI, the ANC start sequence is a single word, 0x3FC).

Three words immediately follow the start sequence in the header. The first word after the start sequence is the Data Identifier or DID, followed by either a Secondary Data Identifier (SDID) or a Data Block Number (DBN), followed by a Data Count (DC). After the Data Count word are 0-255 (inclusive) User Data Words (UDW), followed by a Checksum (CS) word.
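The layout just described can be sketched as a parser over a list of 10-bit words (a simplified illustration assuming a component interface; the function and field names are my own):

```python
START_SEQ = [0x000, 0x3FF, 0x3FF]  # component-interface start sequence

def parse_anc_packet(words):
    """Split one ANC packet (a list of 10-bit words) into its fields:
    DID, SDID/DBN, data count, user data words, and checksum."""
    if words[:3] != START_SEQ:
        raise ValueError("missing ANC start sequence (0x000 0x3FF 0x3FF)")
    did, sdid_or_dbn, dc = words[3], words[4], words[5]
    count = dc & 0xFF            # low 8 bits of DC carry the UDW count
    udw = words[6:6 + count]
    checksum = words[6 + count]
    return {"DID": did, "SDID/DBN": sdid_or_dbn, "DC": count,
            "UDW": udw, "CS": checksum}
```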

DID

The Data Identifier word (along with the SDID, if used), indicates the type of ancillary data that the packet corresponds to. Data identifiers range from 1 to 255 (FF hex), with 0 being reserved. As the serial digital interface is a 10-bit format, the DID word is encoded as follows:

  • Bits 0-7 (bit 0 being the LSB), are the raw DID value.
  • Bit 8 is the even parity bit of bits 0-7.
  • Bit 9 is the inverse of bit 8.

Thus, a DID of 0x61 (01100001) would be encoded as 0x161 (0101100001), whereas a DID of 0x63 (01100011) would be encoded as 0x263 (1001100011). Note that this encoding scheme ensures that the reserved values in the serial digital interface (0-3 and 1020-1023) are never used.
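The encoding above can be expressed as a short helper (illustrative only; the name is my own), which applies equally to the SDID, DBN, and DC words described below:

```python
def encode_8bit_word(value: int) -> int:
    """Encode an 8-bit value as a 10-bit word: bits 0-7 are the value,
    bit 8 is the even parity of bits 0-7, bit 9 is the inverse of bit 8."""
    assert 0 <= value <= 0xFF
    bit8 = bin(value).count("1") & 1   # 1 iff the value has an odd number of ones
    bit9 = bit8 ^ 1
    return (bit9 << 9) | (bit8 << 8) | value
```

Because bit 9 is always the inverse of bit 8, the two high bits are never both 0 or both 1, so the encoded word can never fall in the reserved ranges 0-3 or 1020-1023.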

If the DID is equal to 128 (0x80) or greater, the packet is a Type 1 packet; the DID alone identifies the packet type, and the following word is a Data Block Number. If the DID is less than 128, it is a Type 2 packet, and the following word is the Secondary Data Identifier; the DID and SDID together identify the packet type.

SDID

The SDID is only valid if the DID is less than 0x80. The SDID is nominally an 8-bit value, ranging from 0 to 255. It is encoded in the same fashion as the DID.

For example, DID/SDID words of 161 and 101 (hex) correspond to a DID of 61 hex and an SDID of 1 (once the two high bits are removed); these values indicate that the packet type is defined by SMPTE 334M and contains DTV closed caption data.
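The decoding rules above can be sketched as (a hypothetical helper; names are my own):

```python
def decode_anc_type(did_word: int, second_word: int):
    """Strip the two high (parity/inverse) bits and classify the packet:
    DID >= 0x80 means Type 1 (second word is a DBN); DID < 0x80 means
    Type 2 (second word is the SDID)."""
    did = did_word & 0xFF
    second = second_word & 0xFF
    if did >= 0x80:
        return ("Type 1", did, second)
    return ("Type 2", did, second)
```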

DBN

The DBN is only valid if the DID is 80 hex or greater. It is (optionally) used to identify multiple packets of the same type within a field; each subsequent packet of the indicated type has a DBN which is one higher than the previous packet, wrapping around as necessary. The DBN is an 8-bit value, encoded in the same fashion as the SDID.
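The wrap-around behaviour can be sketched as follows (an assumption on my part: active DBNs are taken to count from 1 to 255 and wrap back to 1, with 0 meaning the DBN is unused):

```python
def next_dbn(dbn: int) -> int:
    """Next Data Block Number for a sequence of same-type packets,
    assuming active DBNs run 1-255 and wrap back to 1."""
    return 1 if dbn >= 255 else dbn + 1
```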

DC

The Data Count word is an 8-bit value, encoded in the same fashion as the DID, which indicates how many user data words are to follow. It can range from 0 to 255.

UDW

User data words are the "payload" present in the ANC packet. They are defined according to the packet type; SMPTE 291M does not define their use or impose any restrictions on the values which may be present in the UDW space, other than that the reserved values in the serial digital interface (0-3 and 1020-1023) may not be included in the UDW. Many ANC formats, though not all, are essentially 8-bit formats, and encode data in the same manner that the header words are encoded.

Example

SMPTE 352M (Video Payload ID) defines four UDW:

| Bit | Byte 1 | Byte 2 | Byte 3 | Byte 4 |
|---|---|---|---|---|
| 7 | 1 | Interlaced (0) or Progressive (1) transport | Reserved | Reserved |
| 6 | 0 | Interlaced (0) or Progressive (1) picture | Horizontal Y′/Y sampling: 1920 (0) or 2048 (1) | Reserved |
| 5 | 0 | Reserved | Reserved | Reserved |
| 4 | 0 | Reserved | Reserved | Dynamic range: 100% (0h), 200% (1h), 400% (2h), Reserved (3h) |
| 3 | 1 | Picture rate (see SMPTE 352M table 2) | Sampling structure (see SMPTE 352M table 3 and Note 1) | Dynamic range: 100% (0h), 200% (1h), 400% (2h), Reserved (3h) |
| 2 | 0 | Picture rate (see SMPTE 352M table 2) | Sampling structure (see SMPTE 352M table 3 and Note 1) | Reserved |
| 1 | 0 | Picture rate (see SMPTE 352M table 2) | Sampling structure (see SMPTE 352M table 3 and Note 1) | Bit depth: 8-bit (0h), 10-bit (1h), 12-bit (2h), Reserved (3h) |
| 0 | 1 | Picture rate (see SMPTE 352M table 2) | Sampling structure (see SMPTE 352M table 3 and Note 1) | Bit depth: 8-bit (0h), 10-bit (1h), 12-bit (2h), Reserved (3h) |
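Reading the table's byte 4 fields can be sketched as (bit positions taken from the table above; the helper itself is hypothetical):

```python
def decode_vpid_byte4(b: int):
    """Decode the dynamic-range (bits 4-3) and bit-depth (bits 1-0)
    fields of VPID byte 4, per the table above; other bits are reserved."""
    dyn = {0: "100%", 1: "200%", 2: "400%"}.get((b >> 3) & 0x3, "Reserved")
    depth = {0: "8-bit", 1: "10-bit", 2: "12-bit"}.get(b & 0x3, "Reserved")
    return dyn, depth
```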
Checksum

The last word in an ANC packet is the Checksum word. It is computed as the sum (modulo 512) of bits 0-8 (not bit 9) of all the other words in the ANC packet, excluding the packet start sequence. Bit 9 of the checksum word is then defined as the inverse of bit 8. Note that the checksum word does not contain a parity bit; instead, the parity bits of the other words are included in the checksum calculation.
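The calculation can be sketched as (an illustrative helper, not code from the standard):

```python
def anc_checksum(words):
    """Compute the ANC checksum word over the header and user data words
    (everything after the start sequence, excluding the checksum itself).
    Sum bits 0-8 of each word modulo 512; bit 9 is the inverse of bit 8."""
    s = sum(w & 0x1FF for w in words) & 0x1FF   # modulo-512 sum of bits 0-8
    bit9 = ((s >> 8) & 1) ^ 1                   # inverse of bit 8
    return (bit9 << 9) | s
```

For a header-only packet (DC of zero), the checksum covers just the DID, SDID/DBN, and DC words.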

Usage

Embedded audio

Embedded audio is audio payload which is (typically) the soundtrack (music, dialogue, and sound effects) for the video program. Two standards, SMPTE 272M (for SD) and SMPTE 299M (for HD and 3G), define how audio is embedded into the ancillary space. The SD and HD standards provide for up to 16 channels of PCM audio, while 3G allows up to 32 channels, typically encoded in the AES3 format. In HD, the embedded audio data packets are carried in the HANC space of the Cb/Cr (chroma) parallel data stream.

In addition, both standards define audio control packets. The audio control packets are carried in the HANC space of the Y (luminance) parallel data stream and are inserted once per field at the second video line past the switching point (see SMPTE RP168 for switching points of various video standards). The audio control packet contains audio-related metadata, such as its timing relative to video, which channels are present, etc.

Embedded audio packets are Type 1 packets.

EDH

EDH packets are used for error detection in standard definition interfaces (they are not necessary in HD interfaces, as the HD-SDI interface includes CRC checkwords built in).
