MPEG-21

The MPEG-21 standard, from the Moving Picture Experts Group, aims to define an open framework for multimedia applications. MPEG-21 is ratified as ISO/IEC 21000 - Multimedia framework (MPEG-21). [1] [2] [3] [4] [5] [6]

MPEG-21 is based on two essential concepts: the definition of a fundamental unit of distribution and transaction, the Digital Item, and the concept of users interacting with Digital Items.

Digital Items can be considered the kernel of the Multimedia Framework, and users are the entities that interact with them inside that framework. At its most basic level, MPEG-21 provides a framework in which one user interacts with another, and the object of that interaction is a Digital Item. The main objective of MPEG-21 is therefore to define the technology needed to support users in exchanging, accessing, consuming, trading and otherwise manipulating Digital Items in an efficient and transparent way.
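
For concreteness, the sketch below assembles a minimal Digital Item Declaration using Python's standard library. The element names (DIDL, Item, Descriptor, Statement, Component, Resource) follow the Digital Item Declaration Language of MPEG-21 Part 2, but the namespace string and all example values (the descriptor text and the resource path) are assumptions made for illustration.

```python
# A minimal sketch of a Digital Item Declaration built with Python's
# standard library. The DIDL element names follow MPEG-21 Part 2; the
# namespace string and all example values are illustrative assumptions.
import xml.etree.ElementTree as ET

DIDL_NS = "urn:mpeg:mpeg21:2002:02-DIDL-NS"  # assumed DIDL namespace
ET.register_namespace("didl", DIDL_NS)

def q(tag: str) -> str:
    """Qualify a tag name with the DIDL namespace (Clark notation)."""
    return f"{{{DIDL_NS}}}{tag}"

# Root DIDL document containing a single Item (the Digital Item).
didl = ET.Element(q("DIDL"))
item = ET.SubElement(didl, q("Item"))

# A Descriptor carrying free-text metadata about the Item.
descriptor = ET.SubElement(item, q("Descriptor"))
statement = ET.SubElement(descriptor, q("Statement"), mimeType="text/plain")
statement.text = "Example photo album"  # hypothetical description

# A Component binding the Item to an actual media resource.
component = ET.SubElement(item, q("Component"))
ET.SubElement(component, q("Resource"),
              mimeType="image/jpeg",
              ref="photos/holiday-001.jpg")  # hypothetical resource

print(ET.tostring(didl, encoding="unicode"))
```

The point of the declaration is that metadata (the Descriptor) and references to resources (the Component) travel together as one structured object, which is what makes the Digital Item the unit of transaction between users.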

MPEG-21 Part 9: File Format defined the storage of an MPEG-21 Digital Item in a file format based on the ISO base media file format, with some or all of the Digital Item's ancillary data (such as movies, images or other non-XML data) within the same file. [7] It uses the filename extensions .m21 or .mp21 and the MIME type application/mp21. [8]
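
As a sketch of how such a file can be recognised, the snippet below reads the leading 'ftyp' box that ISO base media files start with and reports its brands. The assumption that an MPEG-21 file advertises a brand named 'mp21', and the file name used, are illustrative rather than quoted from Part 9.

```python
# A minimal sketch that reads the leading 'ftyp' box of an ISO base media
# file and reports its brands, e.g. to check whether a file declares an
# (assumed) MPEG-21 brand. Only plain 32-bit box sizes are handled; the
# file name is hypothetical.
import struct

def read_ftyp(path: str):
    with open(path, "rb") as f:
        size, box_type = struct.unpack(">I4s", f.read(8))
        if box_type != b"ftyp" or size < 16:
            raise ValueError("file does not start with a plain ftyp box")
        body = f.read(size - 8)
    major_brand = body[0:4].decode("ascii")
    minor_version = struct.unpack(">I", body[4:8])[0]
    compatible = [body[i:i + 4].decode("ascii") for i in range(8, len(body), 4)]
    return major_brand, minor_version, compatible

major, minor, compatible = read_ftyp("example.mp21")  # hypothetical file
print("major brand:", major, "compatible brands:", compatible)
if "mp21" in {major, *compatible}:
    print("file declares the (assumed) MPEG-21 'mp21' brand")
```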

Digital Rights Management

MPEG-21 also defines a "Rights Expression Language" (REL) standard as a means of managing restrictions on digital content usage. As an XML-based standard, the REL is designed to communicate machine-readable license information, and to do so in a "ubiquitous, unambiguous and secure" manner.
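
As an illustration of the kind of machine-readable licence the REL is meant to carry, the sketch below parses a simplified licence document. The grant structure (a licence containing grants that pair a right with a resource and optional conditions) mirrors the general model of the REL, but the namespace, element names and values are simplified placeholders, not the normative schema.

```python
# An illustrative sketch of reading a machine-readable licence in the
# spirit of the MPEG-21 Rights Expression Language. The namespace,
# element names and values below are simplified assumptions, not the
# normative REL schema.
import xml.etree.ElementTree as ET

REL_NS = "urn:example:rel"  # placeholder namespace for this sketch

LICENSE_XML = f"""
<license xmlns="{REL_NS}">
  <grant>
    <right>play</right>
    <resource>urn:example:track:123</resource>
    <condition type="validityInterval" notAfter="2026-01-01"/>
  </grant>
</license>
"""

def q(tag: str) -> str:
    return f"{{{REL_NS}}}{tag}"

root = ET.fromstring(LICENSE_XML)
for grant in root.findall(q("grant")):
    right = grant.findtext(q("right"))
    resource = grant.findtext(q("resource"))
    conditions = [c.attrib for c in grant.findall(q("condition"))]
    print(f"grant: {right} on {resource}, conditions: {conditions}")
```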

Among the aspirations for this standard, which the industry hopes will put an end to file sharing, is that it will constitute: "A normative open framework for multimedia delivery and consumption for use by all the players in the delivery and consumption chain. This open framework will provide content creators, producers, distributors and service providers with equal opportunities in the MPEG-21 enabled open market." [ citation needed ]

Related Research Articles

Moving Picture Experts Group alliance of working groups to set standards for multimedia coding

The Moving Picture Experts Group (MPEG) is an alliance of working groups established jointly by ISO and IEC that sets standards for media coding, including compression coding of audio, video, graphics and genomic data, and transmission and file formats for various applications. Together with JPEG, MPEG is organized under ISO/IEC JTC 1/SC 29 – Coding of audio, picture, multimedia and hypermedia information.

MPEG-2 Video encoding standard

MPEG-2 is a standard for "the generic coding of moving pictures and associated audio information". It describes a combination of lossy video compression and lossy audio data compression methods, which permit storage and transmission of movies using currently available storage media and transmission bandwidth. While MPEG-2 is not as efficient as newer standards such as H.264/AVC and H.265/HEVC, backwards compatibility with existing hardware and software means it is still widely used, for example in over-the-air digital television broadcasting and in the DVD-Video standard.

MPEG-4 is a method of defining compression of audio and visual (AV) digital data. It was introduced in late 1998 and designated a standard for a group of audio and video coding formats and related technology agreed upon by the ISO/IEC Moving Picture Experts Group (MPEG) under the formal standard ISO/IEC 14496 – Coding of audio-visual objects. Uses of MPEG-4 include compression of AV data for Internet video and CD distribution, voice and broadcast television applications. The MPEG-4 standard was developed by a group led by Touradj Ebrahimi and Fernando Pereira.

MPEG-1 Audio Layer II or MPEG-2 Audio Layer II is a lossy audio compression format defined by ISO/IEC 11172-3 alongside MPEG-1 Audio Layer I and MPEG-1 Audio Layer III (MP3). While MP3 is much more popular for PC and Internet applications, MP2 remains a dominant standard for audio broadcasting.

MPEG-7 is a multimedia content description standard, standardized in ISO/IEC 15938 and formally called Multimedia Content Description Interface. The description is associated with the content itself, to allow fast and efficient searching for material that is of interest to the user. Thus, it is not a standard which deals with the actual encoding of moving pictures and audio, like MPEG-1, MPEG-2 and MPEG-4. It uses XML to store metadata, and can be attached to timecode in order to tag particular events, or to synchronise lyrics to a song, for example.
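
As a rough illustration of XML metadata attached to a timecode, the sketch below tags a single event in a video timeline. The element names and namespace are simplified placeholders invented for this example, not the normative MPEG-7 schema.

```python
# An illustrative sketch of attaching a free-text annotation to a point
# in a video timeline, in the spirit of an MPEG-7 description. Element
# names, namespace and values are simplified placeholders, not the
# normative ISO/IEC 15938 schema.
import xml.etree.ElementTree as ET

NS = "urn:example:mpeg7-like"  # placeholder namespace for this sketch

def q(tag: str) -> str:
    return f"{{{NS}}}{tag}"

description = ET.Element(q("Description"))
video = ET.SubElement(description, q("Video"), id="clip-42")  # hypothetical id

# Tag a particular event at a given timecode: the metadata describes the
# content but does not replace or re-encode it.
event = ET.SubElement(video, q("TextAnnotation"))
ET.SubElement(event, q("MediaTimePoint")).text = "T00:01:30"  # 1 min 30 s in
ET.SubElement(event, q("FreeText")).text = "Goal scored"      # hypothetical tag

print(ET.tostring(description, encoding="unicode"))
```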

MPEG-4 Part 3 or MPEG-4 Audio is the third part of the ISO/IEC MPEG-4 international standard developed by Moving Picture Experts Group. It specifies audio coding methods. The first version of ISO/IEC 14496-3 was published in 1999.

Digital storage media command and control (DSM-CC) is a toolkit for developing control channels associated with MPEG-1 and MPEG-2 streams. It is defined in part 6 of the MPEG-2 standard and uses a client/server model connected via an underlying network.

MPEG-4 Part 2, MPEG-4 Visual, is a video compression format developed by the Moving Picture Experts Group (MPEG). It belongs to the MPEG-4 family of ISO/IEC standards. It uses block-wise motion compensation and a discrete cosine transform (DCT), similar to previous standards such as MPEG-1 Part 2 and H.262/MPEG-2 Part 2.
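
The transform step mentioned above can be sketched in a few lines: the snippet applies a two-dimensional DCT to one 8x8 block, coarsely quantises the coefficients and inverts the transform. It illustrates only the block transform and quantisation idea, not motion compensation or entropy coding, and the sample values are arbitrary.

```python
# A minimal sketch of block-based transform coding: forward 2-D DCT on
# one 8x8 block of samples, crude uniform quantisation, inverse DCT.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float)  # fake 8x8 pixel block

coeffs = dctn(block, norm="ortho")         # forward 2-D DCT
coarse = np.round(coeffs / 16) * 16        # illustrative quantisation step of 16
reconstructed = idctn(coarse, norm="ortho")

print("max reconstruction error:", np.abs(block - reconstructed).max())
```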

The Extensible MPEG-4 Textual Format (XMT) is a high-level, XML-based file format for storing MPEG-4 data in a way suitable for further editing. In contrast, the more common MPEG-4 Part 14 (MP4) format is less flexible and used for distributing finished content.

MPEG-4 Part 11, Scene description and application engine, was published as ISO/IEC 14496-11 in 2005. MPEG-4 Part 11 is also known by the technologies it defines, including BIFS, XMT and MPEG-J.

QuickTime File Format (QTFF) is a computer file format used natively by the QuickTime framework.

Digital Item is the basic unit of transaction in the MPEG-21 framework. It is a structured digital object, including a standard representation, identification and metadata.

MPEG-4 Part 14 MP4; digital format for storing video and audio

MPEG-4 Part 14 or MP4 is a digital multimedia container format most commonly used to store video and audio, but it can also be used to store other data such as subtitles and still images. Like most modern container formats, it allows streaming over the Internet. The only filename extension for MPEG-4 Part 14 files as defined by the specification is .mp4. MPEG-4 Part 14 is a standard specified as a part of MPEG-4.

MPEG Surround, also known as Spatial Audio Coding (SAC), is a lossy compression format for surround sound that provides a method for extending mono or stereo audio services to multi-channel audio in a backwards compatible fashion. The total bit rates used for the core and the MPEG Surround data are typically only slightly higher than the bit rates used for coding of the core alone. MPEG Surround adds a side-information stream to the core bit stream, containing spatial image data. Legacy stereo playback systems will ignore this side-information, while players supporting MPEG Surround decoding will output the reconstructed multi-channel audio.

MPEG-1 Audio Layer I, commonly abbreviated to MP1, is one of three audio formats included in the MPEG-1 standard. It is a deliberately simplified version of MPEG-1 Audio Layer II, created for applications where lower compression efficiency could be tolerated in return for a less complex algorithm that could be executed with simpler hardware requirements. While supported by most media players, the codec is considered largely obsolete, and replaced by MP2 or MP3.

BiM is an international standard defining a generic binary format for encoding XML documents.

DMIF, or Delivery Multimedia Integration Framework, is a uniform interface between the application and the transport that allows the MPEG-4 application developer to be independent of the underlying transport. DMIF was defined in MPEG-4 Part 6 in 1999. DMIF defines two interfaces: the DAI (DMIF-Application Interface) and the DNI (DMIF-Network Interface). A single application can run on different transport layers when supported by the right DMIF instantiation. Among other functionalities, MPEG-4 DMIF provides transparent access to content regardless of whether it is delivered by a remote interactive peer, by broadcast or from local storage.

The ISO base media file format (ISOBMFF) defines a general structure for time-based multimedia files such as video and audio. It is standardized in ISO/IEC 14496-12 – MPEG-4 Part 12. The same text was also published as ISO/IEC 15444-12.
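
Since the format organises a file as a sequence of boxes, each starting with a 32-bit size and a four-character type, a minimal reader can enumerate the top-level boxes as sketched below. The sketch ignores 64-bit box sizes, and the file name is hypothetical.

```python
# A minimal sketch that walks the top-level boxes of an ISO base media
# file and prints their types and sizes. Only 32-bit box sizes are
# expanded; 64-bit ("largesize") and zero-size boxes end the walk.
import struct

def list_top_level_boxes(path: str):
    boxes = []
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            if size in (0, 1):
                # 64-bit size or "box extends to end of file": stop here
                # to keep the sketch simple.
                break
            boxes.append((box_type.decode("ascii", "replace"), size))
            f.seek(size - 8, 1)  # skip the box payload
    return boxes

for box_type, size in list_top_level_boxes("example.mp4"):  # hypothetical file
    print(f"{box_type}: {size} bytes")
```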

High Efficiency Image File Format (HEIF) is a container format for storing individual images and image sequences. The standard covers multimedia files that can also include other media streams, such as timed text, audio and video.

MPEG-A is a group of standards for combining existing MPEG technologies into multimedia application formats, formally known as ISO/IEC 23000 - Multimedia Application Format, published since 2007.

References

  1. ISO. "ISO/IEC TR 21000-1:2004 - Information technology -- Multimedia framework (MPEG-21) -- Part 1: Vision, Technologies and Strategy" . Retrieved 2017-08-30.
  2. ISO. "ISO/IEC 21000-2:2005 - Information technology -- Multimedia framework (MPEG-21) -- Part 2: Digital Item Declaration" . Retrieved 2017-08-30.
  3. ISO. "ISO/IEC 21000-3:2003 - Information technology -- Multimedia framework (MPEG-21) -- Part 3: Digital Item Identification" . Retrieved 2017-08-30.
  4. MPEG. "About MPEG - Achievements". chiariglione.org. Archived from the original on July 8, 2008. Retrieved 2009-10-31.
  5. MPEG. "Terms of Reference". chiariglione.org. Archived from the original on February 21, 2010. Retrieved 2009-10-31.
  6. MPEG. "MPEG standards". chiariglione.org. Archived from the original on April 20, 2010. Retrieved 2009-10-31.
  7. ISO (2006). "MPEG-21 File Format white paper - Proposal". chiariglione.org. Retrieved 2010-08-20.
  8. ISO/IEC 21000-9 (First edition, 2005-07-01), Amendment 1 (2008-10-01): Information technology — Multimedia framework (MPEG-21) — Part 9: File Format, Amendment 1: MIME type registration (PDF). Retrieved 2010-08-20.
