Universal multimedia access

Universal Multimedia Access (UMA) addresses the delivery of multimedia resources under different and varying network conditions, diverse terminal capabilities, specific user or creator preferences and needs, and varying usage environment conditions. [1] It aims to guarantee unrestricted access to multimedia content from any device, over any network, independently of the original content format, while efficiently satisfying user preferences and usage environment conditions.

Requirements

Fulfilling UMA requires that content be remotely searchable and accessible, accompanied by useful descriptions of the content and its context, and that mediation/delivery systems be able to use this information to serve users regardless of location, content format, and type of terminal or network connection, while respecting user preferences, environmental conditions, and ownership and usage rights. [2]

Approach

One feasible approach to implementing UMA is to develop context-aware systems that use content and context descriptions to decide whether content must be adapted before delivery to the end user. The use of open ontologies and standards to structure, represent and convey those descriptions, as well as to specify the kinds of adaptation operations, is vital for the success of UMA. This is especially true in loosely coupled environments such as the Internet, where heterogeneous end-user devices, varied content formats, repositories and networking technologies co-exist. Standards from the W3C, such as OWL (Web Ontology Language) and CC/PP (Composite Capability/Preference Profiles), and from ISO/IEC, such as MPEG-7 and especially MPEG-21, are well suited for implementing UMA-enabled systems. [3]
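The adaptation decision such a system makes can be sketched in a few lines. The following is a minimal, illustrative example: the device profile and variant fields are invented for the sketch (loosely inspired by CC/PP-style capability descriptions), not taken from any standard schema.

```python
# Hypothetical context-aware adaptation decision: given descriptions of the
# available content variants and of the target device, pick the best variant
# the device can handle, or signal that adaptation (e.g. transcoding) is needed.

def choose_variant(variants, device):
    """Return the highest-bitrate variant compatible with the device,
    or None if no variant fits and the content must be adapted."""
    usable = [
        v for v in variants
        if v["width"] <= device["screen_width"]
        and v["bitrate_kbps"] <= device["max_bitrate_kbps"]
        and v["format"] in device["formats"]
    ]
    if not usable:
        return None  # no stored variant fits: adapt before delivery
    return max(usable, key=lambda v: v["bitrate_kbps"])

variants = [
    {"width": 1920, "bitrate_kbps": 8000, "format": "h264"},
    {"width": 1280, "bitrate_kbps": 3000, "format": "h264"},
    {"width": 640,  "bitrate_kbps": 800,  "format": "h264"},
]
phone = {"screen_width": 1280, "max_bitrate_kbps": 4000, "formats": {"h264"}}
print(choose_variant(variants, phone))  # the 1280-wide, 3000 kbps variant
```

A real UMA mediator would draw these descriptions from standardized metadata (e.g. MPEG-21 usage environment descriptions) rather than hand-built dictionaries, but the decision logic follows the same pattern.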

Related Research Articles

Moving Picture Experts Group – Alliance of working groups to set standards for multimedia coding

The Moving Picture Experts Group (MPEG) is an alliance of working groups established jointly by ISO and IEC that sets standards for media coding, including compression coding of audio, video, graphics, and genomic data; and transmission and file formats for various applications. Together with JPEG, MPEG is organized under ISO/IEC JTC 1/SC 29 – Coding of audio, picture, multimedia and hypermedia information.

MPEG-4 is a group of international standards for the compression of digital audio and visual data, multimedia systems, and file storage formats. It was originally introduced in late 1998 as a group of audio and video coding formats and related technology agreed upon by the ISO/IEC Moving Picture Experts Group (MPEG) under the formal standard ISO/IEC 14496 – Coding of audio-visual objects. Uses of MPEG-4 include compression of audiovisual data for Internet video and CD distribution, voice and broadcast television applications. The MPEG-4 standard was developed by a group led by Touradj Ebrahimi and Fernando Pereira.

Multimedia is a form of communication that combines different content forms, such as writing, audio, images, animations, or video, into a single interactive presentation, in contrast to traditional mass media, such as printed material or audio recordings, which feature little to no interaction between users. Popular examples of multimedia include video podcasts, audio slideshows, and animated videos. Multimedia also encompasses the principles and application of effective interactive communication, such as the building blocks of software, hardware, and other technologies.

MPEG-7 is a multimedia content description standard. It was standardized in ISO/IEC 15938. The description is associated with the content itself, to allow fast and efficient searching for material that is of interest to the user. MPEG-7 is formally called Multimedia Content Description Interface. Thus, it is not a standard which deals with the actual encoding of moving pictures and audio, like MPEG-1, MPEG-2 and MPEG-4. It uses XML to store metadata, and can be attached to timecode in order to tag particular events, or synchronise lyrics to a song, for example.
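The idea of searching the XML description instead of the media itself can be illustrated with a small sketch. The element and attribute names below are invented for the example; real MPEG-7 documents follow the standardized ISO/IEC 15938 schema.

```python
import xml.etree.ElementTree as ET

# Simplified, MPEG-7-inspired content description: timecoded segments
# tagged with labels (element names are illustrative, not schema-valid).
doc = """
<ContentDescription>
  <Title>Concert highlights</Title>
  <Segment start="00:00:05" end="00:00:12" label="guitar solo"/>
  <Segment start="00:01:30" end="00:01:45" label="chorus"/>
</ContentDescription>
"""

root = ET.fromstring(doc)
# Query the metadata, not the media: find where the chorus starts.
hits = [s.attrib["start"] for s in root.iter("Segment")
        if "chorus" in s.attrib["label"]]
print(hits)  # ['00:01:30']
```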

The MPEG-21 standard, from the Moving Picture Experts Group, aims at defining an open framework for multimedia applications. MPEG-21 is ratified in the standards ISO/IEC 21000 - Multimedia framework (MPEG-21).

Windows Media Video (WMV) is a series of video codecs and their corresponding video coding formats developed by Microsoft. It is part of the Windows Media framework. WMV consists of three distinct codecs: the original video compression technology, known simply as WMV, was originally designed for Internet streaming applications as a competitor to RealVideo, while the other two, WMV Screen and WMV Image, cater to specialized content. After standardization by the Society of Motion Picture and Television Engineers (SMPTE), WMV version 9 was adapted for physical-delivery formats such as HD DVD and Blu-ray Disc and became known as VC-1. Microsoft also developed a digital container format called Advanced Systems Format to store video encoded by Windows Media Video.

Digital storage media command and control (DSM-CC) is a toolkit for developing control channels associated with MPEG-1 and MPEG-2 streams. It is defined in part 6 of the MPEG-2 standard and uses a client/server model connected via an underlying network.

A multimedia framework is a software framework that handles media on a computer and through a network. A good multimedia framework offers an intuitive API and a modular architecture to easily add support for new audio, video and container formats and transmission protocols. It is meant to be used by applications such as media players and audio or video editors, but can also be used to build videoconferencing applications, media converters and other multimedia tools. Data is passed between modules automatically, so the application does not need to shuttle buffers between connected modules one by one.

The UAProf specification is concerned with capturing capability and preference information for wireless devices. This information can be used by content providers to produce content in an appropriate format for the specific device.

A service delivery platform (SDP) is a set of components that provides a service delivery architecture for a type of service delivered to a consumer, whether a customer or another system. Although it is commonly used in the context of telecommunications, it can apply to any system that provides a service. Although the TM Forum (TMF) is working on defining specifications in this area, there is no standard definition of SDP in industry, and different players define its components, breadth, and depth in slightly different ways.

Digital Item is the basic unit of transaction in the MPEG-21 framework. It is a structured digital object, including a standard representation, identification and metadata.

Universal usability refers to the design of information and communications products and services that are usable for every citizen. The concept has been advocated by Professor Ben Shneiderman, a computer scientist at the Human-Computer Interaction Lab at the University of Maryland, College Park. He also provided a more practical definition of universal usability – "having more than 90% of all households as successful users of information and communications services at least once a week." The concept of universal usability is closely related to the concepts of universal design and design for all. These three concepts altogether cover, from the user's end to the developer's end, the three important research areas of information and communications technology (ICT): use, access, and design.

AXMEDIS is a set of European Union digital content standards, initially created as a research project running from 2004 to 2008 partially supported by the European Commission under the Information Society Technologies programme of the Sixth Framework Programme (FP6). It stands for "Automating Production of Cross Media Content for Multi-channel Distribution". Now it is distributed as a framework, and is still being maintained and improved. A large part of the framework is under open source licensing. The AXMEDIS framework includes a set of tools, models, test cases, documents, etc. supporting the production and distribution of cross media content.

The term “adaptation” in computer science refers to a process where an interactive system adapts its behaviour to individual users based on information acquired about its user(s) and its environment. Adaptation is one of the three pillars of empiricism in Scrum.

HTTP Live Streaming is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. and released in 2009. Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers. As of 2022, an annual video industry survey has consistently found it to be the most popular streaming format.

Spatial contextual awareness combines contextual information such as an individual's or sensor's location, activity, the time of day, and proximity to other people or objects and devices. It is also defined as the relationship between and synthesis of information garnered from the spatial environment, a cognitive agent, and a cartographic map. The spatial environment is the physical space in which the orientation or wayfinding task is to be conducted; the cognitive agent is the person or entity charged with completing a task; and the map is the representation of the environment which is used as a tool to complete the task.

Adaptive bitrate streaming is a technique used in streaming multimedia over computer networks.

Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, is an adaptive bitrate streaming technique that enables high-quality streaming of media content over the Internet delivered from conventional HTTP web servers. Similar to Apple's HTTP Live Streaming (HLS) solution, MPEG-DASH works by breaking the content into a sequence of small segments, which are served over HTTP. An early HTTP web server based streaming system called SProxy was developed and deployed in the Hewlett Packard Laboratories in 2006. It showed how to use HTTP range requests to break the content into small segments. SProxy demonstrated the effectiveness of segment-based streaming: because it runs over plain HTTP it traverses widely deployed firewalls, and it avoids transmitting unneeded data when a user terminates a streaming session before reaching the end. Each segment contains a short interval of playback time of content that is potentially many hours in duration, such as a movie or the live broadcast of a sport event. The content is made available at a variety of different bit rates, i.e., alternative segments encoded at different bit rates covering aligned short intervals of playback time. While the content is being played back by an MPEG-DASH client, the client uses a bit rate adaptation (ABR) algorithm to automatically select the segment with the highest bit rate possible that can be downloaded in time for playback without causing stalls or re-buffering events. The current MPEG-DASH reference client dash.js offers both buffer-based (BOLA) and hybrid (DYNAMIC) bit rate adaptation algorithms. Thus, an MPEG-DASH client can seamlessly adapt to changing network conditions and provide high-quality playback with few stalls or re-buffering events.
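The selection step an ABR algorithm performs can be sketched with a toy throughput-based rule. This is only illustrative; the real dash.js algorithms (BOLA and DYNAMIC) also account for buffer occupancy and are considerably more elaborate, and the 0.8 safety margin here is an arbitrary choice for the sketch.

```python
# Toy throughput-based rate adaptation: pick the highest rung of the bitrate
# ladder that fits within a safety margin of the measured download throughput.

def pick_bitrate(ladder_kbps, measured_throughput_kbps, safety=0.8):
    """Return the chosen bitrate (kbps); fall back to the lowest rung if
    even that exceeds the throughput budget."""
    budget = measured_throughput_kbps * safety
    fitting = [r for r in sorted(ladder_kbps) if r <= budget]
    return fitting[-1] if fitting else min(ladder_kbps)

ladder = [400, 1200, 2500, 5000]
print(pick_bitrate(ladder, 4000))  # budget 3200 kbps -> selects 2500
print(pick_bitrate(ladder, 100))   # nothing fits -> falls back to 400
```

Running this rule once per segment download is what lets a client ramp quality up and down as network conditions change.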

MPEG media transport (MMT), specified as ISO/IEC 23008-1, is a digital container standard developed by Moving Picture Experts Group (MPEG) that supports High Efficiency Video Coding (HEVC) video. MMT was designed to transfer data using the all-Internet Protocol (All-IP) network.

References

  1. Perkis, Andrew; Abdeljaoued, Yousri; Christopoulos, Charilaos; Ebrahimi, Touradj; Chicharo, Joe F. (May 2001). "Universal multimedia access from wired and wireless systems" (PDF). Circuits, Systems and Signal Processing. 20 (3–4): 387–402. doi:10.1007/BF01201409. S2CID 17273790.
  2. Reiterer, Bernhard; Concolato, Cyril; Lachner, Janine; Le Feuvre, Jean; Moissinac, Jean-Claude; Lenzi, Stefano; Chessa, Stefano; Fernández Ferrá, Enrique; González Menaya, Juan José; Hellwagner, Hermann (10 June 2008). "User-centric universal multimedia access in home networks". The Visual Computer. 24 (7–9): 837–845. doi:10.1007/s00371-008-0265-5. S2CID 33049139.
  3. Kasutani, E.; Ebrahimi, T. (1 January 2004). "New Frontiers in Universal Multimedia Access". École polytechnique fédérale de Lausanne. Retrieved 24 February 2017.