Unreal Media Server

Developer(s): Unreal Streaming Technologies
Initial release: October 2003
Stable release: 14.0 / April 14, 2020
Operating system: Windows
Type: Streaming server software
License: Proprietary
Website: www.umediaserver.net/umediaserver

Unreal Media Server is streaming server software created by Unreal Streaming Technologies.

Streaming protocol support

The proprietary UMS streaming protocol is based on Microsoft DirectShow and is therefore codec-independent. The UMS protocol realizes a distributed DirectShow graph in which the source filter resides on the server computer and the renderer filter resides on the player computer; a corresponding DirectShow decoder must be installed on the player computer or device.

Supported file container formats include MP4, ASF, AVI, MKV, MPEG, WMV, FLV, Ogg, MP3, 3GP, MOV, and other containers.

For live video, Unreal Media Server acts as a universal transmuxer: it receives live streams multiplexed (muxed) in different protocols and formats (WebRTC/RTSP-RTP, MS-WMSP/ASF, MPEG2-TS, UMS), demuxes (extracts) the actual elementary streams from these containers without decoding or transcoding, and muxes (packages) them for delivery to specific players. For example, it can ingest a live RTSP stream from an IP camera and send it to WebRTC players, while at the same time re-muxing it into the RTMP/FLV protocol and format for delivery to Adobe Flash Player, into video/mp4 segments for delivery via the WebSocket protocol to HTML5 MSE players in web browsers, into MPEG2-TS for delivery to a set-top box, and into HLS for delivery to iOS devices. Unreal Media Server is known for low-latency live streaming: with the UMS, WebRTC, WebSocket-video/mp4, RTMP and MPEG2-TS protocols, latencies of 0.2–2 seconds can be achieved when streaming over the Internet; with Apple HLS the latency can be as low as 3 seconds.
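The WebSocket-to-MSE delivery path mentioned above can be illustrated with a short browser-side sketch. The following is not Unreal Media Server's actual player code: it assumes a hypothetical WebSocket endpoint that pushes fragmented MP4 segments (initialization segment first), and the codec string and URL are placeholders.

```typescript
// Minimal MSE-over-WebSocket player sketch (browser).
// Assumes the server pushes fragmented MP4 segments on a hypothetical
// WebSocket endpoint; the URL and codec string below are placeholders.
const video = document.querySelector('video') as HTMLVideoElement;
const MIME = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'; // assumed codecs

const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer(MIME);
  const queue: ArrayBuffer[] = [];

  // Append the next queued segment once the SourceBuffer is idle.
  const appendNext = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener('updateend', appendNext);

  const ws = new WebSocket('wss://example.com/live/channel1'); // placeholder URL
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event: MessageEvent<ArrayBuffer>) => {
    queue.push(event.data); // each message carries one fMP4 segment
    appendNext();
  };
});

video.play().catch(() => { /* autoplay may require a user gesture */ });
```

A production player would also need buffer eviction, reconnection and error handling; this sketch only shows the core append loop behind the low-latency delivery described above.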

History

The first version of Unreal Media Server, released in October 2003, supported only the proprietary UMS protocol. At that time it was the only server capable of streaming AVI files without transcoding; the first version was completely free. [1] Subsequent versions added streaming protocols such as MS-WMSP (MMS) and RTMP. The free version also introduced a limit of 15 concurrent connections, and a commercial version was offered for purchase. [2] Before version 9.0 the server accepted live streams only from a proprietary encoder named Unreal Live Server. Version 9.0 introduced ingest of RTSP, MPEG2-TS and MMS live streams to support industry-standard live encoders such as IP network cameras and Windows Media Encoder; version 10.0 added support for Flash encoders such as FMLE. Version 10.5 added support for adaptive bitrate streaming and reduced the free version's limit to 10 concurrent connections. Version 11.0 added time-shifted playback for live broadcasts, up to 12 hours back from real time. Version 11.5 added a "live playlist" feature allowing server-side channel switching and ad insertion. Version 12.0 added streaming via WebSockets to HTML5 <video> Media Source Extensions. Version 13.0 added full WebRTC support: ingesting live WebRTC streams from web browsers and sending live WebRTC streams to web browsers. [3] Version 14.0 added streaming of VOD files to the HTML5 video element via HTTP byte-range requests.

Related Research Articles

The Real Time Streaming Protocol (RTSP) is an application-level network protocol designed for multiplexing and packetizing multimedia transport streams over a suitable transport protocol. RTSP is used in entertainment and communications systems to control streaming media servers. The protocol is used for establishing and controlling media sessions between endpoints. Clients of media servers issue commands such as play, record and pause, to facilitate real-time control of the media streaming from the server to a client or from a client to the server.
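Because RTSP requests are plain text and HTTP-like, the control channel is easy to illustrate. Below is a hedged Node.js/TypeScript sketch that opens a TCP connection to a hypothetical camera address and sends an OPTIONS request; the host is a placeholder, and a real client would continue with DESCRIBE, SETUP and PLAY.

```typescript
// Hedged sketch of a raw RTSP request (Node.js). The camera address is a
// placeholder; a real client would follow up with DESCRIBE, SETUP and PLAY.
import * as net from 'net';

const HOST = '192.168.1.10'; // hypothetical IP camera
const PORT = 554;            // default RTSP port

const socket = net.createConnection(PORT, HOST, () => {
  // RTSP requests are text-based and HTTP-like; CSeq numbers each request.
  socket.write(
    `OPTIONS rtsp://${HOST}:${PORT}/stream RTSP/1.0\r\n` +
    'CSeq: 1\r\n' +
    '\r\n'
  );
});

socket.on('data', chunk => {
  console.log(chunk.toString()); // e.g. "RTSP/1.0 200 OK" plus allowed methods
  socket.end();
});
```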

RealAudio, also spelled Real Audio, is a proprietary audio format developed by RealNetworks and first released in April 1995. It uses a variety of audio codecs, ranging from low-bitrate formats that can be used over dialup modems to high-fidelity formats for music. It can also be used as a streaming audio format that is played at the same time as it is downloaded. In the past, many internet radio stations used RealAudio to stream their programming over the internet in real time. In recent years, however, the format has become less common and has given way to more popular audio formats. RealAudio was heavily used by the BBC websites until 2009, when it was discontinued due to its declining use. BBC World Service, the last of the BBC websites to use RealAudio, discontinued its use in March 2011.

Helix DNA is a project to produce computer software that can play audio and video media in various formats and aid in producing such media. It is intended as a largely free and open-source digital media framework that runs on numerous operating systems and processors and was started by RealNetworks which contributed much of the code. The Helix Community is an open collaborative effort to develop and extend the Helix DNA platform.

VLC media player is a free and open-source, portable, cross-platform media player software and streaming media server developed by the VideoLAN project. VLC is available for desktop operating systems and mobile platforms, such as Android, iOS and iPadOS. VLC is also available on digital distribution platforms such as Apple's App Store, Google Play, and Microsoft Store.

Microsoft Media Server (MMS), a Microsoft proprietary network-streaming protocol, serves to transfer unicast data in Windows Media Services. MMS can be transported via UDP or TCP. The MMS default port is UDP/TCP 1755.

Flash Video is a container file format used to deliver digital video content over the Internet using Adobe Flash Player version 6 and newer. Flash Video content may also be embedded within SWF files. There are two different Flash Video file formats: FLV and F4V. The audio and video data within FLV files are encoded in the same way as SWF files. The F4V file format is based on the ISO base media file format, starting with Flash Player 9 update 3. Both formats are supported in Adobe Flash Player and developed by Adobe Systems. FLV was originally developed by Macromedia. In the early 2000s, Flash Video was the de facto standard for web-based streaming video. Users include Hulu, VEVO, Yahoo! Video, metacafe, Reuters.com, and many other news providers.

Real-Time Messaging Protocol (RTMP) is a communication protocol for streaming audio, video, and data over the Internet. Originally developed as a proprietary protocol by Macromedia for streaming between Flash Player and the Flash Communication Server, Adobe has released an incomplete version of the specification of the protocol for public use.

HTTP Live Streaming is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. and released in 2009. Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers. As of 2019, an annual video industry survey has consistently found it to be the most popular streaming format.

The Helix Universal Media Server was a product developed by RealNetworks and originates from the first streaming media server, originally developed by Progressive Networks in 1994. It supported a variety of streaming media delivery transports, including MPEG-DASH, RTMP (Flash), RTSP (standard), HTTP Live Streaming (HLS), Microsoft Silverlight and HTTP progressive download, enabling media delivery to mobile phone OS and PC OS clients.

Adaptive bitrate streaming is a technique used in streaming multimedia over computer networks. While in the past most video or audio streaming technologies utilized streaming protocols such as RTP with RTSP, today's adaptive streaming technologies are almost exclusively based on HTTP and designed to work efficiently over large distributed HTTP networks such as the Internet. It works by detecting a user's bandwidth and CPU capacity in real time and adjusting the quality of the media stream accordingly. It requires the use of an encoder that encodes a single source media at multiple bit rates. The player client switches between streaming the different encodings depending on available resources. "The result: very little buffering, fast start time and a good experience for both high-end and low-end connections."
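The selection step described above can be sketched in a few lines: given renditions of the same content at several bit rates and a throughput estimate, the player picks the highest bit rate that fits, with some headroom. The ladder, URLs and safety factor below are illustrative assumptions, not any particular player's values.

```typescript
// Illustrative adaptive-bitrate selection step (not a specific player's algorithm).
interface Rendition {
  bitrateKbps: number; // encoded bit rate of this rendition
  url: string;         // where its segments live (placeholder)
}

// Hypothetical ladder of encodings of the same source, sorted ascending.
const ladder: Rendition[] = [
  { bitrateKbps: 400,  url: 'https://example.com/low/'  },
  { bitrateKbps: 1200, url: 'https://example.com/mid/'  },
  { bitrateKbps: 3500, url: 'https://example.com/high/' },
];

// Pick the highest-bit-rate rendition that fits the measured throughput,
// leaving headroom (safety factor) so playback does not stall.
function selectRendition(measuredKbps: number, safety = 0.8): Rendition {
  const affordable = ladder.filter(r => r.bitrateKbps <= measuredKbps * safety);
  return affordable.length > 0
    ? affordable[affordable.length - 1] // highest affordable bit rate
    : ladder[0];                        // fall back to the lowest rendition
}

console.log(selectRendition(2000).bitrateKbps); // -> 1200
```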

Sirannon is a free, open-source media server and client. The goal is to aid in video research and experimental streaming. Sirannon allows the programmer to create a wide variety of media-handling components such as streaming, reading, writing, and packetizing. By organizing these components in a workflow, the programmer can create many applications such as a media server, media proxy or video tool. Sirannon was introduced at the ACM Multimedia conference in October 2009 under its former name xStreamer.

Wowza Streaming Engine is a unified streaming media server software developed by Wowza. The server is used for streaming of live and on-demand video, audio, and rich Internet applications over IP networks to desktop, laptop, and tablet computers, mobile devices, IPTV set-top boxes, internet-connected TV sets, game consoles, and other network-connected devices. The server is a Java application deployable on most operating systems.

WebRTC is a free and open-source project providing web browsers and mobile applications with real-time communication (RTC) via application programming interfaces (APIs). It allows audio and video communication to work inside web pages by allowing direct peer-to-peer communication, eliminating the need to install plugins or download native apps. Supported by Apple, Google, Microsoft, Mozilla, and Opera, WebRTC specifications have been published by the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF).
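As a minimal illustration of the browser APIs mentioned above, the sketch below captures the local camera and microphone and prepares a WebRTC offer. The signaling step (delivering the offer to the remote peer, for example a media server) is deployment-specific and omitted; the STUN server entry is just a commonly used public example.

```typescript
// Minimal WebRTC publisher-side sketch (browser). Signaling is omitted;
// how the SDP offer/answer reaches the remote peer is deployment-specific.
async function startWebRtcPublish(): Promise<RTCSessionDescriptionInit> {
  // Capture the local camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }], // public STUN server
  });

  // Attach every captured track to the peer connection.
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Create and apply the local SDP offer; it must then be sent to the remote
  // peer over whatever signaling channel is in use.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return offer;
}

startWebRtcPublish().then(offer => console.log('SDP offer ready:', offer.type));
```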

Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, is an adaptive bitrate streaming technique that enables high quality streaming of media content over the Internet delivered from conventional HTTP web servers. Similar to Apple's HTTP Live Streaming (HLS) solution, MPEG-DASH works by breaking the content into a sequence of small segments, which are served over HTTP. An early HTTP web server based streaming system called SProxy was developed and deployed in the Hewlett Packard Laboratories in 2006. It showed how to use HTTP range requests to break the content into small segments. SProxy demonstrated the effectiveness of segment-based streaming: because it runs over plain HTTP it passes through widely deployed firewalls, and it avoids transmitting unnecessary data when a user terminates the streaming session before reaching the end. Each segment contains a short interval of playback time of content that is potentially many hours in duration, such as a movie or the live broadcast of a sport event. The content is made available at a variety of different bit rates, i.e., alternative segments encoded at different bit rates covering aligned short intervals of playback time. While the content is being played back by an MPEG-DASH client, the client uses a bit rate adaptation (ABR) algorithm to automatically select the segment with the highest bit rate possible that can be downloaded in time for playback without causing stalls or re-buffering events in the playback. The current MPEG-DASH reference client, dash.js, offers both buffer-based (BOLA) and hybrid (DYNAMIC) bit rate adaptation algorithms. Thus, an MPEG-DASH client can seamlessly adapt to changing network conditions and provide high quality playback with few stalls or re-buffering events.
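Since the paragraph names the dash.js reference client, a minimal usage sketch follows; the manifest URL is a placeholder, and the API calls should be checked against the installed dash.js version.

```typescript
// Minimal dash.js playback sketch; the manifest URL is a placeholder and the
// API should be verified against the dash.js version actually in use.
import * as dashjs from 'dashjs';

const video = document.querySelector('video') as HTMLVideoElement;
const manifestUrl = 'https://example.com/movie/manifest.mpd'; // placeholder

// Create the reference player and start playback; dash.js downloads segments
// and runs its bit rate adaptation (e.g. BOLA or DYNAMIC) internally.
const player = dashjs.MediaPlayer().create();
player.initialize(video, manifestUrl, /* autoPlay */ true);
```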

Media Source Extensions (MSE) is a W3C specification that allows JavaScript to send byte streams to media codecs within Web browsers that support HTML5 video and audio. Among other possible uses, this allows the implementation of client-side prefetching and buffering code for streaming media entirely in JavaScript. It is compatible with, but should not be confused with, the Encrypted Media Extensions (EME) specification, and neither requires the use of the other, although many EME implementations are only capable of decrypting media data provided via MSE.

Nimble Streamer is a freeware media server developed by WMSPanel, a company currently known as Softvelum, LLC. The server is used for streaming of live and on-demand video and audio to desktop computers, mobile devices, internet-connected TV sets, IPTV set-top boxes and other network-connected devices. Its first stable version, 1.0.0-1, was released on October 21, 2013, after a number of preliminary versions. The release cycle is frequent, with a new version introduced every week or less. Nimble Streamer was a finalist in the Streaming Media European Readers' Choice Awards in 2016 for Best Streaming Innovation and in 2021 for Hardware/Software Server.

Astra is professional software for organizing digital broadcasting services for TV operators and broadcasters, internet service providers, hotels, etc. Astra is an acronym for "advanced streaming application".

Web Call Server is unified media server software developed by Flashphoner. It is a server-side platform, implemented in Java, dedicated to streaming video over a wide range of communication protocols.

References