Hypertext Transfer Protocol (HTTP) streaming is a form of multimedia delivery of internet video and audio content (referred to as multimedia content, media content, media services, or the like). In HTTP streaming, a multimedia file can be partitioned into one or more segments and delivered to a client over HTTP. HTTP-based multimedia content delivery (streaming) provides reliable and simple content delivery owing to the broad prior adoption of both HTTP and its underlying protocols, Transmission Control Protocol/Internet Protocol (TCP/IP). Moreover, HTTP-based delivery simplifies streaming services by avoiding network address translation (NAT) and firewall traversal issues. HTTP-based streaming also allows standard HTTP servers and caches to be used instead of specialized streaming servers, which are more difficult to scale because of the additional state information maintained on those servers. Examples of HTTP streaming technologies include Microsoft Internet Information Services (IIS) Smooth Streaming, Apple HTTP Live Streaming, and Adobe HTTP Dynamic Streaming.
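The partitioning described above can be sketched as follows. This is an illustrative sketch only; the segment size, file naming, and URL pattern are hypothetical and do not reflect any particular streaming technology:

```python
# Illustrative sketch of HTTP streaming's segmented delivery model:
# a media byte stream is partitioned into fixed-size segments, each
# addressable by its own URL and fetched with a plain HTTP GET.
# Segment size and URL pattern below are hypothetical.

def partition(media: bytes, segment_size: int) -> list[bytes]:
    """Split a media byte stream into consecutive segments."""
    return [media[i:i + segment_size]
            for i in range(0, len(media), segment_size)]

def segment_urls(base_url: str, count: int) -> list[str]:
    """Derive one URL per segment for the client to request."""
    return [f"{base_url}/segment_{n}.m4s" for n in range(count)]

media = bytes(10_000)               # stand-in for encoded audio/video data
segments = partition(media, 4096)   # 3 segments: 4096 + 4096 + 1808 bytes
urls = segment_urls("http://cdn.example.com/video", len(segments))
```

Because each segment is an ordinary HTTP resource, any standard web server or cache can serve it without per-client streaming state.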
Adaptive video streaming involves continuously optimizing video configurations, such as bit rate, resolution, and frame rate, based on changing link conditions, device capabilities, and content characteristics. Adaptive streaming improves the viewing experience for the end user with respect to performance goals such as high video quality, low startup delay, and interruption-free playback. Traditionally, adaptive video streaming used the Real-Time Streaming Protocol (RTSP). In RTSP, a client connects to a streaming server, which tracks the client's state until the client disconnects. Tracking the client's state entails frequent communication between the client and the server, including session provisioning and negotiation of media parameters. Once the client and the server establish a session, the server sends the media as a continuous stream of packets over either User Datagram Protocol (UDP) or TCP transport. Example technologies for RTSP-based adaptive streaming include Microsoft Windows Media™, Apple QuickTime™, Adobe Flash™, and Helix™ by RealNetworks, among others.
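One common way a client adapts the bit rate is to compare recent throughput measurements against the available encodings. The sketch below is a minimal, hypothetical rate-selection heuristic (the encoding ladder and safety margin are assumptions, not the algorithm of any particular product):

```python
# Minimal sketch of throughput-driven bit rate adaptation: choose the
# highest available bit rate that fits under the recently measured
# throughput, leaving a safety margin for link variation.
# The encoding ladder and margin below are hypothetical.

LADDER_KBPS = [250, 500, 1000, 2500, 5000]  # assumed available encodings

def select_bitrate(measured_kbps: float, safety: float = 0.8) -> int:
    """Return the highest bit rate within the throughput budget."""
    budget = measured_kbps * safety          # headroom for fluctuation
    fitting = [r for r in LADDER_KBPS if r <= budget]
    return max(fitting) if fitting else LADDER_KBPS[0]
```

A client would rerun this selection for each segment request, which is what lets playback adapt as link conditions change.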
Dynamic adaptive streaming over HTTP (DASH) is an adaptive HTTP streaming technology standardized in Third Generation Partnership Project (3GPP) TS 26.247 and Moving Picture Experts Group (MPEG) ISO/IEC DIS 23009-1; DASH technology has also been adopted by other standards organizations, including the Open Internet Protocol Television (IPTV) Forum (OIPF) and Hybrid Broadcast Broadband TV (HbbTV), among others. DASH differs from RTSP-based adaptive streaming in that it operates over the stateless HTTP protocol.
DASH specifies formats for a media presentation description (MPD) metadata file. The MPD file provides information on the structure and the different versions of the media content representations stored on the server. The MPD file also specifies the segment formats, i.e., information concerning the initialization and media segments that a media player needs in order to map segments into the media presentation timeline for switching and for synchronous presentation with other representations. For example, the media player inspects the initialization segments identified in the MPD file to determine the container format and media timing information.
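To make the MPD structure concrete, the sketch below parses a hand-written, heavily simplified MPD fragment and lists its representations. The fragment is illustrative only; a real MPD conforming to ISO/IEC 23009-1 carries many more elements and attributes (periods, adaptation sets, segment templates, timing data):

```python
import xml.etree.ElementTree as ET

# Hand-written, heavily simplified MPD fragment for illustration only.
MPD_XML = """\
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" mediaPresentationDuration="PT60S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="low"  bandwidth="500000"  width="640"  height="360"/>
      <Representation id="high" bandwidth="3000000" width="1920" height="1080"/>
    </AdaptationSet>
  </Period>
</MPD>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def list_representations(mpd_text: str) -> list[dict]:
    """Return id/bandwidth pairs for every Representation in the MPD."""
    root = ET.fromstring(mpd_text)
    return [{"id": r.get("id"), "bandwidth": int(r.get("bandwidth"))}
            for r in root.iterfind(".//dash:Representation", NS)]

reps = list_representations(MPD_XML)
```

A DASH client would use exactly this kind of information, here the per-representation bandwidth, to decide which version of the content to request next.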
Wireless communication technology, such as Worldwide Interoperability for Microwave Access (WiMAX) or Long Term Evolution (LTE), has evolved to deliver rich multimedia and video services in addition to the traditional voice and data services. Typical wireless multimedia communications involve the transmission of a continuous source over a noisy channel. Common examples are speech communications, mobile TV, mobile video, and broadcast streaming. In such communications, the multimedia source is encoded and compressed into a finite stream of bits, and the bit stream is then communicated over the noisy channel. Source coding is carried out to convert the continuous source into a finite stream of bits. Channel coding is performed to mitigate the errors in the bit stream introduced by the noisy channel. Source and channel coding introduce quality degradation during playback of the media that is generally attributable to such factors as high distortion levels, limited bandwidth, excessive delay, power constraints, and computational complexity limitations. Nevertheless, it may be important to transmit the source over time-varying wireless channels while satisfying certain end-to-end quality of service (QoS) or quality of experience (QoE) constraints, including average distortion and multimedia quality requirements, such as in real-time mobile video streaming.