The delivery of digital television to the home was launched in earnest in 1995 for both cable television and satellite delivery systems. This new technology enabled multi-channel video program distributors (MVPDs) to deliver far more television programming within the available network bandwidth than had been possible with analog signals of the same programs. A plurality of digital television signals are combined to fit multiple digital channels into the space of one legacy analog channel via a process called “multiplexing.” When television programs were digitally encoded at a fixed bit rate (called Constant Bit Rate or ‘CBR’), the early digital cable TV systems could carry perhaps six to eight digital television programs in the space of a single legacy analog channel (6 MHz for NTSC or 8 MHz for non-NTSC-based systems).
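The six-to-eight-program figure above can be illustrated with a rough calculation. The payload and per-program rates below are assumptions chosen for illustration (a 6 MHz channel carrying 256-QAM modulation is commonly cited as yielding roughly 38.8 Mbps of usable payload, and an early standard-definition CBR program might occupy about 5 Mbps); they are not stated in the text.

```python
# Illustrative arithmetic only; both figures are assumptions, not from the text.
CHANNEL_PAYLOAD_MBPS = 38.8  # assumed usable payload of one 6 MHz 256-QAM channel
CBR_PROGRAM_MBPS = 5.0       # assumed CBR rate of one standard-definition program

# Whole CBR programs that fit in the space of a single legacy analog channel.
programs_per_channel = int(CHANNEL_PAYLOAD_MBPS // CBR_PROGRAM_MBPS)
print(programs_per_channel)  # 7, consistent with the six-to-eight range above
```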
The distribution networks of MVPD systems, whether cable TV or satellite, are known as “managed networks” because the output of a multiplexer is typically of a fixed bit rate. For comparison, the Internet data network is known as an “unmanaged” network, since the public use of the Internet is not regulated by a central controlling mechanism and bandwidth between two points on the network varies unpredictably.
Variable bit rate (VBR) video encoding is more efficient in the use of bandwidth than CBR encoding. VBR also generally delivers a better quality picture for the same average bandwidth. However, VBR is more difficult to manage on a distribution network. Statistical multiplexing is used to address this difficulty.
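The idea behind statistical multiplexing can be sketched as follows. This is a simplified, hypothetical model (proportional throttling against a fixed-rate pipe), not the method of any particular multiplexer: when the aggregate instantaneous demand of the VBR streams exceeds the fixed multiplex capacity, every stream is scaled back proportionally; otherwise each stream receives what it requested.

```python
def stat_mux_allocate(demands_mbps, pipe_mbps):
    """Toy statistical-multiplexing model: fit VBR demands into a fixed pipe.

    If the streams' combined instantaneous demand fits within the multiplex
    capacity, each stream gets its requested rate; otherwise all streams are
    throttled by the same proportional factor so the total exactly fills
    the pipe.
    """
    total = sum(demands_mbps)
    if total <= pipe_mbps:
        return list(demands_mbps)
    scale = pipe_mbps / total
    return [d * scale for d in demands_mbps]

# Quiet moment: three streams demand 20 Mbps in total against a 38.8 Mbps pipe.
alloc = stat_mux_allocate([8.0, 4.0, 8.0], 38.8)      # fits; rates unchanged
# Simultaneous scene changes: demand spikes to 50 Mbps and must be throttled.
burst = stat_mux_allocate([20.0, 15.0, 15.0], 38.8)   # scaled to fill the pipe
```

The proportional policy is what makes the scheme "statistical": it relies on the streams' peaks rarely coinciding, so throttling is infrequent and mild in the average case.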
With the advent of interactive services hosted in a central location, such as a cable TV headend, as well as media originating “in the cloud” and routed over a managed network to a consumer set-top box, managing the VBR sessions within a multiplexer becomes far more challenging and more prone to adverse interactions among the sessions within a multiplex stream.
Interactive television services provide the viewer with the ability to interact with their television for the purposes of selecting certain television programming, requesting more information about the programming, or responding to offers, among many possible uses. Such services have been used, for example, to provide navigable menu and ordering systems that are used to implement electronic program guides and on-demand and pay-per-view program reservations without the need to call a service provider. These services typically employ an application that is executed on a server located remotely from the viewer. Such servers may be, for example, located at a cable television headend. The output of a software application running on such servers is streamed to the viewer, typically in the form of an audio-visual MPEG Transport Stream. This enables the stream to be displayed on (or using) virtually any client device that has MPEG decoding capabilities, including a “smart” television, television set-top box, game console, and various network-connected consumer electronics devices and mobile devices. The client device enables the user to interact with the remote application by capturing keystrokes and passing these to the software application over a network connection.
An interactive television service combines the properties of managed and unmanaged network topologies. Such services require the low-delay, perceptually real-time properties typically associated with Real-time Transport Protocol running over User Datagram Protocol (RTP/UDP) and high-complexity, proprietary clients. However, in interactive television applications the stream must be received by relatively low-complexity clients using consumer electronics-grade components. Typically, these clients do not have the capabilities of the much more powerful laptop and tablet computers to which the user has grown accustomed. Hence, interactive applications hosted on a cable or satellite set-top box are perceived as slow and old-fashioned compared to the contemporary norm. Hosting the application centrally (e.g., on a remote server located at a cable television headend) and providing the picture output to the set-top device mitigates this shortcoming and allows for the delivery of rich, highly interactive applications and services. It also places stronger demands on the distribution network to deliver these services.
A centrally (remotely) hosted interactive television service provides a combination of relatively static image portions representing a Graphical User Interface (graphical UI or GUI) that requires low-latency, artifact-free updates responsive to user input, and other portions that may have video with associated audio that require smooth and uninterrupted play-out. Conventional network distribution systems do not adequately facilitate this combination of data types. For instance, with existing statistical multiplexers for cable or satellite television systems, when large user interface graphics of a particular session need to be sent to a particular client, the many other sessions sharing the same multiplex have no means available (short of a drastic reduction in image quality) to scale back the bandwidth requirements of adjacent streams so that a temporary, large data block representing the UI graphics can pass.
With many interactive sessions active within a single multiplex stream, a possibility exists for disruption to video, audio and/or GUI data. The only alternative available to conventional systems is conservative allocation of bandwidth, which supports far fewer simultaneous sessions per multiplex stream.
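The kind of coordination the conventional systems above lack can be sketched as a simple admission policy. This is a hypothetical illustration, not any system's actual algorithm: the function names, the per-stream quality floor, and all rates are assumptions. The idea is that video sessions sharing the multiplex are temporarily scaled back toward a quality floor, and a transient UI-graphics burst is admitted only if the free headroom plus the reclaimed bandwidth covers it.

```python
def admit_ui_burst(video_rates_mbps, pipe_mbps, burst_mbps, floor_mbps=1.0):
    """Hypothetical policy: free bandwidth for a transient UI-graphics burst.

    Each video session may be scaled down, but never below `floor_mbps`
    (a stand-in for "no drastic reduction in image quality"). Returns the
    temporarily reduced video rates, or None if the burst cannot be
    admitted without breaching the floor.
    """
    headroom = pipe_mbps - sum(video_rates_mbps)
    if headroom >= burst_mbps:
        return list(video_rates_mbps)          # burst fits without throttling
    deficit = burst_mbps - headroom
    reclaimable = [max(r - floor_mbps, 0.0) for r in video_rates_mbps]
    if sum(reclaimable) < deficit:
        return None                            # cannot admit the burst
    frac = deficit / sum(reclaimable)
    return [r - c * frac for r, c in zip(video_rates_mbps, reclaimable)]

# Five 6 Mbps sessions in a 38.8 Mbps pipe; a 12 Mbps UI burst arrives.
scaled = admit_ui_burst([6.0] * 5, 38.8, 12.0)
```

In this sketch each session contributes in proportion to how far it sits above the floor, so the burst's cost is spread across the multiplex instead of being refused or dumped onto one stream.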
Therefore, it is desirable to provide an improved way for multiplexing interactive program streams.
Additional background information is provided in U.S. patent application Ser. Nos. 12/443,571; 13/438,617; and 14/217,108, all of which are incorporated by reference herein in their entirety.