HTTP live streaming (HLS) is now widespread in industry. A set-top box can act as an HLS server, create variant bit-rate streams, and allow switching between playlist files based on available network bandwidth. A set-top gateway box with an HLS server is especially useful for streaming video to multiple mobile client devices, either in the home or out of the home. Out-of-home streaming also adds network delay, which delays display of the first video frame at the end device. For a better user experience, the delay in streaming startup should be reduced. One factor to consider is reducing the pre-buffer time at the HLS server while streaming video content.
In HTTP live streaming, for each item of content, the server maintains different playlists at different bit rates (qualities), each indicating different files. The client downloads these playlist files, determines the available network bandwidth, selects files from the appropriate playlist, and plays them one by one (a single item of content is split into several file chunks, which are specified within a playlist). The client periodically monitors bandwidth, downloads/updates the playlist files, and plays accordingly.
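The variant-selection step described above can be sketched as follows. This is a minimal, illustrative example; the playlist text, URIs, and selection policy are assumptions for illustration, not part of the disclosed system.

```python
# Minimal sketch: parse an HLS master playlist (m3u8) and pick the
# highest-bitrate variant that fits the measured bandwidth.

def parse_master_playlist(text):
    """Return (bandwidth_bps, uri) pairs from #EXT-X-STREAM-INF tags."""
    variants = []
    lines = text.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            attrs = line.split(":", 1)[1]
            bw = None
            for attr in attrs.split(","):
                key, _, val = attr.partition("=")
                if key.strip() == "BANDWIDTH":
                    bw = int(val)
            if bw is not None and i + 1 < len(lines):
                variants.append((bw, lines[i + 1].strip()))
    return variants

def select_variant(variants, available_bps):
    """Highest-bandwidth variant not exceeding the available bandwidth,
    falling back to the lowest variant if none fits."""
    fitting = [v for v in variants if v[0] <= available_bps]
    return max(fitting) if fitting else min(variants)

master = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=6750000,RESOLUTION=1920x1080
profile1/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
profile6/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=640x360
profile9/index.m3u8
"""

variants = parse_master_playlist(master)
print(select_variant(variants, 3_000_000))  # picks the 2.5 Mbps variant
```

In a real client this loop runs repeatedly as the bandwidth estimate changes, which is the periodic monitor/re-select behavior described above.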
Systems currently in the art implement server-side controlled adaptive streaming in a chosen VMS platform. In embodiments, the VMS server monitors the network bandwidth and the network buffer read and write status of the client, and adjusts the transcoder configuration to output at the desired rate. This requires only one VMS transcoder resource for any playback request from client boxes.
However, a problem with server-controlled adaptive streaming techniques is that, while switching between different bitrate profiles, the pre-buffer time delays adaptation to the desired bitrate profile. This can be avoided by using an idle transcoder unit as a standby while switching to a different bitrate. This includes a dynamic change/update of the playlist file only while switching to the desired bitrate profile. Different playlist files, one per bitrate (quality), are stored on the server, and the set-top client fetches files from the playlist. Table 1 shows different profiles for content, each with a different playlist associated with it.
TABLE 1
Content Profiles

Profile #  Stream  Container  Codec  Type      Resolution   FPS    Bit Rate (Mbps)
1          SPTS    MPEG2-TS   AVC    High 4.1  1920 × 1080  29.97  6.750
2          SPTS    MPEG2-TS   AVC    High 4.1  1920 × 1080  29.97  4.500
3          SPTS    MPEG2-TS   AVC    High 4.1  1920 × 1080  29.97  3.000
4          SPTS    MPEG2-TS   AVC    High 4.1  1280 × 720   29.97  4.125
5          SPTS    MPEG2-TS   AVC    High 4.1  1280 × 720   29.97  2.750
6          SPTS    MPEG2-TS   AVC    Main 3.0  1280 × 720   29.97  2.500
7          SPTS    MPEG2-TS   AVC    Main 3.0  854 × 480    29.97  1.500
8          SPTS    MPEG2-TS   AVC    Main 3.0  854 × 480    29.97  1.000
9          SPTS    MPEG2-TS   AVC    Main 3.0  640 × 360    29.97  0.600
10         SPTS    MPEG2-TS   AVC    Main 3.0  416 × 240    29.97  0.250
11         SPTS    MPEG2-TS   AVC    Main 3.0  640 × 480    29.97  1.250
12         SPTS    MPEG2-TS   AVC    Main 3.0  640 × 480    29.97  0.900
13         SPTS    MPEG2-TS   AVC    Main 3.0  480 × 360    29.97  0.500
14         SPTS    MPEG2-TS   AVC    Main 3.0  320 × 240    29.97  0.250
15         SPTS    MPEG2-TS   AVC    Base 3.0  848 × 480    29.97  0.700
16         SPTS    MPEG2-TS   AVC    Base 3.0  640 × 480    29.97  1.200
17         SPTS    MPEG2-TS   AVC    Base 3.0  640 × 480    29.97  0.900
18         SPTS    MPEG2-TS   AVC    Base 3.0  480 × 320    29.97  0.900
19         SPTS    MPEG2-TS   AVC    Base 3.0  480 × 320    29.97  0.600
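A profile table like Table 1 can be held server-side as simple structured data, with a helper that picks the best profile a given client can decode within its available bandwidth. The field names and the selection policy below are assumptions for illustration; only a few representative rows of Table 1 are encoded.

```python
# Illustrative encoding of a few rows from Table 1, plus a helper that
# picks the highest-bitrate profile the client supports that fits the
# available bandwidth. The selection policy is an assumption.

from collections import namedtuple

Profile = namedtuple("Profile", "num codec_profile resolution fps mbps")

PROFILES = [
    Profile(1, "High 4.1", "1920x1080", 29.97, 6.750),
    Profile(6, "Main 3.0", "1280x720", 29.97, 2.500),
    Profile(9, "Main 3.0", "640x360", 29.97, 0.600),
    Profile(15, "Base 3.0", "848x480", 29.97, 0.700),
]

def pick_profile(supported, available_mbps):
    """Highest-bitrate profile in `supported` that fits the bandwidth."""
    candidates = [p for p in PROFILES
                  if p.codec_profile in supported and p.mbps <= available_mbps]
    return max(candidates, key=lambda p: p.mbps) if candidates else None

best = pick_profile({"Main 3.0", "Base 3.0"}, 2.0)
print(best.num)  # profile 15 (0.700 Mbps Base 3.0)
```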
All the playlists are stored as different playlist files (for example, in m3u8 format), each located by a different URI. Set-tops and mobile clients can download the different playlist files associated with different bit rates. As playlist file content is dynamically updated on the server, the client set-top also connects to the server periodically to fetch these playlist files, checks whether they have changed, and then uses that information accordingly. The periodicity of playlist refresh at the client is described in the standard document "HTTP Live Streaming, draft-pantos-http-live-streaming-05," section 6.3.4.
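The reload schedule in that section of the draft can be sketched as a small helper: after a reload that shows a changed playlist, the client waits at least one target duration; after an unchanged reload it backs off by multiples of the target duration (0.5× for the first unchanged attempt, 1.5× for the second, 3.0× thereafter). This is a sketch of that schedule, not client code from the disclosed system.

```python
# Sketch of the client-side playlist reload schedule from
# draft-pantos-http-live-streaming, section 6.3.4.

def reload_delay(target_duration_s, unchanged_attempts):
    """Seconds to wait before the next playlist reload.

    unchanged_attempts: consecutive reloads that returned an unchanged
    playlist (0 means the last reload saw a change).
    """
    if unchanged_attempts == 0:
        return target_duration_s            # changed: wait one target duration
    multipliers = [0.5, 1.5]                # first and second unchanged attempts
    if unchanged_attempts <= len(multipliers):
        return multipliers[unchanged_attempts - 1] * target_duration_s
    return 3.0 * target_duration_s          # third and subsequent attempts

print(reload_delay(2, 0))  # 2
print(reload_delay(2, 1))  # 1.0
print(reload_delay(2, 3))  # 6.0
```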
However, problems arise, for example, in the case of DVR asset playback from an HLS server to a client. In such an example, the content is transcoded to a client-supported profile, the HLS chunks are created on the fly, and an m3u8 playlist file is generated. For smooth playback, there can be a pre-buffer of 3 transcoded chunks. This adds waiting time before the mobile client can display the first video frame. For example, if one chunk has a 2-second duration and the pre-buffer is 3 chunks, then the total added pre-buffer time is 2 × 3 = 6 seconds. When network congestion is detected, a rate switch to a lower-bitrate variant occurs, but the client does not experience the switch-down in bitrate because higher-bitrate chunks already exist in the pipeline due to the prebuffering technique described above. This worsens the network congestion: the network state is already bad, and the client must continue consuming high-bitrate content for the duration of the pre-buffer. Client-side stabilization in this case can take ~6 seconds (equal to the pre-buffer time). This is an excessive delay, during which content playing on the client device will experience macroblocking, jitter, or some other disruption.
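The arithmetic in the example above is simply chunk duration times pre-buffer depth, and the same quantity bounds how long the client keeps receiving stale high-bitrate chunks after a switch-down:

```python
# Startup delay (and worst-case stabilization time after a rate switch)
# caused by server-side prebuffering, per the example above.

def prebuffer_delay(chunk_duration_s, prebuffer_chunks):
    """Seconds of chunks queued ahead of the first displayed frame."""
    return chunk_duration_s * prebuffer_chunks

print(prebuffer_delay(2, 3))  # 6 seconds, matching the 2 x 3 example
```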
The embodiments described herein solve these problems by making use of idle transcoders as standby resources to support smooth switching to the desired bandwidth (see FIG. 3 and the associated sections, below).
Disclosed herein is a system to smooth and reduce the transition time while switching between different bitrate transcode profiles. This is achieved in a few ways, including avoiding pre-buffer time by using parallel transcode sessions and enabling a fast response to network congestion, as chunks/segments at the desired bitrate are readily available. This is a simpler approach because the client, for example an HLS client, is not aware of the bitrate switch.
The proposed technique does not demand any change in the HLS client, as the client is not aware of the pre-transcoded content and playlist file. Only the HLS server needs to manage synchronization of the pre-transcoded content and continue with further transcoding of the content.
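The standby-transcoder idea can be sketched as follows: while the active transcoder serves the current profile, the idle unit shadows it at the adjacent lower profile, so in-sync chunks at the target bitrate already exist when a switch is requested. This is a hypothetical illustration under assumed class and method names; it is not the disclosed implementation.

```python
# Hypothetical sketch of parallel transcode sessions with a standby unit.
# Names (Transcoder, HlsServer, switch_down) are illustrative only.

class Transcoder:
    def __init__(self, name):
        self.name = name
        self.profile = None
        self.ready_chunks = []

    def configure(self, profile):
        self.profile = profile
        self.ready_chunks = []

    def transcode_chunk(self, seq):
        chunk = f"{self.profile}/chunk{seq}.ts"
        self.ready_chunks.append(chunk)
        return chunk

class HlsServer:
    def __init__(self, profiles_mbps):
        self.profiles = sorted(profiles_mbps, reverse=True)  # high -> low
        self.active = Transcoder("active")
        self.standby = Transcoder("standby")
        self.current = 0
        self.active.configure(self.profiles[self.current])
        self._prime_standby()

    def _prime_standby(self):
        # Pre-configure the idle unit for the next-lower profile.
        nxt = min(self.current + 1, len(self.profiles) - 1)
        self.standby.configure(self.profiles[nxt])

    def serve_chunk(self, seq):
        # The standby shadows the active unit so its chunks stay in sync.
        self.standby.transcode_chunk(seq)
        return self.active.transcode_chunk(seq)

    def switch_down(self):
        # Swap roles: the standby already holds in-sync chunks at the new
        # bitrate, so the playlist updates with no pre-buffer wait.
        self.active, self.standby = self.standby, self.active
        self.current = min(self.current + 1, len(self.profiles) - 1)
        self._prime_standby()
        return self.active.profile

server = HlsServer([6.750, 4.500, 3.000])
server.serve_chunk(1)
server.serve_chunk(2)
print(server.switch_down())        # 4.5 -- new active profile
print(server.active.ready_chunks)  # chunks already transcoded at 4.5 Mbps
```

The key point mirrored from the text: on `switch_down()`, the new active transcoder already has chunks for the segments served so far, so the client simply fetches the updated playlist and never observes the switch.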