Media, including audio and/or video, can be captured, encoded, and streamed live over a computer network using a protocol such as HTTP Live Streaming (HLS). The HLS protocol uses a manifest containing pointers (or references) to separate segments of media data. A single live media stream may include multiple tracks that can be synchronized with each other via presentation time stamps (PTSes) embedded in the segments of audio or video data.
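As a rough illustration of the manifest structure described above, the following sketch extracts segment references and durations from a minimal HLS media playlist. The playlist text and segment names are hypothetical, and a production parser would need to handle the many additional tags defined for HLS.

```python
def parse_media_playlist(text):
    """Extract (segment URI, duration) pairs from a minimal HLS media
    playlist. Illustrative only; real playlists carry many more tags."""
    segments, duration = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # Duration precedes the URI of the segment it describes.
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((line, duration))
    return segments

# Hypothetical live media playlist: each URI points to one media segment.
playlist = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:6.0,
segment120.ts
#EXTINF:6.0,
segment121.ts
"""
```

Here `parse_media_playlist(playlist)` yields the two segment references together with their six-second durations, mirroring how a player discovers the separate segments of media data named in the manifest.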
A typical media player times playback based on a local player clock. During playback of a live stream, however, the player may have no control over the rate at which the media stream is received, so the playback rate derived from the local player clock may be adjusted in response to buffer fullness measurements of the received live stream.
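One simple way to realize such an adjustment is a proportional controller that nudges the playback-rate multiplier around 1.0 as the buffer drifts from a target level. The target level, gain, and clamping bounds below are assumptions for illustration, not values prescribed by the text.

```python
def playback_rate(buffer_level_s, target_s=6.0, gain=0.02,
                  min_rate=0.97, max_rate=1.03):
    """Map buffer fullness (seconds of buffered media) to a playback-rate
    multiplier. Buffer above target -> play slightly faster to drain it;
    buffer below target -> play slightly slower to let it refill. The
    result is clamped to keep the speed change imperceptible."""
    rate = 1.0 + gain * (buffer_level_s - target_s)
    return max(min_rate, min(max_rate, rate))
```

For example, a buffer exactly at the target yields a rate of 1.0, while a nearly empty or overfull buffer pins the rate at the lower or upper clamp, respectively.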
Many modern events are captured simultaneously by multiple cameras and/or microphones positioned at different angles or locations at the event. Separate HLS media encoders, however, may not generate PTSes that correlate to each other.
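The uncorrelated-PTS problem can be made concrete by mapping each encoder's PTS values onto a common wall-clock timeline via a per-stream anchor (for instance, a wall-clock tag in the manifest paired with the first PTS of a segment). This is a minimal sketch under that assumption; the function name, timescale, and anchor values are hypothetical.

```python
def pts_to_wallclock(pts, timescale, anchor_pts, anchor_utc):
    """Convert a stream-local PTS into a wall-clock time (seconds) using
    a per-stream anchor pairing one PTS value with a known UTC instant.
    Each encoder needs its own anchor because PTS bases do not correlate."""
    return anchor_utc + (pts - anchor_pts) / timescale

# Hypothetical anchors for two encoders with unrelated PTS bases
# (90 kHz timescale, as is common for MPEG transport streams).
cam_a = dict(timescale=90_000, anchor_pts=900_000,   anchor_utc=100.0)
cam_b = dict(timescale=90_000, anchor_pts=5_000_000, anchor_utc=100.0)
```

With these anchors, a PTS of 990,000 from camera A and a PTS of 5,090,000 from camera B both map to wall-clock time 101.0, so the two tracks can be aligned even though their raw PTS values differ wildly.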