Virtual Reality (VR) systems typically use a wired connection between a rendering engine and a Head-Mounted Device (HMD) to guarantee efficient and reliable transfer of rendered frames. Wired connections offer high bandwidth and reliable transmission; however, they limit the mobility of the user. Wireless VR systems aim to replace the wired connection with a wireless one (e.g., WiFi). Because wireless connections offer far less bandwidth than wired ones, highly efficient, low-latency compression is critical for enabling wireless VR systems.
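To see why compression is indispensable rather than optional, a back-of-the-envelope estimate helps. The figures below (4K resolution, 24 bits per pixel, a 90 Hz refresh rate, and roughly 1 Gb/s of usable WiFi throughput) are illustrative assumptions, not values from the text:

```python
# Back-of-the-envelope bandwidth estimate for an uncompressed VR stream.
# All numbers are illustrative assumptions, not measurements.
width, height = 3840, 2160      # 4K resolution (assumed)
bits_per_pixel = 24             # RGB, 8 bits per channel (assumed)
frame_rate = 90                 # common VR refresh rate (assumed)

raw_gbps = width * height * bits_per_pixel * frame_rate / 1e9
wifi_gbps = 1.0                 # rough usable WiFi throughput (assumed)
required_ratio = raw_gbps / wifi_gbps

print(f"raw stream: {raw_gbps:.1f} Gb/s, "
      f"compression ratio needed: ~{required_ratio:.0f}:1")
```

Even with generous assumptions about WiFi throughput, the raw stream is roughly an order of magnitude too large, which is why the compression must also be low latency: it sits directly on the motion-to-photon path.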
Compression is important because rendering can involve large amounts of computation. For certain real-time applications, such as VR applications, video games, and simulations, rendering must occur at very high speed. Applications may need to keep the latency between a user input and the rendering of the corresponding graphics within a certain tolerance. For example, high rendering latency in response to a user input in a VR application can degrade visual acuity and performance, breaking the user's perceived sense of presence; such breaks in presence may in turn lead to motion sickness.
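The latency tolerance above can be made concrete as a per-stage budget. The 20 ms target is a commonly cited motion-to-photon comfort threshold for VR, and the individual stage times below are illustrative assumptions, not measurements:

```python
# Illustrative motion-to-photon latency budget for remote VR rendering.
# The 20 ms target is a commonly cited comfort threshold; all stage
# times are assumed for illustration.
budget_ms = 20.0
stages_ms = {
    "input capture": 1.0,
    "uplink (input to server)": 3.0,
    "render": 6.0,
    "encode": 3.0,
    "downlink (frame to client)": 4.0,
    "decode + display": 2.5,
}

total = sum(stages_ms.values())
print(f"total {total:.1f} ms of {budget_ms:.0f} ms budget "
      f"({'within' if total <= budget_ms else 'over'})")
```

The point of such a breakdown is that every stage, including compression and transmission, must fit inside one shared budget; a slow encoder cannot be compensated for elsewhere.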
Advances in networking and increased bandwidth have made it possible to offload rendering computation from client devices to remote servers, which stream the rendered graphics back to the client. Under such a scheme, the client transmits input commands over a network, and a server renders the scene based on that input and transmits the rendered scene back to the client. Even with increased network bandwidth, however, maintaining low latency remains challenging.
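The request/response structure of such a remote-rendering scheme can be sketched as a simple loop. This is a minimal in-process simulation under assumed names (`Frame`, `render`, `remote_render_loop` are hypothetical, and the network hop is replaced by a direct call), not a definitive implementation:

```python
import time
from dataclasses import dataclass

@dataclass
class Frame:
    """A rendered frame tagged with the input sequence it answers."""
    seq: int
    data: bytes

def render(seq: int, user_input: str) -> Frame:
    # Placeholder for the server-side renderer (hypothetical);
    # a real server would rasterize a scene and encode the result.
    return Frame(seq, f"frame for {user_input!r}".encode())

def remote_render_loop(inputs, budget_ms=20.0):
    """Simulate the client -> server -> client round trip, checking
    each response against a per-frame latency budget (assumed 20 ms)."""
    frames, within_budget = [], []
    for seq, user_input in enumerate(inputs):
        t0 = time.perf_counter()
        frame = render(seq, user_input)   # stands in for the network hop
        latency_ms = (time.perf_counter() - t0) * 1000.0
        frames.append(frame)
        within_budget.append(latency_ms <= budget_ms)
    return frames, within_budget
```

In a real deployment the `render` call would cross the network twice (input up, frame down), which is exactly where the latency challenge described above arises.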