Within the field of computing, many scenarios involve a headset that presents a visual to a user, where the visual is rendered to translate the motion of the user into a corresponding motion of the visual. For instance, a display mounted within a virtual reality helmet may project a three-dimensional rendering of a scene from a particular perspective, where the motion of the user (e.g., tilting or turning the head; ducking or jumping; or stepping forward, backward, left or right) may cause a scene renderer to render the scene from an altered perspective corresponding to the user's motion.
In such scenarios, many users perceive a lag between a motion and the corresponding re-rendering of the scene. For example, even where a headset maintains a generally consistent framerate of 100 Hz, a perceivable delay of approximately ten milliseconds arises between the user's motion and the corresponding translation within the scene, and often longer if the scene renderer misses a frame or is unable to incorporate a detected motion into the very next frame. The user's perception of this delay may give rise to vertigo and/or motion sickness, which may considerably impair the user's enjoyment of the headset. These effects may be exacerbated in some cases; e.g., a rapid acceleration, such as quickly turning the head, may amplify the discrepancy between the user's motion and the perceived scene.
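The relationship between framerate and perceived delay can be sketched as follows; this is a simplified illustrative model (the function name and the assumption that a motion must wait for the next full frame are hypothetical), not a description of any particular headset's pipeline:

```python
def motion_to_photon_latency_ms(framerate_hz: float, missed_frames: int = 0) -> float:
    """Approximate delay (in milliseconds) between a detected motion and
    the next displayed frame that can reflect it, assuming the motion
    arrives just after the current frame's render deadline and therefore
    waits for at least one full frame period, plus any missed frames."""
    frame_time_ms = 1000.0 / framerate_hz
    return frame_time_ms * (1 + missed_frames)

# At 100 Hz, one frame period is 10 ms:
print(motion_to_photon_latency_ms(100))                   # 10.0
# A single missed frame doubles the delay:
print(motion_to_photon_latency_ms(100, missed_frames=1))  # 20.0
```

Under this simplified model, the roughly ten-millisecond delay noted above corresponds to a single 100 Hz frame period, and each missed frame adds another full period.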