The present disclosure generally relates to tracking eye movement, and specifically to using time of flight to determine eye tracking information based on a calibrated model of the path of light through a user's eye.
Augmented reality (AR), mixed reality (MR), and virtual reality (VR) systems may accept various forms of input, such as input from the user's hands, input devices, and so on. In addition, an AR/MR/VR system may receive input based on eye tracking information, which may indicate, for example, where the user's eye is gazing. Such information may be used in a gaze-contingent interface. However, current eye tracking systems may require complex optical sensors and computationally intensive methods to analyze and detect corneal reflections and other features in captured images of the user's eye. Moreover, such systems do not provide direct information regarding an eye's accommodation, and instead infer the eye's accommodation indirectly from a determined vergence of the eyes.
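To illustrate the indirect inference mentioned above, the sketch below estimates fixation distance from the vergence angle between the two eyes' gaze rays; accommodation is then assumed to match that distance. The function name, the interpupillary distance, and the angle values are illustrative assumptions, not taken from the disclosure, and the symmetric-convergence geometry is a simplification.

```python
import math

def fixation_depth_from_vergence(ipd_m: float, vergence_deg: float) -> float:
    """Estimate fixation distance (meters) from the vergence angle between
    the two gaze rays, assuming the eyes converge symmetrically.

    Geometry: the two gaze rays and the interpupillary baseline form an
    isosceles triangle, so depth = (ipd / 2) / tan(vergence / 2).
    """
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# Illustrative values: a 0.063 m interpupillary distance and a 3.6 degree
# vergence angle place the fixation point at roughly 1 m.
depth = fixation_depth_from_vergence(0.063, 3.6)
```

A vergence-based estimate of this kind degrades as fixation distance grows, since the vergence angle shrinks toward zero and small gaze-measurement errors produce large depth errors, which is one reason a direct measure of accommodation is desirable.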