Presbyopia is an age-related condition, akin to farsightedness, caused by a loss of elasticity in the lens of the eye. This loss of elasticity reduces an adult's ability to accommodate, or focus on, near objects. Children typically have about 20 dioptres of accommodation, allowing them to focus on any object from roughly 50 mm from the eye out to infinity. By age 50, most adults retain only about 2 dioptres. This loss of accommodation generally means that adults require some form of visual correction, such as reading glasses, to focus on near objects: they must put on reading glasses to view near objects and remove them to view far objects. Adults who also require glasses to correct myopia (nearsightedness, the inability to focus on far objects) must switch between two pairs of glasses depending on the depth of their gaze. This is a cumbersome way to cope with presbyopia combined with myopia or hyperopia. Users would benefit greatly from eyewear that adjusts automatically to focus on near and far objects without requiring manual input from the user.
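The dioptre figures above follow from the reciprocal relationship between accommodation amplitude (in dioptres) and near point (in metres). A minimal sketch of that arithmetic, assuming the eye is otherwise emmetropic:

```python
def near_point_mm(accommodation_dioptres: float) -> float:
    """Nearest focusable distance in millimetres: the near point is the
    reciprocal of the accommodation amplitude (1 dioptre = 1/metre)."""
    return 1000.0 / accommodation_dioptres

# A child with ~20 D of accommodation can focus as near as 50 mm:
print(near_point_mm(20.0))  # 50.0
# A 50-year-old with ~2 D can focus no nearer than about half a metre:
print(near_point_mm(2.0))   # 500.0
```

This is why a 2-dioptre adult can no longer read comfortably at a typical 300 to 400 mm reading distance without corrective lenses.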
Several eyewear products have been developed to help adults focus on both near and far objects using a few different kinds of lenses. Adlens offers a pair of glasses with manually tunable Alvarez lenses that the user adjusts by twisting a knob on each lens. Pixeloptics developed glasses that let the user switch manually between two corrections (one for farsightedness and one for nearsightedness) by pressing a button, as well as a product that uses an accelerometer so the user can switch between near and far prescriptions by moving their head. Eyejusters likewise produced eyewear with manually focused Alvarez lenses. Adlens also developed eyewear with a membrane lens offering continuous focus, which still requires the user to adjust a knob on the glasses in order to focus on near or far objects. None of these technologies provides automatic, continuous focus adjustment; all rely on the user to engage the focus mechanism.
For eyewear to adjust a continuous-focus lens automatically, it must observe the eyes and determine the depth of the user's gaze. The relative positions of the pupils are one example of a feature of the eye that can be used to determine depth of gaze. Sorensen proposed using a neural network to process reflections from the eye to provide an at least partially in-focus image in a display screen (U.S. Pat. No. 5,861,936). The current disclosure uses cameras and a computer controller to track the pupils and determine the depth of gaze using a technique known as structured light. Gersten proposed using structured light to display the edge of the pupil on a corneal topography map (U.S. Pat. No. 5,214,456). Raffle et al. suggest that structured light could be used to detect the direction of a user's gaze in their Google Glass application (U.S. Pat. No. 8,955,973). The current disclosure is not concerned with the direction of the user's gaze; instead, it uses structured light to determine the depth of the user's gaze by comparing structured light scans of both eyes many times per second.
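To illustrate how relative pupil positions can yield depth of gaze, the sketch below estimates fixation distance from binocular vergence. This is an illustrative geometric model only, not the structured-light method of the disclosure; the function name, the assumption of a midline fixation point, and the 64 mm interpupillary distance are all hypothetical choices for the example.

```python
import math

def gaze_depth_m(ipd_m: float, left_angle_rad: float,
                 right_angle_rad: float) -> float:
    """Estimate gaze depth from the inward (vergence) rotation of each eye.

    Each angle is that eye's rotation toward the nose, measured from
    straight ahead. Assumes the fixation point lies on the midline
    between the eyes, so each eye contributes half the total vergence.
    """
    half_vergence = (left_angle_rad + right_angle_rad) / 2.0
    # Right triangle: half the interpupillary distance is the opposite
    # side, gaze depth is the adjacent side of the half-vergence angle.
    return (ipd_m / 2.0) / math.tan(half_vergence)

# With a 64 mm interpupillary distance, an inward rotation of
# atan(0.08) ~ 4.6 degrees per eye places the fixation point at a
# typical reading distance of 0.4 m.
print(gaze_depth_m(0.064, math.atan(0.08), math.atan(0.08)))  # 0.4
```

Repeating such an estimate many times per second, as the disclosure does with structured light scans of both eyes, is what allows the lens focus to follow the user's gaze depth without manual input.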