Virtual reality can be viewed as a computer-generated simulated environment in which a user has an apparent physical presence. A virtual reality experience can be generated in 3D and viewed with a head-mounted display (HMD), such as glasses or another wearable display device that has near-eye display panels as lenses to display a virtual reality environment, which replaces the actual environment. Augmented reality, in contrast, allows a user to see through the display lenses of the glasses or other wearable display device to view the surrounding environment, while also seeing images of virtual objects that are generated for display and appear as part of that environment. Augmented reality can include any type of output, such as audio and haptic feedback, as well as virtual images, graphics, and video, that enhances or augments the environment a user experiences. As an emerging technology, augmented reality presents many challenges and design constraints, from generating virtual objects and images that appear realistic in a real environment, to developing optics small and precise enough for implementation in a wearable display device.
A waveguide display can be implemented in a wearable display device as a near-eye display panel. However, conventional waveguide displays are limited in field of view because of the limited range of angles at which light can propagate down the waveguide and because of the fixed mechanisms that couple light into and out of the waveguide. One technique for widening the field of view is to stack two waveguides on top of, or next to, each other with an airspace between the two waveguides. However, this technique requires separate collimation optics for each waveguide, which adds complexity and bulk to the display system. The collimation optics for each waveguide typically add too much bulk for the waveguide display to be implemented as the lenses of a wearable display device.
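The field-of-view limit tied to the range of propagation angles can be illustrated with the critical-angle condition for total internal reflection, which governs how steeply light can travel down a waveguide. The sketch below is illustrative only: the function name and the 80° grazing-angle bound are assumptions for this example, not values from the source.

```python
import math

def tir_angle_range(n_waveguide: float, theta_max_deg: float = 80.0):
    """Return (critical_angle_deg, usable_range_deg) for a waveguide.

    Light propagates along a waveguide by total internal reflection (TIR)
    only when its internal angle of incidence (measured from the surface
    normal) exceeds the critical angle arcsin(1/n). A practical upper
    bound (theta_max_deg, an assumed value here) keeps rays from grazing
    the surfaces, so the usable angular window, which constrains the
    field of view the waveguide can carry, lies between the two angles.
    """
    theta_c = math.degrees(math.asin(1.0 / n_waveguide))
    return theta_c, theta_max_deg - theta_c

# Example: common glass (n = 1.5) versus a higher-index material (n = 1.8).
# A higher index lowers the critical angle and widens the usable window,
# one reason the field of view of a single glass waveguide is limited.
for n in (1.5, 1.8):
    theta_c, usable = tir_angle_range(n)
    print(f"n = {n}: critical angle = {theta_c:.1f} deg, "
          f"usable range = {usable:.1f} deg")
```

For n = 1.5 the critical angle is about 41.8°, leaving roughly a 38° internal window under the assumed 80° bound, which motivates techniques, such as the stacked-waveguide approach above, that try to carry a wider field of view.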