Smart TVs, smart appliances, and devices implementing Virtual Reality (VR) and Augmented Reality (AR) are all becoming increasingly popular. Key to their success is an easy-to-use user interface for controlling the device. Currently, however, such devices lack an easy-to-use user interface.
Smart TVs are currently cumbersome to control, requiring the user to navigate through various menus. Many smart appliances require users to manually launch smartphone applications and click through pages to control the appliance, which is even more cumbersome than flipping on/off switches. VR and AR provide an immersive experience and open the door to new ways of training, education, meetings, advertising, travel, health care, emergency response, and scientific experimentation. However, the current user interfaces of devices implementing VR/AR are rather limited: they rely on tapping, swiping, voice recognition, or steering the camera towards the hand so that the hand remains within the view and line-of-sight of the camera while the user wears the headset.
Recently, research has been conducted on motion tracking as a means of controlling such devices. However, despite some progress, achieving highly accurate and responsive device-free tracking on commodity hardware remains an open challenge. According to game and application developers, sub-centimeter accuracy and a response time within 16 ms (roughly one frame at a 60 Hz refresh rate) are required to provide a good user experience. This is especially challenging to achieve on a commodity device, such as a smartphone, given its limited processing power and lack of specialized hardware.
Hence, there is currently no means for easily interacting with and controlling devices, such as smart devices and devices implementing VR/AR, via highly accurate and responsive device-free tracking.