The present disclosure generally relates to sensor devices, and specifically relates to a tracking sensor integration system configured to estimate a pose of a user's hand for use in artificial reality systems.
An artificial reality system is a simulated environment created by computer technology and presented to a user, such as through a head-mounted display (HMD) system. Typically, an HMD system includes an HMD headset that provides visual and audio information to the user. Conventional HMD systems create virtual hands of the user in the simulated environment and use a hand tracking system to track motion and positions of the user's hands. However, many conventional hand tracking systems are based on optical systems, and such systems may not capture accurate poses of a user's hands.
To estimate a pose of a user's hand with high accuracy, an optical estimate may be fused with a stretch sensor estimate, and possibly with estimates obtained by other sensors such as an inertial measurement unit (IMU), wherein these various types of tracking sensors are attached to an instrumented (fabric) glove or some other wearable garment placed on the user's hand. To obtain an accurate estimate of the pose of the user's hand using sensor data acquired by the various tracking sensors attached to the instrumented glove, the attached sensors may first need to be calibrated before any estimation algorithm is applied to the acquired sensor data. However, this calibration can be challenging for a number of reasons, including sensor nonlinearity and drift, shifting of the instrumented glove on the hand, ground truth data that is noisy (or invalid at some time frames), and so on.
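The fusion described above can be illustrated with a minimal sketch. The disclosure does not specify a particular fusion algorithm; the example below assumes a simple inverse-variance weighted combination, where each tracking sensor (e.g., optical, stretch) contributes a joint-angle estimate together with a per-joint variance reflecting its confidence, and lower-variance sensors receive proportionally higher weight. The function name and data layout are illustrative, not taken from the disclosure.

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of per-sensor joint-angle estimates.

    estimates: sequence of joint-angle vectors, one per sensor (e.g., degrees)
    variances: sequence of per-joint variances, one per sensor
    Returns (fused joint-angle vector, variance of the fused estimate).
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances                      # lower variance -> higher weight
    fused_var = 1.0 / weights.sum(axis=0)          # variance of the fused estimate
    fused = fused_var * (weights * estimates).sum(axis=0)
    return fused, fused_var

# Example: fuse an optical estimate with a stretch-sensor estimate of one joint angle.
# The stretch sensor is assumed more confident (variance 1.0 vs. 4.0), so the
# fused angle lands closer to its reading.
optical = [30.0]   # degrees, noisy optical estimate
stretch = [34.0]   # degrees, stretch-sensor estimate
fused, var = fuse_estimates([optical, stretch], [[4.0], [1.0]])
# fused[0] == 33.2, var[0] == 0.8
```

A full system would typically replace this static weighting with a recursive filter (e.g., a Kalman filter) so that the per-sensor variances can adapt over time to the drift and glove-shift effects noted above.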