Smart devices, including smartphones, mobile phones, tablet computers, and the like, have become pervasive. Wearable devices, such as smartwatches, fitness bands and monitors, action cameras, and the like, have likewise become increasingly popular. These wearable devices often include only a very small touchscreen for interacting with the device. Users must accurately touch the correct user interface (UI) elements or icons, which are often spaced closely together, and/or swipe the interface several times to locate and launch an application. Further, some of these devices include no touchscreen or user interface at all. As a result, the user experience on these small wearable devices may be degraded by their confined Human Machine Interface (HMI).

Existing hardware and software solutions for sensing user input include push buttons, voice controls, and gesture controls. These solutions, however, may suffer several disadvantages, including limited states (e.g., only ON and OFF states for hardware push buttons), complex and expensive interfaces (e.g., gesture and voice sensing require substantial computing power and costly sensors), and unfashionable appearance (e.g., protruding hardware is not integrated, stylish, or compatible with wearable devices). Simply put, conventional small wearable devices, such as smartwatches, having confined touchscreens and/or user interfaces may not be optimally useful to the wearer (i.e., they may be inaccurate, less user-friendly, unintegrated, and incompatible).