Traditionally, a virtual machine (VM) hypervisor includes an emulation capability for user input devices, such as a mouse and a keyboard. This emulation gives the VM the impression that it is attached to and controlling actual physical input devices, when in fact those devices are attached to the physical client device from which the end user logs into the VM. The VM itself can be either local or remote to the physical client device, but is most likely remote.
Touch screen devices (also known as touch devices) typically do not have physically separate user input devices. Instead, they have the capability to translate an end user's touches on the device's screen into input commands similar to key presses or mouse input. Furthermore, touch devices are increasingly used to view sessions running on VMs.
Currently, the on-screen keyboard provided by the operating system (OS) of the touch device serves as the input device for the VM. Any key press indications or other touch commands received on the touch screen are converted into mouse and keyboard events, and these events are forwarded to the VM. The hypervisor managing the VM then uses this event information for the VM's emulation function.
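The conversion and forwarding path described above can be sketched roughly as follows. This is an illustrative outline only; the names (`TouchEvent`, `to_input_event`, `forward_to_vm`) are hypothetical and do not correspond to any specific hypervisor or touch device API.

```python
from dataclasses import dataclass


@dataclass
class TouchEvent:
    """A simplified touch event from the device's touch screen (illustrative)."""
    kind: str       # "key" for an on-screen keyboard press, "tap" for a screen touch
    x: int = 0
    y: int = 0
    key: str = ""


def to_input_event(touch: TouchEvent) -> dict:
    """Convert a touch event into a generic keyboard or mouse event."""
    if touch.kind == "key":
        # An on-screen keyboard press becomes a key press event.
        return {"type": "keydown", "key": touch.key}
    # Any other touch becomes a mouse click at the touched coordinates.
    return {"type": "mousedown", "button": "left", "x": touch.x, "y": touch.y}


def forward_to_vm(events, send):
    """Forward translated events toward the hypervisor's emulated input devices."""
    for ev in events:
        # In practice, `send` would write to the remoting channel of the VM
        # session; the hypervisor consumes the events for device emulation.
        send(ev)
```

A key press on the on-screen keyboard thus reaches the VM as an ordinary keyboard event, indistinguishable from one produced by a physical keyboard.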
Unfortunately, this setup presents some difficulties. When an end user views and interacts with a VM via the touch screen device, the VM is presented in its own display window, with the contents of that window controlled by the VM itself. The on-screen keyboard supplied by the touch device OS cannot be displayed within this VM window, because the window is not controlled or provided by the touch device OS.
As a result, if the VM window is launched at full screen size, there is no room to display the on-screen keyboard provided by the touch device OS. The VM window can be resized to allow display of both the VM window and the keyboard, but this is a cumbersome process and often does not result in an aesthetically pleasing view on the touch screen.
Another problem is that some gestures do not translate well to mouse and keyboard events. Such gestures are normally interpreted by the touch device itself and are not forwarded to the VM. For example, a zoom gesture would be interpreted as zooming the entire VM session window, when the user's reasonable expectation may be to zoom only a specific application window within the VM session window.
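The routing problem just described can be made concrete with a minimal sketch. The premise, taken from the text above, is that the touch device OS consumes certain gestures (such as pinch-to-zoom) locally rather than forwarding them, so the entire session window is affected instead of the application inside it. The function and set names here are illustrative assumptions, not part of any real touch OS API.

```python
def route_gesture(gesture: str, forward_gestures: set) -> str:
    """Decide whether a gesture is consumed locally or forwarded to the VM.

    `forward_gestures` is the (hypothetical) set of gestures the client is
    willing to pass through to the VM session.
    """
    if gesture in forward_gestures:
        # Forwarded gestures reach the VM, which can apply them to the
        # specific application window inside the session.
        return "forward-to-vm"
    # Everything else is interpreted by the touch device OS itself, so a
    # zoom gesture scales the whole VM session window.
    return "handle-locally"
```

With a typical forward set that excludes pinch-to-zoom, `route_gesture("pinch-zoom", {"tap", "swipe"})` yields `"handle-locally"`, which is exactly the mismatch with user expectations described above.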