A haptic device allows a user to interact with a computer via the sense of touch by simulating and rendering contact forces with virtual objects. A typical interface takes the form of a miniature robot arm with sufficient linkages to allow a 2D or 3D workspace (FIG. 1). Motors or brakes selectively constrain the motion of the links to represent interactions with physical objects.
The user interacts with the device via a gripper or stylus attached at the end-effector of the movable framework and typically moves a displayed graphical object across a computer screen. The choice of displayed object depends on what is being simulated and on the device's capabilities. For the purposes of this invention, one can define an avatar as a virtual representation of the user through which physical interaction with the virtual environment occurs. For instance, a surgical tool may be thought of as an avatar when simulating an intervention. A cursor, on the other hand, can be used to represent the position of the mouse being held by the operator. The operator controls the avatar's position inside the virtual environment. When contact takes place between the user's avatar and the virtual objects, action and reaction forces occur. Such forces are regulated by the type of contact supported by the avatar and by its geometry.
A haptic device is typically used as a position control device in which displacement of the end-effector is directly correlated to displacement of the avatar displayed on the screen. This displacement correlation may not be a one-to-one correspondence, since the avatar position may be scaled according to a constant mapping from the device position. For example, the device may be moved a distance of one centimeter which causes the controlled avatar to move five centimeters across the screen. In general, small displacements of the device are scaled to large motions of the avatar to allow the operator to easily reach targets in all areas of the virtual workspace environment displayed onto the computer screen.
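The constant mapping described above can be expressed as a simple scaling function. The following is a minimal sketch; the scale factor, the one-dimensional coordinates, and the function name are illustrative assumptions, not part of any particular device API.

```python
# Hypothetical sketch: constant position mapping from device to avatar.
# SCALE is an assumed example value matching the 1 cm -> 5 cm case above.

SCALE = 5.0  # 1 cm of device travel -> 5 cm of avatar travel

def avatar_position(device_pos_cm, origin_cm=0.0):
    """Map a 1-D device position to the corresponding avatar position."""
    return origin_cm + SCALE * (device_pos_cm - origin_cm)

# Moving the device 1 cm moves the avatar 5 cm across the screen:
assert avatar_position(1.0) == 5.0
```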
The scaled avatar movement scheme works well for coarse motion, when large distances inside the virtual workspace need to be traversed to bring the avatar from one global area to another. Accuracy of the avatar motion is not critical for coarse motion, but for tasks in which accurate positioning of the avatar is needed, the large scaling of device movement to avatar movement makes a target acquisition task physically impossible for the user.
Ballistic tracking is typically used to alleviate the scaling problem for fine positioning of the controlled object. Ballistics refers to the technique of varying the scaling between the motion of a physical device and the motion of a displayed avatar depending upon the velocity of the device in its workspace. The assumption is that if the user is moving the interface very quickly, the user is likely to be performing a “coarse motion” task inside the virtual environment, and therefore the device controller scales small motions of the interface to large motions of the avatar. Conversely, if the user is moving the device very slowly, the user is likely to be performing a fine positioning task on the screen, and the controller scales small motions of the device to small motions of the avatar.
When using ballistics, moving the device in one direction quickly and then moving it back in the other direction slowly may create a situation where the physical device has returned to its starting position but the avatar is positioned far away from its starting position. This illustrates that the frame of the avatar and the frame of the device have shifted or become offset. If this offset becomes too large, the user may not be able to reach some parts of the virtual workspace within the range of motion of the device.
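The offset described above can be demonstrated numerically. This is a hypothetical example with assumed gains and speeds, chosen only to make the frame shift visible.

```python
# Hypothetical demonstration of frame offset under ballistics: move the
# device +1 cm quickly, then -1 cm slowly. The device returns to its
# starting position; the avatar does not. Gains and threshold are assumed.

def gain(speed_cm_s):
    return 5.0 if speed_cm_s > 2.0 else 1.0  # fast: 5x, slow: 1x (assumed)

device = avatar = 0.0
for delta_cm, speed_cm_s in [(+1.0, 10.0), (-1.0, 0.5)]:
    device += delta_cm
    avatar += gain(speed_cm_s) * delta_cm

# device == 0.0, but avatar == 4.0: the two frames are now offset by 4 cm.
```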
In a typical, open-workspace interface, the offset is corrected through a process called "indexing." Indexing is achieved in a typical mouse interface by lifting the mouse off the table and repositioning it after the mouse has reached the edge of its available workspace, while the cursor remains fixed in position. However, most force feedback devices are grounded to their base and require the use of an additional input device, such as a user switch, to inform the controller to uncouple the device from the avatar and let the operator reposition the device at the center of its physical workspace. Unfortunately, with limited-workspace devices, indexing becomes cumbersome and significantly interferes with the operator, who must constantly perform the offset correction.
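The switch-based indexing described above amounts to a clutch in the device-to-avatar mapping: while the switch is held, the avatar freezes and an offset absorbs the device motion. The following is a minimal sketch under assumed names and a one-dimensional constant-scale mapping; it is not the API of any real device.

```python
# Hypothetical sketch of indexing via a user switch: while the switch is
# held, the device is uncoupled from the avatar so the operator can
# re-center the device. Class and parameter names are illustrative.

class IndexedMapping:
    """1-D device-to-avatar mapping with a clutch for indexing."""

    def __init__(self, scale=5.0):
        self.scale = scale   # assumed constant device-to-avatar scaling
        self.offset = 0.0    # avatar-frame offset adjusted while indexing
        self.avatar = 0.0

    def update(self, device_pos, switch_held):
        if switch_held:
            # Uncoupled: the avatar holds position, and the offset tracks
            # the device motion so there is no jump when the switch releases.
            self.offset = self.avatar - self.scale * device_pos
        else:
            self.avatar = self.offset + self.scale * device_pos
        return self.avatar
```

For example, after moving the avatar to one edge of the screen, the operator can hold the switch, re-center the device, release the switch, and continue from the same avatar position with the full device workspace available again.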
Since ballistics requires indexing to correct the frame offset, and since ballistics and indexing are both traditional mouse techniques that conflict with typical grounded haptic devices, a more transparent solution is needed that resolves both the ballistics and the indexing problems in force feedback interface devices without interfering with the operator.