Virtual Reality (VR) technology has spread to many different applications in the past few years. This spread has been accelerated by the availability of devices that easily allow interaction with virtual environments. These new devices range from stereo visual displays and passive tracking devices to more active mechanisms that involve force feedback. Force feedback devices, which engage the user's sense of touch, the haptic sense, allow for greater realism and performance when completing tasks. Feeling contact between two objects provides much stronger cues to the user than visual feedback alone can produce. Haptic interfaces display forces to the user's hand through motors or actuators, thereby mimicking the feel of physical objects. These devices, in conjunction with conventional visual displays, allow the creation of many different types of simulators, from a typical block-in-hole problem to surgical simulations. The devices can also be used for entertainment in video games and for scientific exploration by examining haptic models of remote environments, which could range from atomic-scale molecular models to planetary-scale virtual maps. Haptic interfaces can also serve as general masters in teleoperation systems, where the user controls a machine at a remote site and feels sensations generated by the interface based on what the remote machine experiences.
In general, a haptic interface system consists of an apparatus which physically senses the position of, and applies forces to, the user, and computational hardware which determines the user's position, performs simulation tasks, and computes the forces to feed back to the user so that he or she can feel the result of moving the on-screen object into contact with, and sometimes into, another on-screen object.
The computational hardware can also contain the simulation of the visual display, so as to maintain synchronism between the haptics and the graphics subsystems.
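The sense-simulate-actuate cycle described above can be sketched in code. The following is an illustrative sketch only, not taken from this document: the one-dimensional spring (penalty) model, the function names, and the stiffness value are all hypothetical choices used to show the shape of a haptic servo loop.

```python
def penalty_force(position, wall_x=0.0, stiffness=800.0):
    """Reaction force (1-D) pushing the user out of a virtual wall.

    If the device tip has penetrated the wall (position < wall_x),
    apply a spring force proportional to the penetration depth;
    otherwise render no force. Stiffness is an illustrative value.
    """
    penetration = wall_x - position
    return stiffness * penetration if penetration > 0 else 0.0


def haptic_step(read_position, apply_force):
    """One iteration of the haptic loop: sense, simulate, actuate."""
    position = read_position()        # sense the user's position
    force = penalty_force(position)   # simulation determines the force
    apply_force(force)                # motors/actuators render it
    return force
```

In a real system this loop would run at a high, fixed rate, with the graphics subsystem updated less frequently but kept synchronized, as the passage above notes.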
There are a handful of commercially available haptic mechanisms which vary greatly in design, ranging from simple 2 degree of freedom (DOF) computer-mouse-type designs to a 7 DOF, cable-driven mechanism. Both of these extremes utilize low-friction, low-inertia mechanisms. Other devices are built around standard robot mechanisms, with sensors to detect movement so that they can perform inertia compensation to reduce the perceived mass of the device. Some mechanisms include force sensors in the tool handle to sense the forces felt by the user, while others forego such sensors to simplify the design and reduce cost.
3 DOF devices provide the most generality in workspace without limiting the user to a planar surface, but they are limited to rendering force at a single point, usually the tip of a device or tool, which is inadequate for many simulations. This is because the shank of the tool may contact an obstacle or another portion of the body it is inserted into, which causes a torque to be applied to the tool. Prior 3 DOF devices have no way of simulating this torque, so the user cannot feel its effect. Devices like the aforementioned 7 DOF haptic interface are very complex and have limited mechanical stiffness.
By way of background, minimally invasive surgical simulation, where tools are inserted into the body through small incisions and the procedure is observed on a video monitor, often requires more than 3 DOF so that an individual can feel all of the effects on the tool as it is inserted into the body. This includes, for instance, the cantilevering of the shank of a probe on a tendon as the point of the scalpel proceeds toward its intended position.
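The torque produced by such shank contact, which a tip-only 3 DOF device cannot render, follows directly from the cross product of the lever arm and the contact force. The sketch below is purely illustrative and not from this document; the contact location, force, and coordinate choices are hypothetical.

```python
def cross(a, b):
    """Cross product of two 3-vectors given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Hypothetical scenario: an obstacle (e.g. a tendon) contacts the tool
# shank 10 cm along the shaft (x axis) from the handle and pushes the
# shaft upward (+z). The resulting torque about the handle is r x F.
r = (0.10, 0.0, 0.0)   # lever arm from handle to contact point, metres
F = (0.0, 0.0, 2.0)    # contact force from the obstacle, newtons
torque = cross(r, F)   # torque about the y axis felt at the handle
```

A 3 DOF device can apply only the translational force F at the handle; the torque component is exactly what is lost, which is why the trainee cannot feel the cantilevering effect.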
There is therefore a need for a device which can simulate the forces felt by both the tip and the shaft of a medical instrument inserted within the body.
More particularly, in the field of surgical simulation, there is a need to provide a trainee with the feel of the instrument as he or she uses it in a simulated environment. Here a training scenario may involve manipulating a simulated scalpel on-screen, in which, for instance, a patient's knee is presented. Through a so-called voxel representation of the knee, various parts of the knee can be given densities and penetration characteristics resembling those of a real knee.
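A voxel representation of the kind mentioned above can be thought of as a 3-D grid of density values that the simulator samples at the tool position to decide how strongly the tissue resists penetration. The following toy sketch is an assumption-laden illustration, not the document's method; the density values and grid layout are made up.

```python
# Hypothetical relative densities for three tissue types (illustrative).
SKIN, TENDON, BONE = 0.2, 0.6, 1.0

def make_knee_volume(n=8):
    """Build a toy n x n x n voxel grid: skin shell, tendon layer,
    and a single bone voxel at the centre (purely illustrative)."""
    grid = [[[SKIN for _ in range(n)] for _ in range(n)] for _ in range(n)]
    for x in range(2, n - 2):
        for y in range(2, n - 2):
            for z in range(2, n - 2):
                grid[x][y][z] = TENDON
    c = n // 2
    grid[c][c][c] = BONE
    return grid

def density_at(grid, x, y, z):
    """Look up the voxel density at integer coordinates, clamping
    out-of-range indices to the grid boundary."""
    n = len(grid)
    clamp = lambda v: max(0, min(n - 1, v))
    return grid[clamp(x)][clamp(y)][clamp(z)]
```

In a simulator, the sampled density would scale the reaction force, so pushing the scalpel tip from skin into tendon and then bone would feel progressively stiffer.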
In the past, such virtual reality representations have been used to permit the generation of forces on the tip of the instrument used to drive the virtual reality instrument, so as to give the trainee the feeling of actually penetrating the given part of the anatomy. However, 3 DOF haptic devices, called Phantoms, cannot give the trainee the sensation of what is happening when not only the probe tip touches a given part of the anatomy, but the tool shaft also touches part of the anatomy as the tool is manipulated.
In short, what is missing is a way to impart the sensation of the tool being cantilevered on some structure or obstacle removed from the tip of the tool, as would be the case of the tool passing over a ligament on its way to the intended position of the probe tip.