The current invention makes a range of medical simulations more realistic to the user. In virtual reality simulations, the user becomes immersed in a "world" artificially constructed from computer graphics built on polygon frames: a three-dimensional visual representation that changes with the user's view as he or she moves within that world. In simple simulations, the user merely navigates around objects and through the virtual environment. In more complex medical simulations, the user is required to manipulate objects within the virtual environment, changing their places within that artificial, visual domain. When the virtual experience is supplemented with physical medical instruments as the interface device, realism is far more difficult to achieve, but that realism, when achieved, extends to the very hands of the user.
In the most complex virtual reality simulations, such as medical simulations, the user can grasp and deform anatomical objects with instruments which are virtual within the computer-created world, but which--as interfaces--are the handles of the instruments a doctor might use within that anatomy when performing procedures. This obviously requires sensors of geometric position in the interface. In the highest level of virtual reality simulator, which also senses interface position geometry, the user experiences not only instrument tactility, but force feedback from the instrument and from the organ seen in the virtual environment. In some cases, feedback to the hand has been effectuated through a special glove. Thus, in the highest level of medical simulation, the visual and haptic displays must work in coordination.
Other technical elements in the creation of the virtual simulation include whether the user is semi-immersed or fully immersed. In a semi-immersed situation, the user operates within the virtual world by using some sort of interface--a mouse or a touchpad, voice commands and, at times, mock instruments which electronically serve as interfaces to the virtual reality. In this semi-immersive virtual reality, the user is watching a remote virtual world--such as one which would normally be observed on a CRT screen in minimally invasive surgery, flexible scope endoscopy, or angioplasty. The essence of realism in the semi-immersive interaction is that, in the normal circumstance, the user would be performing from a position remote from the event, as a physician is in minimally invasive surgeries performed within an enclosed cavity of the body.
In a partially immersed situation, the medical user in a simulation may be looking at a CRT screen, either in two dimensions, as is most usual, or in three dimensions with stereoscopic glasses. The instrument he or she uses is in full, direct view and is used remote from the display, as a mouse or a joystick would be. On the screen the user may see the effect of navigating through the virtual reality with the instrument--if it is a simulation of a medical implement with a camera attached, such as a flexible endoscope--or, if the camera is one of several implements in the real medical procedure, the user may see part or all of the implement he is manipulating offscreen as an onscreen instrument that matches his offscreen motions. Magnetic tracking devices can be used to portray the instrument in hand as the instrument onscreen, and can even be used to make simple manipulations or deformations of the computer graphics onscreen, as when virtual tissue is grasped, cut, stapled or otherwise deformed in the virtual graphic setting. In this prior art, the instrument is not able to create tactile feedback from itself--as in forming a staple--or force feedback from the virtual anatomy, as in grasping or cutting the virtual tissue. For the creation of force feedback, a resistance-creating device is necessary. If the instrument is built into this device, then the procedure is confined to those instruments which are part of resistance-creating devices, which may be somewhat limiting to those surgeons who, in a complex operation, may want to use several tools. In the case of partial immersion, prior art has solved this problem by having instruments "dock" into a force-feedback-creating--or "haptic"--device.
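The tracking arrangement described above can be sketched in a few lines. The following is an illustrative assumption of how a raw magnetic-tracker reading might be mapped through a fixed calibration transform into virtual-scene coordinates; the function name, offset, and scale are hypothetical, not drawn from any particular tracker's interface.

```python
# Hedged sketch: map a magnetically tracked handle's raw (x, y, z)
# reading into the coordinates of the onscreen instrument. The
# calibration offset and scale are illustrative assumptions.

def tracker_to_virtual(raw_mm, offset_mm=(0.0, 0.0, 0.0), scale=1.0):
    """Map a raw (x, y, z) tracker reading into virtual-scene units
    by subtracting a calibration offset and applying a uniform scale."""
    return tuple(scale * (r - o) for r, o in zip(raw_mm, offset_mm))
```

Calling `tracker_to_virtual` once per tracker sample keeps the onscreen instrument matching the offscreen motions of the handle.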
Several of these devices are known. In all of the known devices and proposals of prior art, one of two conditions is present: (1) where the tactility and specific action of the instrument itself is necessary, that instrument in current simulators is built into the haptic device, and so no selection is possible; or (2) where a selection of instruments is possible, the instruments are usually limited to those which interact with the tissue in very basic ways--such as a scalpel cutting as opposed to scissors cutting, a needle for sewing, or a probe moving the tissue rather than a grasper holding the tissue--and most of the more complex tactile feedback of the instrument itself is not sensed. Furthermore, in this second category of devices, the ability of the instrument to be "docked" is dependent upon direct vision by the user, and in most cases is a deterrent to the overall selection and continuum of reality in the simulation. The virtual medical simulators currently seen, then, fail to provide the surgeon in a virtual simulation with an appropriately broad selection of implements that give tactile feedback in themselves and transmit their specific actions to the simulation. In addition, current virtual medical simulators fail to make the computer model aware of the type of instrument selected, and of the person or persons selecting it, so that this information could be used in logic that transcends the anatomy and extends to the roles and physical positions of the users themselves.
In fully immersed virtual reality, some methodology is used to allow the user almost complete departure from the real world surrounding him or her. This is often done by means of a mechanism that shows the visuals only to the single user, with small screens placed near the eyes within goggles or a head-mounted display. In certain instances, the visuals are projected upon the user's retina. However it is accomplished, in immersed virtual reality the user has the impression of being surrounded by a place completely different from the actual surroundings. Head-mounted immersion may, moreover, become an accepted or even standard means of remote or semi-remote surgical viewing, because with the head-mounted display the surgeon will--on call--be able to see many other kinds of information, such as X-rays, without looking away from his surgery.
In fully immersed virtual reality simulations which involve haptic displays--more specifically, tactile force feedback in medical simulations--the most advanced existing mechanisms both read position on a sensor, which translates to a visual location in the virtual environment, and feed back a certain kind and amount of force to that interface device. As with the partially immersive situations in which the user watches a CRT rather than being surrounded by the picture in a 360-degree environment, these haptic feedback systems will either (1) require that the interfacing instrument be locked into the reader/generator mechanism, to be a part of that mechanism and not interchangeable; (2) offer some method of reading various interchangeable interface instruments and of imparting feedback to that changeable instrument in a fixed armature; or (3) place the instrument (or one's finger) in a receptacle on a multi-positional arm.
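The read-position/feed-back-force cycle described above can be illustrated with a minimal sketch. The spring ("penalty") model of virtual tissue, the function names, and the stiffness value below are assumptions for illustration only, not the mechanism of any particular prior-art device.

```python
# Minimal sketch of one haptic cycle: read the interface position,
# compute penetration into virtual tissue, and return the resisting
# force to command to the device. Stiffness value is an assumption.

def tissue_force(tip_depth_mm: float, stiffness_n_per_mm: float = 0.5) -> float:
    """Return the resisting force (N) for a tool tip pressed a given
    depth into virtual tissue; zero when the tip is above the surface."""
    if tip_depth_mm <= 0.0:                      # tip has not penetrated
        return 0.0
    return stiffness_n_per_mm * tip_depth_mm     # Hooke's-law penalty force

def haptic_step(sensor_position_mm: float, tissue_surface_mm: float) -> float:
    """One cycle of the haptic loop: position in, commanded force out."""
    depth = sensor_position_mm - tissue_surface_mm
    return tissue_force(depth)
```

In a real device this cycle would run at a high, fixed rate so that the commanded force tracks the user's hand without perceptible lag.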
The problem of haptic simulation within a total immersion simulation is (1) that the user cannot see to insert the actual interfacing instrument, so the instrument currently must be fixed, or at least pre-inserted, in the haptic/reader system, or (2) that though users may look down to pick up instruments, they cannot change instruments and insert them into the receptacle except by reverting to direct view, taking themselves out of the fully immersive environment. In totally immersive medical simulations--in those situations that do not admit of remote procedures that can be semi-immersive--there is currently no way of connecting a free-floating haptic arm with a selectable instrument. Thus there continues to be a need for the haptic receptacle arm to "home" on selected instruments and seamlessly effect their "docking."
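The "homing" behavior called for above can be sketched as a simple pursuit loop: the receptacle arm repeatedly steps toward the tracked position of the free-floating instrument until it is within docking range. The step size, threshold, and function names are illustrative assumptions, not a description of any existing mechanism.

```python
import math

# Hedged sketch of homing: the haptic receptacle arm steps toward a
# tracked instrument until it reaches docking distance. Positions are
# (x, y, z) tuples in mm; step size and threshold are assumptions.

DOCK_THRESHOLD_MM = 1.0

def home_arm(arm, instrument, step_mm=5.0, max_steps=1000):
    """Move the arm toward the instrument position; return the final
    arm position and whether docking distance was reached."""
    for _ in range(max_steps):
        dx, dy, dz = (i - a for i, a in zip(instrument, arm))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if dist <= DOCK_THRESHOLD_MM:
            return arm, True                   # close enough: docked
        scale = min(step_mm, dist) / dist      # cap the step; no overshoot
        arm = tuple(a + d * scale for a, d in zip(arm, (dx, dy, dz)))
    return arm, False                          # gave up without docking
```

In practice the instrument position would be resampled each step from the tracker, so the arm intercepts a moving target rather than a fixed point.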
Current virtual reality medical simulators using haptic devices have two qualities: position sensing and force feedback generation. However laudable those simulators may be, they lack (1) the capability for an identified user easily to select between and use two or more instruments requiring complex hand and finger movement, and (2) the capability of tracking, homing on, intercepting, and docking with such randomly selected instruments within the virtual environment.