The present invention relates generally to a method and apparatus for interfacing locomotive 3D movements of a user to a reference in a virtual or remote environment, and more particularly to such a method and apparatus where a gestural pace made by a user is sensed and used to move the reference in accordance with the extent, direction, and timing of the gestural pace.
A virtual environment is an array of sensory cues generated in response to a user's actions that gives the user the impression of dealing directly with a three dimensional model. Typically either a head mounted display or a surround screen display is used to present a dynamic perspective image of the visual environment as the user moves within it. The virtual environment may be synthetically generated within a computer or may involve sensory input from a remote physical location (as in tele-operations). Virtual locomotion is movement over long distances in a virtual environment which is controlled by a user remaining within a relatively small physical space.
Virtual locomotion can be used for a variety of purposes, such as: training or rehearsal in executing skills, tasks, strategies and navigation that involve moving through an environment on foot; planning activities that involve moving through a target environment; evaluating the ergonomics or aesthetics of structures designed for human habitation, or of devices intended for mobile operation; piloting remote surface vehicles; communications between people at different locations when they want to relate to each other in a way that involves locomotion; and entertainment experiences that involve moving through a virtual environment. Particular applications would include training people to perform hazardous tasks without risking bodily harm, or training soldiers in combat simulators where the soldier interacts directly with the surrounding environment, which includes other members of a team.
Head based steering is widely used in virtual environment systems. It is economical because the same position tracker used to determine the user's field of view is also used to control the direction of motion. Head based steering also encourages the user to look where they are going, and the hands are free for manipulation. However, looking and moving are no longer independent, so that the user cannot turn their head to look to the side while moving without altering their path. This makes it difficult to move to a desired position in the virtual environment.
Hand based steering techniques are also widely used and determine direction either from where the arm is pointed, from where a hand-grip is pointed, or from where a finger is pointed. Hand based steering thus frees the head for looking and allows the user to move sideways relative to the head or body. However, the control mechanism interferes with manipulation, so that the hand cannot be used for other, more normal tasks. In addition, the user must remain aware of the significance of where the hand is pointed at all times.
Torso based steering frees the head for looking and the hands for manipulation, but it does not support sidestepping. While a user will typically move in the direction that the front of the torso is pointing, sometimes this is not the case. For example, a soldier aiming a rifle across his chest may prefer to advance in the direction that he is aiming.
Another steering technique is lean based steering, which has three approaches: tilting of the upper torso, shifting of weight relative to the feet, and shifting of weight relative to a platform. All three approaches provide hands-free operation and can support sidestepping. However, the tilting of the torso approach precludes the user tilting for other purposes. The shifting of weight approach, where weight is shifted relative to the feet, is of limited use because a user cannot pace to turn.
The shifting of weight relative to a platform approach is controlled by moving the body of the user locally, relative to a central neutral position. However, when using a head-mounted display, the user easily loses track of where he stands with respect to the neutral position, although the direction and rate of optical flow provide one indication of where the user is situated. A set of elastic straps attached to a ring around the user's waist gives haptic feedback, pushing the user back towards the neutral position. However, because the directional coordinate frame is fixed relative to an external point in space, this is an unnatural condition that makes turning of the body and controlling the direction of motion even more independent than they are in natural locomotion. For example, the user may choose to move in one direction and then turn to face in another, making it as easy to move backwards as forwards. This approach is also incompatible with physical locomotion because velocity is controlled by the relative position of the body.
Speed of movement in a virtual environment can be controlled by using finger pressure for hand based systems, by the degree of leaning for lean based systems, by the rate at which a user paces in place, or by the degree of leg movement when the user paces in place. Often a pair of binary switches attached to a hand control are used to invoke either forward or backward virtual motion. This widely used technique is easy and inexpensive to implement. The use of such hand controls is advantageous as they work independently of head, torso and leg movements, and are thus compatible with a wide range of physical motions. However, use of hand controls for speed interferes with use of the fingers for manipulative tasks which are becoming more desired and common, especially in combat systems where the user needs to hold and aim a weapon. Another disadvantage is that when head-mounted displays are used, the user cannot see his hands or how the fingers touch the buttons, limiting the number of buttons the user can deal with.
Another speed control system was based on detecting the gesture of walking in place. A six degree of freedom magnetic sensor was attached to a head mounted display in order to track the user's head motion so that a computer could recognize walking in place versus other activity such as turning one's head or bending at the waist. Head based steering was also used so that the head tracker fully controlled virtual locomotion. While useful, this system did not allow a user to walk in one direction and look in another direction.
A number of mechanical systems have also been disclosed. One such system uses the sliding motion of the feet to indicate walking. In this system, the user wears sandals with low friction film on the middle of the sole and a rubber brake pad at the toe. The user thus glides on a low friction surface by pushing their waist against a hoop that surrounds the user and sliding their feet. A position sensor attached to each ankle and contact sensors on the bottom of each foot allow the system to recognize the length and direction of each pace and hence to specify movement in the virtual environment. However, the placement of a hoop at waist level does not allow a user to hold an object such as a rifle by their side.
Another mechanical system utilizes an omni-directional treadmill and allows the user to walk in any direction. The treadmill consists of a pair of conveyor belts nested one inside the other, with each belt mounted horizontally and the two belts perpendicular to each other. The outer belt has rollers so that it can transmit the motion produced by the inner belt, so that motion in any horizontal direction can be made. An associated control is used to keep the user centered in the middle of the platform. This system allows a user to walk in a straight line in any direction and to accelerate in that direction. However, turning while accelerating can lead to a misalignment between the user's direction of translation and the centering motion of the controller, causing a loss of balance. Even turning in place can be difficult because the controller registers the motion and compensates for it by moving the surface under the user's feet. Another problem arises when a user decelerates very quickly, coming to a full stop in a single pace, or redirects their motion while walking fast. The user normally perceives linear and angular acceleration using the vestibular system, which leads to perceptual conflicts when using this system and thus makes its use difficult.
A foot based locomotion system has been developed by the Army Research Institute in Orlando, Fla. Forward virtual motion is triggered by the lifting of the foot above a vertical threshold. Steering is effected by the horizontal orientation of a magnetic tracker worn on the back of the user between the shoulder blades. Thus, the user is able to effect virtual displacement while turning in place. To move backwards, one foot is placed or slid behind the other a predefined distance, with both feet flat on the floor. However, this system is disadvantageous as it does not allow use of a normal pace or gait, or of a side pace.
It will thus be appreciated that the ability to realistically simulate walking around in a virtual environment is a key element missing from the prior art systems presently available.
In accordance with the present invention, a method and apparatus for interfacing locomotive three dimensional (hereafter 3D) movements of a user to a reference in a virtual or remote environment are provided. Initially, a 3D motion of a body portion of a user is sensed as the user takes a gestural pace. This sensing includes the determining of a beginning and an end of the gestural pace taken by the user, the determining of a 3D direction characteristic of the body portion motion during the gestural pace, and the determining of a 3D extent characteristic of the body portion motion during the gestural pace. Next, a 3D direction and extent of motion in the virtual environment corresponding to the determined direction and extent characteristics of the gestural pace is computed. Finally, the computed 3D motion is used to move the reference in the environment.
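The three stages above (sensing a gestural pace, computing a corresponding 3D motion, and moving the reference) can be outlined in code. The following Python sketch is purely illustrative and forms no part of the disclosure; the names `GesturalPace`, `compute_virtual_motion`, and `move_reference`, and the gain value, are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical record describing one sensed gestural pace.
@dataclass
class GesturalPace:
    direction: tuple   # unit vector of the sensed body-portion motion (x, y, z)
    extent: float      # measured displacement of the gestural pace, in meters

def compute_virtual_motion(pace, gain=3.0):
    """Map the sensed pace to a 3D displacement in the virtual environment:
    same direction as the sensed motion, extent scaled by a gain factor
    (the extent of virtual motion is a multiple of the sensed extent)."""
    return tuple(gain * pace.extent * c for c in pace.direction)

def move_reference(position, motion):
    """Advance the reference (e.g. the user's viewpoint) by the computed
    virtual displacement."""
    return tuple(p + m for p, m in zip(position, motion))

# A 0.2 m gestural pace straight ahead moves the reference 0.6 m at gain 3.
pace = GesturalPace(direction=(1.0, 0.0, 0.0), extent=0.2)
new_pos = move_reference((0.0, 0.0, 0.0), compute_virtual_motion(pace))
```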
Preferably, the body portion sensed is a part of a leg of the user. More preferably, the body portion sensed is a knee of the user. Then, the determining of a 3D direction characteristic includes a measuring of an initial horizontal motion of the knee to a maximum displacement thereof, and a measuring of a return horizontal motion of the knee from the maximum displacement.
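The knee measurements just described can be illustrated with a simple helper that extracts, from horizontal knee positions sampled over one pace, the outward direction to the point of maximum displacement and the extent of the return motion. This is a hypothetical sketch; the function name, sampling scheme, and units are assumptions.

```python
import math

def knee_pace_characteristics(samples):
    """Given horizontal knee positions (x, y) sampled over one gestural pace,
    return (direction, extent_out, extent_back): the unit vector toward the
    point of maximum displacement, the outward extent, and the extent of the
    return motion back from that point."""
    x0, y0 = samples[0]
    # Point of maximum horizontal displacement from the starting position.
    far = max(samples, key=lambda p: math.hypot(p[0] - x0, p[1] - y0))
    extent_out = math.hypot(far[0] - x0, far[1] - y0)
    direction = ((far[0] - x0) / extent_out, (far[1] - y0) / extent_out)
    # Return motion: how far the knee travels back from the far point.
    xe, ye = samples[-1]
    extent_back = math.hypot(xe - far[0], ye - far[1])
    return direction, extent_out, extent_back

# Knee swings 0.25 m forward, then rocks 0.20 m back toward the body.
d, out, back = knee_pace_characteristics([(0, 0), (0.1, 0), (0.25, 0), (0.05, 0)])
```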
In a preferred embodiment, the determining the beginning and end of a pace includes a step of measuring a force exerted by a foot of the user on a base, such that the gestural pace begins when the measured force first starts to diminish, provided that this decline leads to the force falling below a threshold, and ends when the measured force either reaches a plateau above the force threshold or bears a selected proportion (such as one half) of the weight of the user. The force threshold setting is dependent on the weight of the fully equipped user and the tightness with which the force sensors are held in contact with the user's foot.
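The begin/end rule above can be expressed as a small state machine over a stream of foot-force samples. The following is a simplified, hypothetical sketch of that rule (the plateau test in particular is an assumption about how "reaches a plateau" might be detected); the threshold and sample values are illustrative only.

```python
def detect_pace(forces, threshold, body_weight):
    """Segment one gestural pace from a sequence of foot-force samples.
    The pace begins where the force first starts to diminish, provided the
    decline carries it below `threshold`; it ends when the force recovers
    past the threshold and either stops rising (a plateau) or reaches half
    the user's weight. Returns (begin_index, end_index)."""
    begin = None
    below = False
    for i in range(1, len(forces)):
        if begin is None:
            if forces[i] < forces[i - 1]:
                begin = i - 1            # force first starts to diminish
        elif not below:
            if forces[i] < threshold:
                below = True             # decline did cross the threshold
            elif forces[i] > forces[i - 1]:
                begin = None             # recovered early: not a pace
        else:
            if forces[i] >= threshold:
                plateau = forces[i] <= forces[i - 1]   # stopped rising
                if plateau or forces[i] >= 0.5 * body_weight:
                    return begin, i      # pace ends here
    return (begin if below else None), None

# Foot unloads below the 300 N threshold, then reloads past half weight.
span = detect_pace([800, 800, 700, 400, 100, 50, 100, 500, 800], 300, 800)
```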
Also in a preferred embodiment, the computing step (a) computes the direction of motion in the environment to be equal to the determined direction characteristic, and (b) computes the extent of motion in the environment to be a multiple of the determined extent characteristic. To do this, the determining of the 3D extent characteristic step includes a step of measuring a rocking motion of the knee of the user during the gestural pace.
In the preferred embodiment, the reference in the environment includes a point of view. Then, the method further includes the steps of determining changes to an orientation of a second body portion of the user associated with a point of view of the user, and moving the point of view of the reference in the environment to match the determined changes of the point of view of the user. Preferably, the determining changes to an orientation of a second body portion determines changes to a head of the user.
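Matching the point of view of the reference to the determined changes of the user's head orientation can be illustrated, for the yaw axis only, as follows. The function name and the use of degrees are assumptions for illustration.

```python
def update_view(reference_yaw, prev_head_yaw, head_yaw):
    """Apply the change in the user's head orientation (yaw, in degrees)
    to the point of view of the reference, so the user can look around
    independently of locomotion."""
    return (reference_yaw + (head_yaw - prev_head_yaw)) % 360.0

# Head turns 30 degrees to the right; the viewpoint turns with it.
view = update_view(90.0, 10.0, 40.0)
```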
The method and apparatus of the present invention preferably also includes the sensing of a 3D motion of a body portion of a user as the user takes an actual pace, including the steps of determining a beginning and an end of the actual pace taken by the user, determining a 3D direction of the body portion motion during the actual pace, and determining a 3D extent of the body portion motion during the actual pace. Then the determined direction and extent of the actual pace are used to likewise move the reference in the environment. The step of distinguishing between an actual pace and a gestural pace is preferably made by determining an extent of a return motion of the knee of the user.
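The distinguishing step can be illustrated as a threshold on the knee's return motion: in a gestural pace (pacing in place) the knee swings out and rocks back toward the body, while in an actual pace the body moves with the foot and there is little return motion. The ratio value below is an assumed tuning parameter, not a value from the disclosure.

```python
def classify_pace(extent_out, extent_return, ratio=0.5):
    """Classify a pace as 'gestural' when the knee's return motion is a
    large fraction of its outward motion (the knee rocks back toward the
    body), and as 'actual' otherwise. `ratio` is an assumed parameter."""
    if extent_out <= 0.0:
        return "none"
    return "gestural" if extent_return / extent_out >= ratio else "actual"

# Knee mostly returned -> pacing in place; knee stayed forward -> real step.
g = classify_pace(0.25, 0.20)
a = classify_pace(0.25, 0.02)
```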
It is an object of the present invention to provide a control for maneuvering through virtual environments as naturally as possible.
It is also an object of the present invention to provide a great deal of compatibility with other natural actions by having the user turn in the virtual environment by physically turning their body.
It is a further object of the present invention to allow other sorts of postural movements like bending at the waist or crouching down.
It is still a further object of the present invention to allow virtual locomotion to be intermixed with natural locomotion.
It is an advantage that the equipment needed for the apparatus is relatively compact (compared to other systems) and potentially inexpensive.
Other features, objects and advantages of the present invention are stated in or apparent from detailed descriptions of presently preferred embodiments of the invention found hereinbelow.