This invention relates to a system for providing control inputs based upon determining the position and orientation of a part of the body of a user, such as the head, in three-dimensional (3D) space, and using the inputs to control a pointer or cursor on a 3D display, an instrument in 3D space, the perspective of a display (camera angle or view) of a 3D image or scene, etc. In particular, the invention is directed to a low-cost, accurate, and reliable system that can be used as a hands-free input device to play interactive 3D games on a computer.
With the proliferation of 3D graphics hardware for desktop computers and the increasing sophistication and power of 3D software, especially in the character combat genre of games, superior methods of character control and manipulation are desired and required as compared to the current joystick, keyboard, and mouse. The current peripheral input devices are inadequate for several reasons. First, a user must learn and remember a separate set of commands for each game or software program. For example, the "Up" arrow key may move the user forward in one game whereas the "F" key may do the same in another. This is cumbersome for the user. Second, there is no logical correlation between pressing a key or button and directing movement, camera angle, or pointing on a display. It is more natural for the user, and would enhance psychological immersion in playing a game, if the user could control movement in the game by physically moving a part of their body in a similar direction in real time.
Ideally, a user would prefer to control a game character to navigate in 3D space, or to change what the character sees, in as similar a manner as possible to controlling the movement of their own head or other part of their body. Also, it would be of great benefit if the navigational and perspective-changing method could free up the user's hands for other, simultaneous input. For example, if the user could move their character without the use of their hands, then their hands could be used for combat-type input on a keyboard, game controller, or other input device.
Prior 3D head input systems suitable for computer cursoring or game control have relied on detection of signals transmitted from a head-mounted unit to a detection unit in order to calculate the position and/or orientation of the head. For example, in U.S. Pat. No. 5,367,614 to Bisey, ultrasonic sensors arranged in a triangular configuration around a computer monitor measure the arrival times of a pulse signal transmitted from a head unit in order to compute relative distances of the head unit along 3 axes and use the computed data to rotate a 3D image on the screen in the same way the user would move their head to view an object in nature. In U.S. Pat. No. 5,574,836 to Broemmelsiek, detection of the head shift of a user by an ultrasonic or infrared detector is used to generate a corresponding parallax or perspective shift in the display of a 3D object. In U.S. Pat. No. 5,926,264 to Beale, a 4-division photosensor array detects the relative intensities of a light beam from a source LED reflected by a head-mounted reflector in order to derive the orientation of the user's head and correspondingly control a pointer on a computer screen.
These prior 3D head input systems have typically relied on detection of relative light beam intensities or arrival times of transmitted signals from a head unit to a receiver unit in order to calculate, by standard triangulation algorithms, the relative position coordinates and/or relative angular orientation of the user's head. Such "one-way" transmitter/receiver detection systems have the limitation that they can be used to compute the head unit's position or its orientation, but not both at the same time with accurate results. Computational tradeoffs are made to obtain primarily position information or primarily orientation information, but such systems cannot obtain accurate information in "six degrees" of freedom, i.e., on linear axes (X, Y, Z) and rotational axes (yaw, pitch, roll), at the same time. More complex systems have been devised for accurate 6-degrees-of-freedom (so-called 6DoF) calculations, but these have required expensive and complex equipment for optical scanning, from multiple camera angles, of a three-dimensional target array worn or held by the user, e.g., as described in U.S. Pat. No. 5,889,505 to Toyama, U.S. Pat. No. 5,884,239 to Romanik, U.S. Pat. No. 5,461,478 to Sakakibara, U.S. Pat. No. 5,227,985 to DeMentheon, U.S. Pat. No. 5,187,540 to Morrison, and U.S. Pat. No. 4,649,504 to Krouglicof et al.
It has become important with the increasing sophistication of 3D games and other environments (such as computer-aided design (CAD), simulation, or virtual reality environments) for the user to be able to change movement or position within the environment (X, Y, Z coordinate location within the 3D world) and also to control and change the field of view within the environment (often referred to as "camera angle"). Also of importance is the ability of the user to perform such navigational or change-of-camera-angle functions while keeping their hands free for manipulation tasks within the environment. For example, in CAD design, a design engineer may need to move the view around a displayed object while at the same time using a mouse or commands from a keyboard to add, delete, or modify components within the object. As computing power continues to increase while costs decrease for increasingly sophisticated 3D applications programs in an expanding range of environments, an accurate, low-cost 3D navigation system with 6DoF control suitable for "hands-free" computer input is increasingly needed.
Accordingly, it is a principal object of the present invention to provide a system for 3D navigation by a user on linear and rotational axes on a display that is reliable, simple to manufacture, and low in cost. It is a particular object of the invention that the 3D navigation system can be operated by a simple unit worn on the user's head in order to leave the user's hands free for simultaneous input in interactive games or manipulation of other controls.
In an apparatus embodying the present invention, a 3D navigation system comprises a complementary pair of emitter/detector units, one of which is worn on a part of a user's body and the other of which is mounted in a stationary position on or adjacent to a display screen facing the user. Each emitter/detector unit of the complementary pair has an infrared emitter element for emitting a cone-shaped beam along the unit's normal axis in a direction toward the other unit, and an array of photodetectors arranged around the infrared emitter element with their detection surfaces facing outwardly from the unit's normal axis, so as to receive the infrared cone-shaped beam from the other unit and provide output signals representing the light intensities of the received beam on the respective detection surfaces. A processing unit receives the output signals from the photodetector array of the user-worn unit and the output signals from the photodetector array of the stationary unit, and uses them together as combined inputs to calculate position information and angular orientation information representing the position and orientation of the user-worn unit on linear and rotational axes relative to the stationary unit. The invention also encompasses the related method of using complementary, user-worn and stationary emitter/detector units for 3D navigation control.
In a preferred embodiment, the 3D navigation apparatus consists of a user-worn headset unit and a monitor-mounted unit coupled to a processor module. Each unit has an infrared emitter element aligned on a center (normal) axis and a ring of photodetector cells angled outwardly around the center axis of the unit. The processor module includes a processor motherboard and IC circuitry, links to the complementary emitter/detector units, and a cable or wireless port to connect the device outputs to a computer COM port, parallel port, or other communication input channel, such as a USB connector. The monitor unit can be integrated with the processor module, or can be a separate unit mounted on the frame of a display monitor.
The photodetector cells are preferably 4 silicon photovoltaic (PV) cells or photodiodes spaced symmetrically around a central infrared emitter element, and angled outwardly at 30 to 45 degrees to the central axis of the unit. The detector surfaces of the PV cells or photodiodes preferably have filter elements superimposed on them to filter out ambient background light from the infrared light beam. The infrared emitter element preferably has a peak spectral output in the 900 nanometer (nm) range corresponding to the peak sensitivity range of the PV cells. Both the user-worn and stationary emitter/detector units have an identical construction and components in order to lower the cost of manufacture in quantity.
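Because the 4 cells are tilted outwardly from the central axis, a beam arriving off-axis illuminates opposing cells unequally, and the imbalance indicates the beam's angle of incidence. The sketch below illustrates one common way such a pair of readings might be mapped to an approximate off-axis angle using a difference-over-sum ratio; the linear scaling by the mounting tilt is an illustrative assumption for this sketch, not a computation specified in the disclosure.

```python
def off_axis_angle(i_left: float, i_right: float, tilt_deg: float = 37.5) -> float:
    """Estimate a beam's off-axis angle (degrees) along one axis from an
    opposing pair of photodetector readings.

    i_left, i_right: received light intensities on the two opposing cells.
    tilt_deg: outward mounting angle of the cells (30-45 degrees in the
    preferred embodiment), used here as a simple linear scale factor --
    an illustrative assumption, not part of the specification.
    """
    total = i_left + i_right
    if total == 0:
        raise ValueError("no light received on this axis")
    # Difference-over-sum normalization makes the ratio independent of
    # overall beam brightness (and hence of distance).
    ratio = (i_right - i_left) / total   # ranges from -1.0 to +1.0
    return ratio * tilt_deg              # linearized mapping to degrees

# A beam centered on the unit's normal axis illuminates both cells equally:
print(off_axis_angle(0.8, 0.8))  # 0.0
```

A second opposing pair (top/bottom) handles the orthogonal axis in the same way, which is why a symmetric ring of 4 cells suffices for two angular components.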
The outputs of the PV cells or photodiodes of each unit are converted to voltage signals through the use of on-board amplifier circuitry. The PV voltage signals are transmitted to the processor module and converted to binary data using an analog-to-digital integrated circuit (IC) device. The output binary data are transmitted to the associated computer, via the COM port, USB, or other communications channel, to device driver software running on the computer. The device driver software uses the input values to compute the X,Y,Z coordinates and angular orientation of the user""s headset unit relative to the monitor unit and uses the resulting position and orientation angle values to control 3D navigation functions of the associated application program (game, CAD program, etc.) running on the computer.
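A minimal sketch of how such a device driver might combine the two units' digitized cell readings follows. It assumes the stationary (monitor) unit's intensity imbalances give the bearing of the headset, the headset unit's own imbalances give its orientation, and total received intensity gives a rough range via an inverse-square falloff. All function names, constants, and the inverse-square model are hypothetical illustrations; roll is omitted for simplicity.

```python
import math

def estimate_pose(monitor_cells, headset_cells, i0=4.0, tilt=37.5):
    """Hypothetical pose estimate from two 4-cell photodetector arrays.

    monitor_cells / headset_cells: (top, bottom, left, right) intensities.
    i0: total intensity received at unit distance (assumed inverse-square
    falloff). i0 and tilt are illustrative constants, not from the patent.
    """
    mt, mb, ml, mr = monitor_cells
    ht, hb, hl, hr = headset_cells

    m_total = mt + mb + ml + mr
    # Range from total intensity, assuming inverse-square falloff.
    z = math.sqrt(i0 / m_total)
    # Bearing of the headset as seen from the monitor unit -> X, Y offsets.
    az = (mr - ml) / m_total * tilt      # azimuth, degrees
    el = (mt - mb) / m_total * tilt      # elevation, degrees
    x = z * math.tan(math.radians(az))
    y = z * math.tan(math.radians(el))

    h_total = ht + hb + hl + hr
    # Headset orientation (yaw, pitch) from its own cell imbalance.
    yaw = (hr - hl) / h_total * tilt
    pitch = (ht - hb) / h_total * tilt
    return {"x": x, "y": y, "z": z, "yaw": yaw, "pitch": pitch}

# A centered headset facing the monitor illuminates all cells equally:
pose = estimate_pose((0.25, 0.25, 0.25, 0.25), (0.25, 0.25, 0.25, 0.25))
```

The essential point the sketch captures is why two complementary arrays are needed: the stationary array alone constrains where the headset is, while the headset's array constrains which way it is pointing, and only the combined inputs yield both.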
The 3D navigation system of the invention can also be used for hands-free control in a wide range of other applications and environments besides games. It may be used to control flight motions and display views in a simulator or cockpit of an aircraft, or the command center of a naval vessel, spaceship, land vehicle, etc. Leaving the user's hands free allows for simultaneous control of other functions, such as manipulation of controls on a control panel or the firing of weapons. For disabled persons, the hands-free 3D navigation system can be used to control input to a computer, the movements of a motorized wheelchair, prosthetic device, and other assistance devices for the disabled. The 3D navigation system may also be used to control various functions in an industrial environment, for example, guiding the machine-assisted movement and orientation of a workpiece while leaving the operator's hands free to manipulate process equipment applied to the workpiece (welding, painting, laminating, etc.). The system can similarly be used in CAD design, architectural, medical graphics, virtual reality, and other commercial applications. For example, use of the headset with a virtual reality display can allow a 3D "tour" to be taken through construction sites, buildings, medical diagnostic images, and simulated or artificial environments.