Electronic games have long been played on PCs and on dedicated game consoles, including hand-held portable consoles, e.g., GameBoy©, Nintendo DS©, PlayStation©. User input mechanisms to control games have evolved from keyboards, mice, joysticks, and track/touch pads to touch screens, and more recently to three-dimensional natural interfaces. Natural interfaces can track body parts of the game player using three-dimensional imaging. Software then discerns the player's desired game control actions from movements of the player's arms, legs, torso, etc. The Kinect© console from Microsoft© uses such a natural interface, and game play is modified, substantially in real time, responsive to perceived movements of the game user's body. (The terms “user” and “player” are used interchangeably herein.)
Challenges associated with inputting three-dimensional commands to a two-dimensional application, typically a video game, will be described generally with respect to FIGS. 1A-1C. These figures depict aspects of a very popular two-dimensional video game called Angry Birds, produced by Rovio©; see http://en.wikipedia.org/wiki/Angry_Birds and http://www.rovio.com/en/our-work/games/view/1/angry-birds. The popular mode of this game application falls into the third class of applications, as defined above. In its simplest form, the user or player aims and fires a virtual slingshot depicted on the game display. The goal is to accurately lob a projectile (an angry bird) from the slingshot toward the target, a rickety structure providing shelter to a herd of pigs. Points are awarded in the game for destroying the shelter and dispatching the pigs by accurately controlling the slingshot and its projectile. It is understood that in this game application, player input and display output are each two-dimensional.
FIG. 1A depicts an aiming scene in the Angry Birds game as it might appear to a user (or player) on display 10 of a monitor associated with the game-playing device. In this example the device is perhaps a hand-held smart phone that executes the Angry Birds game application and includes the display as an integral part of the device. A slingshot 20 with elastic band 30 holds a projectile 40 that is to be launched upon a desired trajectory 50′ (see FIG. 1B) by the game player to hit the target (the pig shelter 60 and pigs 70, shown in FIG. 1B). The game shows the last used trajectory 50 to aid the player in making adjustments to arrive at a new trajectory 50′, and also shows the ground terrain 80. As suggested by the (x,y) coordinate system shown, virtual projectile 40 can be moved left or right (x-direction) and/or up or down (y-direction), but cannot move into or out of the plane of display 10, which is why no z-axis is depicted. Understandably, launching projectile 40 requires the user to cause slingshot 20 to direct a vector force accurately toward the desired target upon a successful trajectory 50′. A true vector force is required in that the quantum of force imparted to projectile 40 must be sufficient to reach the target, and the aiming of the projectile must be accurate enough to hit the target.
When playing Angry Birds on a device with a touch screen, the player can touch the image of slingshot 20 and “pull back” projectile 40 and elastic 30 in a desired (xw,yw) direction to propel projectile 40 toward target 60, 70. In FIG. 1A and FIG. 1B, it is assumed that game coordinates (x,y) and real-world or screen coordinates (xw,yw) are superimposed. After aiming, the player releases projectile 40 by taking his or her finger off the projectile image. Assume that the last trajectory 50 was too shallow and that a higher trajectory 50′ is now created by the player. FIG. 1B depicts the target aspect of the game and shows that a higher trajectory 50′ was successfully achieved, with the desired result that projectile 40 has hit a portion of structure 60. At least a portion of structure 60 will collapse, injuring at least one pig 70 sheltering within the structure, and appropriate game points will be awarded. Note that the display in FIGS. 1A and 1B has no notion of depth, as might be seen from the perspective of the game player. As such, projectile 40 cannot be aimed at a target 60′ “in front of” structure 60, or at a target 60″ “behind” structure 60, because there is simply no sense of depth z in this class three two-dimensional game application.
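The pull-back mechanic described above amounts to mapping a two-dimensional drag vector (from the slingshot anchor to the touch-release point) to a launch velocity, then simulating ballistic flight entirely within the (x,y) plane of display 10. The following is a minimal sketch of that idea; the function names, constants, and units are hypothetical illustrations, not part of the actual game's code:

```python
GRAVITY = 9.8        # virtual gravity, arbitrary game units (assumed)
LAUNCH_SCALE = 3.0   # hypothetical scale factor: drag distance -> launch speed

def launch_velocity(anchor, release):
    """Map a 2-D pull-back drag to a launch velocity vector.

    The projectile launches opposite the drag direction, with speed
    proportional to how far the elastic band was pulled back.
    """
    dx = anchor[0] - release[0]
    dy = anchor[1] - release[1]
    return (LAUNCH_SCALE * dx, LAUNCH_SCALE * dy)

def trajectory(v0, steps=200, dt=0.05):
    """Sample ballistic (x, y) positions in the display plane.

    Note there is no z term anywhere: motion is confined to the two
    screen axes, which is exactly the limitation discussed above.
    """
    vx, vy = v0
    x = y = 0.0
    points = []
    for _ in range(steps):
        points.append((x, y))
        x += vx * dt
        y += vy * dt - 0.5 * GRAVITY * dt * dt
        vy -= GRAVITY * dt
        if y < 0:  # projectile has reached ground terrain 80
            break
    return points
```

Dragging down and to the left of the anchor thus yields an up-and-to-the-right launch, and a longer drag yields a flatter or farther trajectory; no drag direction can ever produce motion into or out of the screen.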
Some game device manufacturers try to promote a sense of three-dimensionality in the display itself. Some game devices might produce a three-dimensional display requiring the player to wear stereoscopic glasses, or perhaps the display will be auto-stereoscopic, which would not require eyeglasses to be worn by the player. The Nintendo© 3DS© mobile game device uses an auto-stereoscopic display to promote a three-dimensional experience, although the user interface still requires buttons and physical touching.
What is needed is a method and system whereby a two-dimensional game application, be it class one, class two, or class three, may be modified if needed and played by responding to a true three-dimensional real-world interface and corresponding timing attributes, including natural interface body and limb gestures made by the game player, without need to physically contact the game-rendering device or display. Further, such method and system should enable the two-dimensional game application to be converted to an integrated three-dimensional input and output framework. Such a result could enable the game video display to present a sense of depth along a game display z-axis, as viewed from the perspective of the game player, and should enable the player to alter or define line-of-sight control in three-dimensional space. The game application could be played on a small, hand-held or portable device, without need for physical contact by the game player.
The present invention provides such systems and methods.