1. Field of the Invention
The present invention generally relates to computer software. More specifically, the present invention relates to a rendering application configured for steering an animated character, in real-time, toward a goal location where the character's motion is blended from a set of motion clips that form a data-driven motion space.
2. Description of the Related Art
The term rendering tool refers to a broad variety of computer-based tools used by architects, engineers, animators, video game designers, and other graphics and design professionals. Rendering is the process of generating an image from a model by means of computer programs. A rendering application may be used to render three-dimensional (3D) characters. These 3D characters may be animated by the rendering application.
A common feature of rendering applications is the ability to generate frames of an animation sequence in real-time. For example, the motion of a video-game character moving from one location in a graphics scene to another may be generated on the fly based on a game player's interaction with the game. To create a motion sequence, users of a rendering application typically begin with one or more motion clips from which the motion sequence is created. Often, motion clips are created using a motion capture system. Typically, a motion capture actor wears markers near each joint, and the motion is identified by the positions of, and angles between, those markers. The markers are then tracked with sub-millimeter accuracy. Motion capture computer software records the positions, angles, velocities, accelerations, and impulses, providing an accurate digital representation of the motion.
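The recorded data described above can be organized as a clip of per-frame samples. The following sketch is purely illustrative (the class and field names are assumptions, not part of any described system) and shows one minimal way such captured data might be structured:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class MarkerSample:
    """One tracked marker at one instant (fields are illustrative)."""
    position: Tuple[float, float, float]  # (x, y, z), e.g. in meters
    velocity: Tuple[float, float, float]  # finite-difference estimate
    # angles, accelerations, and impulses could be stored similarly


@dataclass
class MotionClip:
    """A recorded motion: one list of marker samples per frame."""
    fps: float
    frames: List[List[MarkerSample]] = field(default_factory=list)

    def duration(self) -> float:
        """Clip length in seconds, derived from frame count and rate."""
        return len(self.frames) / self.fps
```

A set of such clips, sampled consistently, is the raw material from which a data-driven motion space can be built.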
Similarly, many game products include goal-driven characters that are not controlled directly by the player of the game. In the video game industry, these are called “non-player characters” or “NPCs”. The most popular game genres (sports, role-playing, strategy, and first-person shooters) make heavy use of NPCs to provide key action elements in the game.
Creating realistic renderings of animated characters in real-time has proven to be challenging. Prior art techniques for controlling the motion of an NPC rely on a network of discrete motion clips with connected transitions that linearly blend from one clip into another. However, this approach often results in motion transitions that suffer from “motion artifacts,” which cause the appearance of sliding, jumping, skipping, or other changes that look unnatural.
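The prior-art linear blend described above can be illustrated with a short sketch. This is a simplified, hypothetical example in which a pose is reduced to a flat list of joint angles; the function names are assumptions for illustration only. It also suggests why artifacts arise: interpolating joint values directly preserves neither foot contacts nor timing.

```python
def linear_blend(pose_a, pose_b, t):
    """Linearly interpolate two poses (lists of joint angles); t in [0, 1]."""
    return [(1.0 - t) * a + t * b for a, b in zip(pose_a, pose_b)]


def transition(clip_a, clip_b, blend_frames):
    """Cross-fade the tail of clip_a into the head of clip_b.

    Each clip is a list of poses. Over the blend window, the weight on
    clip_b ramps linearly from near 0 to 1, as in a discrete clip network.
    """
    out = list(clip_a[:-blend_frames])
    for i in range(blend_frames):
        t = (i + 1) / blend_frames
        out.append(linear_blend(clip_a[len(clip_a) - blend_frames + i],
                                clip_b[i], t))
    out.extend(clip_b[blend_frames:])
    return out
```

Because the blend simply averages joint values frame by frame, a planted foot in one clip and a swinging foot in the other average into a foot that slides along the ground, producing exactly the artifacts noted above.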
Creating game character motion that both engages game players and appears realistic has proven to be difficult. For a game player to perceive character motion as “alive” or “correct,” the motion must be free of the mechanical artifacts that jar a player out of context. This problem of implausible motion is particularly apparent in animation sequences that depict periodic motions, such as walking, running, or swimming.
Accordingly, there remains a need in the art for a technique for generating realistic animation sequences using real-time, goal space steering for data-driven character animation.