Virtual creatures are now widely used in information processing as characters in 2D or 3D games, as avatars of players (users), or as user interfaces of computers. The movement of such a virtual creature is generated by an information processing apparatus, so that the virtual creature freely moves about on a screen or emits sounds.
Also, technologies for superimposing a virtual image on a real image, such as augmented reality (AR) and mixed reality (MR), are becoming widespread. The virtual image to be superimposed on a real image may be an image of a virtual creature, such as a game character, an avatar of a player (user), a virtual pet, or a user interface of a computer, or an image of a virtual mobile object such as an automobile.
For example, there is a suggested entertainment device that displays a virtual pet combined with an image of the real environment. The virtual pet can talk with a user and walk around on a virtual screen (see Patent Document 1, for example). In this entertainment device, a virtual pet generated by a system unit is rendered in a real image captured with a video camera, and is displayed on a display/sound output apparatus such as a monitor or a television set equipped with a display and a speaker. However, the virtual pet moves only within the screen, and the sound or the like expressing an action or reaction of the virtual pet is output only at the installation site of the display/sound output apparatus. Therefore, there is a limit to realistic expression of interaction between the virtual pet and the real space.
There also is a suggested object display apparatus. In this object display apparatus, the depth position of a three-dimensional image displayed on a DFD (Depth Fusion Display) type three-dimensional display apparatus is adjusted to a depth position through which a real object passes, so as to give the viewer the impression that a virtual object exists three-dimensionally in the same space as the viewer, and so that the virtual object and the viewer can interact with each other (see Patent Document 2, for example). However, this object display apparatus can provide only visual interaction effects. Furthermore, such interaction effects are valid only at the installation site of the object display apparatus displaying the virtual object.
There also is a suggested image processing method for superimposing a video image of a virtual space including a virtual object on the real space, and presenting the video image through a head-mounted display (see Patent Document 3, for example). According to this image processing method, a speaker outputs sound effects such as the sound of a virtual object moving around in the virtual space. However, such a sound effect expresses interaction of the virtual object within the virtual space, and does not present interaction of the virtual object with the real space.
There also is a suggested information terminal device that displays an image of a virtual pet so that the virtual pet appears as a virtual image on the human skin in the actual field of view (see Patent Document 4, for example). When the virtual pet moves along with the movement of the skin of a hand or the like, the user can get the impression that the virtual pet is reacting to the movement of the skin. That is, with this information terminal device, a user in the real space can act on the virtual pet, but the virtual pet cannot perform any action in the real space. In short, interaction between the virtual pet and the real space cannot be presented.