In recent years, mobile phones and digital home appliances equipped with high-performance memories and CPUs have become widespread. Further, as broadband Internet service has spread, various tools have become available with which users can easily create applications that implement various animations, or the animations themselves.
In animations created with such tools, it is necessary to keep the movement of the animation consistent with its sound.
As a conventional technology addressing this need, an animation generation device disclosed in, for example, patent literature 1 is known. FIG. 11 is a block diagram showing the animation generation device disclosed in patent literature 1.
The animation generation device shown in FIG. 11 is provided with a user setting section 300, an object attribute acquiring section 304, a sound processing section 305, an animation generating section 101, and a display section 102. The user setting section 300 includes an object setter 301, an animation setter 302, and a sound file setter 303, with which the user performs a setting operation for an animation effect.
The object setter 301 generates object data representing an object to be animated and displayed in response to a setting operation by the user. The animation setter 302 generates animation effect information representing an animation effect in response to a setting operation by the user. The sound file setter 303 generates sound data of animation in response to a setting operation by the user.
The object attribute acquiring section 304 acquires object attribute information representing an attribute (such as the shape, the color, the size, and the position) of an object to which an animation effect is applied.
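The data items described above can be sketched, for illustration only, as simple records. The patent does not specify any concrete data format, so all field names and types below are hypothetical assumptions:

```python
from dataclasses import dataclass

# Hypothetical records for the data described above; the patent
# does not define concrete formats, so these are assumptions.

@dataclass
class ObjectAttributes:
    """Attribute information acquired by the object attribute acquiring section 304."""
    shape: str       # e.g. "circle", "rectangle"
    color: str       # e.g. "#ff0000"
    size: float      # display size, e.g. in pixels
    position: tuple  # (x, y) screen coordinates

@dataclass
class AnimationEffect:
    """Animation effect information generated by the animation setter 302."""
    name: str        # e.g. "zoom-in"
    duration_ms: int # length of the effect

attrs = ObjectAttributes(shape="circle", color="#ff0000", size=64.0, position=(10, 20))
effect = AnimationEffect(name="zoom-in", duration_ms=500)
```

Records like these would be the inputs to the sound processing section 305 described next.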
The sound processing section 305 includes an editing lookup table 306, a waveform editing device 307, and a processing controller 308, with which a sound file is processed and edited based on the animation effect information and the object attribute information.
The editing lookup table 306 stores a correlation between object attribute information and waveform-editing parameters, and a correlation between animation effect information and waveform-editing parameters. As an example of the former, an object that gives a visually strong impression is correlated with a sound that gives a correspondingly strong impact.
As an example of the latter, a waveform-editing parameter corresponding to "an object is gradually enlarged and displayed" is correlated with the animation effect "zoom-in".
The processing controller 308 specifies, from the editing lookup table 306, the waveform-editing parameter corresponding to the animation effect information, and controls the waveform editing device 307 to execute waveform editing using the specified parameter.
The waveform editing device 307 performs the waveform editing using the parameter specified by the processing controller 308.
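The lookup-and-edit flow among the lookup table 306, the processing controller 308, and the waveform editing device 307 can be sketched as follows. The table contents and the fade operations are illustrative assumptions; the patent does not define them:

```python
# Illustrative sketch of the lookup-and-edit flow. The table contents
# and the fade operations are assumptions, not the patent's design.

# Editing lookup table (306): animation effect -> waveform-edit parameter.
EDIT_TABLE = {
    "zoom-in": {"op": "fade_in"},    # sound swells as the object grows
    "zoom-out": {"op": "fade_out"},  # sound decays as the object shrinks
}

def edit_waveform(samples, param):
    """Waveform editing device (307): apply the specified edit to raw samples."""
    n = len(samples)
    if param["op"] == "fade_in":   # linearly ramp volume from 0 to full
        return [s * i / (n - 1) for i, s in enumerate(samples)]
    if param["op"] == "fade_out":  # linearly ramp volume from full to 0
        return [s * (n - 1 - i) / (n - 1) for i, s in enumerate(samples)]
    return samples

def process(effect_name, samples):
    """Processing controller (308): look up the parameter and run the edit."""
    param = EDIT_TABLE[effect_name]
    return edit_waveform(samples, param)

edited = process("zoom-in", [1.0, 1.0, 1.0])  # -> [0.0, 0.5, 1.0]
```

The point of the design is that the controller never edits sound directly; it only selects a parameter from the table and delegates the actual waveform manipulation to the editing device.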
The animation generating section 101 generates an animation of an object to be animated, utilizing sound data which has been processed and edited by the processing controller 308. The display section 102 outputs the animation and the sound generated by the animation generating section 101.
As described above, in the animation generation device disclosed in patent literature 1, the length and the volume of the sound are adjusted to match features of the object to be animated and displayed, such as its color, size, and shape, which the user has set in advance. In this way, the movement and the sound of the animation are kept consistent.
In recent years, animation has been actively used in, for example, user interfaces of digital home appliances. In such user interfaces, reproduction of an animation may be stopped in response to a user operation or command.
The animation generation device disclosed in patent literature 1, however, does not disclose how the sound is to be handled when reproduction of the animation is stopped partway through. In this device, even if the sound has been edited to match the movement of the animation before reproduction starts, the sound continues to play when reproduction is suspended in response to a user operation or command. This can make it difficult or impossible to keep the movement and the sound of the animation consistent, and the animation may consequently give the user a sense of incongruity.
As described above, when an animation generated by the device disclosed in patent literature 1 is simply applied to a user interface of, for example, a digital home appliance, and the user stops reproduction of the animation at an unanticipated timing, the sound may continue to play and the user may experience a sense of incongruity.
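The desynchronization described above can be illustrated with a toy player in which stopping halts only the frame counter while the audio clock keeps advancing. All names here are hypothetical; this is not the patent's implementation:

```python
# Toy illustration of the problem: stopping the animation freezes the
# displayed frame, but the audio position keeps advancing, so the two
# drift apart. All names are hypothetical.

class ToyPlayer:
    def __init__(self, fps=30):
        self.fps = fps
        self.frame = 0               # current animation frame
        self.audio_ms = 0            # audio playback position
        self.animation_stopped = False

    def tick(self, elapsed_ms):
        """Advance playback by elapsed_ms of wall-clock time."""
        if not self.animation_stopped:
            self.frame += elapsed_ms * self.fps // 1000
        # The sound runs on its own clock and is NOT stopped with the frames.
        self.audio_ms += elapsed_ms

    def stop_animation(self):
        self.animation_stopped = True

p = ToyPlayer()
p.tick(1000)        # 1 s of playback: 30 frames shown, 1000 ms of audio
p.stop_animation()  # user stops the animation mid-reproduction
p.tick(1000)        # frames frozen at 30, but audio advances to 2000 ms
```

After the second tick, the displayed frame corresponds to 1000 ms of animation while 2000 ms of sound has played, which is exactly the mismatch the passage describes.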