As a technique for displaying a three-dimensional image on a display apparatus, conventionally known is a technique as disclosed in Patent Document 1 (Japanese Laid-Open Patent Publication No. 6-68238). In the case where a viewpoint (a position of a virtual camera) moves in the three-dimensional space, a game apparatus described in Patent Document 1 displays in real time an image of geographic features of the three-dimensional space in accordance with a movement of the viewpoint. In the game apparatus, based on the position of the viewpoint and a viewing direction, an address of a space to be displayed on a screen is calculated, coordinate conversion is performed with respect to altitude data of the geographic features, and then a color to be displayed on the screen is determined. Accordingly, the above-described game apparatus is capable of promptly responding to the movement of the viewpoint, and displaying the image of the geographic features set within the three-dimensional space in real time without using expensive hardware.
In the three-dimensional space, in the case where the position of the viewpoint and/or the viewing direction changes, an image of the three-dimensional space viewed from the viewpoint along the viewing direction, that is, the entire content of the image to be displayed on the display apparatus, changes (moves). In the case where the entire content of the image changes in this manner, it is difficult to identify the content of the image due to the change of the image, compared to a case where only a part of the content of the image changes (e.g., a case where the ground surface is displayed so as to be at a standstill, and only a player object is moving). Therefore, a user may not be able to recognize a target to be focused on (focusing target) in the image. For example, the user may lose sight of an object that had been focused on, because the viewpoint has moved. In another example, the user may not be able to capture the object to be focused on while the viewpoint is moving. Particularly, in the case where the viewpoint moves at a high speed, or in the case where the viewing direction changes, the entire content of the image changes drastically, and thus the problem that the user cannot identify the focusing target becomes significant.
Therefore, an aspect of the present invention is to provide an image processing program and an image processing apparatus which are capable of allowing a user to easily recognize a focusing target even in the case where a position of a viewpoint or a viewing direction changes in a three-dimensional space.
The present invention has the following features. The reference numerals, additional description and the like in parentheses described in the present section indicate the correspondence with the embodiment described below in order to aid in understanding the present invention, and are not intended to limit the scope of the present invention in any way.
A first aspect of the present invention is directed to a computer readable storage medium (optical disc 4) having stored thereon an image processing program (game program 60) executed by a computer (CPU 10 or the like) of an image processing apparatus (game apparatus 3) for displaying an image of a virtual space on a display apparatus (television 2). The image processing program causes the computer to execute a reference image generating step (S8), an area determining step (S9), a first blurring value calculation step (S10), a blurring processing step (S13), and a displaying step (S14). In the reference image generating step, the computer generates, as a reference image, the image of the virtual space viewed from a virtual camera which is set in the virtual space. In the area determining step, the computer determines, in the reference image, a predetermined area (target area) which includes pixels corresponding to a predetermined position in a viewing direction of the virtual camera. In the first blurring value calculation step, the computer calculates a first blurring value (correction value A) which indicates a blurring degree of an image. The first blurring value is calculated so as to become greater than a reference value in the case where a movement amount of the virtual camera is equal to or greater than a predetermined amount, and so as to become smaller than the reference value in the case where the movement amount of the virtual camera is smaller than the predetermined amount. In the blurring processing step, the computer blurs an area outside the predetermined area in the reference image such that the blurring degree increases in accordance with an increase in the first blurring value. In the displaying step, the computer displays an image blurred in the blurring processing step on the display apparatus.
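The first blurring value calculation described above can be illustrated with a short sketch. This is not taken from the embodiment; the threshold, step size, and clamping range are hypothetical values chosen only to show the behavior of raising the value above the reference when the camera's movement amount is at or above the predetermined amount, and lowering it below the reference otherwise (here the reference value is the previous frame's value, as in the eighth aspect).

```python
def calc_first_blur_value(prev_value, movement_amount, threshold=0.1,
                          step=0.05, min_value=0.0, max_value=1.0):
    """Update the first blurring value (correction value A).

    The value rises above its reference (the previous frame's value)
    when the camera's movement amount is equal to or greater than the
    threshold, and falls below the reference otherwise. All constants
    here are illustrative assumptions, not values from the embodiment.
    """
    if movement_amount >= threshold:
        value = prev_value + step
    else:
        value = prev_value - step
    # Clamp so the blurring degree stays within a usable range.
    return max(min_value, min(max_value, value))
```

Updating the value incrementally per frame, rather than switching it on and off, lets the blurring degree ramp smoothly as the camera starts and stops moving.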
In a second aspect, the image processing program may further cause the computer to execute a second blurring value calculation step (S9). In the second blurring value calculation step, the computer calculates a second blurring value (synthesis rate α) of respective pixels in the area outside the predetermined area, the second blurring value indicating a blurring degree of an image, in accordance with a predetermined method which is different from a calculation method for the first blurring value. In this case, in the blurring processing step, the computer determines the blurring degree of at least the pixels in the area outside the predetermined area in accordance with the first blurring value and the second blurring value, and blurs, based on the blurring degree having been determined, the pixels in the area outside the predetermined area in the reference image.
In a third aspect, in the second blurring value calculation step, the computer may calculate the second blurring value of each of the pixels in the area outside the predetermined area in accordance with a depth value (Z value) of said each of the pixels.
In a fourth aspect, in the second blurring value calculation step, the computer may calculate the second blurring value of said each of the pixels in the area outside the predetermined area such that the blurring degree increases in accordance with an increase in a difference between the depth value of said each of the pixels and a depth value at a focus point of the virtual camera.
In a fifth aspect, in the area determining step, the computer may determine the predetermined area in accordance with a position of the focus point of the virtual camera.
In a sixth aspect, in the area determining step, the computer may determine, as the predetermined area in the reference image, an area including pixels whose depth value and the depth value at the focus point are different from each other by a predetermined value or less.
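The fourth and sixth aspects can both be driven by the same per-pixel quantity: the difference between a pixel's depth value (Z value) and the depth value at the focus point. The sketch below is illustrative only; the scale factor and the predetermined depth tolerance are hypothetical, and real depth values would come from the Z buffer of the rendered reference image.

```python
def calc_second_blur_value(pixel_depth, focus_depth, scale=0.01):
    """Per-pixel second blurring value (synthesis rate alpha).

    The blurring degree grows with the difference between the pixel's
    depth (Z value) and the depth at the camera's focus point, and is
    clamped to [0, 1]. The scale factor is an illustrative assumption.
    """
    return min(1.0, abs(pixel_depth - focus_depth) * scale)

def in_predetermined_area(pixel_depth, focus_depth, max_diff=20.0):
    """A pixel belongs to the predetermined (non-blurred) area when its
    depth differs from the focus-point depth by max_diff or less."""
    return abs(pixel_depth - focus_depth) <= max_diff
```

Note that both functions evaluate the same depth difference, which is the efficiency point made for the sixth aspect: one variable serves both the second blurring value calculation and the area determination.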
In a seventh aspect, the image processing program may further cause the computer to execute a rotation step (S6) of rotating the virtual camera around the predetermined position such that the virtual camera is facing the predetermined position. In this case, in the area determining step, the computer determines, as the predetermined area, an area including pixels corresponding to an object existing in a predetermined space which includes the predetermined position in the virtual space. In the first blurring value calculation step, the computer uses a rotation amount of the virtual camera as the movement amount of the virtual camera.
In an eighth aspect, the reference image generating step may be executed repeatedly at a rate of once per predetermined time period (1 frame time period). In this case, the first blurring value calculation step is executed each time the reference image is generated in the reference image generating step. In the first blurring value calculation step, the computer calculates the first blurring value by using the most recently calculated first blurring value as the reference value.
In a ninth aspect, in the first blurring value calculation step, the computer may calculate the first blurring value by using a predetermined value as the reference value such that the first blurring value increases in accordance with an increase in a magnitude of the movement amount.
In a tenth aspect, the image processing program may further cause the computer to execute a blurring image generating step (S12) of generating a blurred image of the reference image. In this case, in the blurring processing step, the computer synthesizes, as a blurring process, the reference image and the blurred image at a synthesis rate corresponding to the blurring degree.
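The synthesis in the tenth aspect amounts to a per-pixel linear blend of the reference image and its blurred counterpart. A minimal sketch, assuming RGB pixels as tuples of floats (how the blurred image is produced, e.g. by a separate filtering pass, is left out):

```python
def synthesize(reference_pixel, blurred_pixel, blur_degree):
    """Blend a reference-image pixel with its blurred counterpart.

    The synthesis rate corresponds to the blurring degree: 0 keeps the
    reference image unchanged, 1 fully replaces it with the blurred
    image, and intermediate values mix the two.
    """
    a = max(0.0, min(1.0, blur_degree))
    return tuple((1.0 - a) * r + a * b
                 for r, b in zip(reference_pixel, blurred_pixel))
```

Because the blurred image can be generated once per frame and then blended at varying rates, this keeps the per-pixel cost of the blurring process low.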
In an eleventh aspect, the movement amount of the virtual camera is equivalent to a rotation angle of the virtual camera.
In a twelfth aspect, the movement amount of the virtual camera is equivalent to a change amount of a position of the virtual camera.
A thirteenth aspect is directed to a computer readable storage medium (optical disc 4) having stored thereon an image processing program (game program 60) causing a computer (CPU 10) to execute a process of generating an image viewed from a virtual camera situated in a virtual space. The image processing program causes the computer to execute a rotation step (S6), and an image generating step (S13). In the rotation step, the computer rotates the virtual camera. In the image generating step, in the case where a rotation amount (rotation angle) of the virtual camera per unit time exceeds a threshold amount, the computer draws a first drawing target object (star object 51, player character object 54, enemy character object 55 and the like) included in a first area which includes a center of rotation of the virtual camera, draws a second drawing target object (star objects 52 and 53) included in a second area, which surrounds the first area, so as to be a blurred image compared to the first drawing target object, and generates a display image for one screen based on the virtual camera.
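The image generating step of the thirteenth aspect can be sketched as follows. The object representation (dictionaries with an `area` tag of 1 or 2) and the threshold value are illustrative assumptions introduced only for this sketch; in the embodiment, area membership would follow from each object's position relative to the camera's center of rotation.

```python
def draw_scene(objects, rotation_per_frame, threshold=0.05):
    """Sketch of the thirteenth aspect's drawing step.

    Each object carries an illustrative 'area' tag: 1 for the first
    area (around the camera's center of rotation), 2 for the second
    area surrounding it. When the rotation amount per unit time
    exceeds the threshold, second-area objects are drawn blurred.
    """
    blur_second_area = rotation_per_frame > threshold
    drawn = []
    for obj in objects:
        blurred = blur_second_area and obj["area"] == 2
        drawn.append({"name": obj["name"], "blurred": blurred})
    return drawn
```

When the camera is still or rotating slowly, the same call leaves every object sharp, which corresponds to the fourteenth aspect below.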
In a fourteenth aspect, in the image generating step, in the case where the rotation amount of the virtual camera per unit time does not exceed the threshold amount, the computer may draw the second drawing target object so as to be a less blurred image compared to the second drawing target object which is drawn in the case where the rotation amount of the virtual camera per unit time exceeds the threshold amount.
In a fifteenth aspect, the image generating step may include a blurring changing step (S10, S11) of changing a blurring degree (correction value A) of the second drawing target object in accordance with the rotation amount.
In a sixteenth aspect, in the blurring changing step, the computer may change the blurring degree so as to increase the blurring degree in accordance with an increase of the rotation amount, and to decrease the blurring degree in accordance with a decrease of the rotation amount.
Further, the present invention may be provided in a form of an image processing apparatus having a function identical to the image processing apparatus executing each of the steps in the above-described first to sixteenth aspects. In the image processing apparatus, each of the steps may be executed by a CPU which executes the image processing program, or alternatively, some or all the steps may be executed by a dedicated circuit included in the image processing apparatus.
According to the first aspect, in the case where the virtual camera moves by an amount equal to or greater than the predetermined amount, the image outside the predetermined area is displayed in a blurred manner. That is, in the case where the entire image changes due to the movement of the virtual camera, the image within the predetermined area is displayed clearly compared to the image outside the predetermined area. Accordingly, it is possible to bring the user's focus within the predetermined area. Therefore, even in the case where the entire image changes, the user can easily capture and recognize the focusing target within the predetermined area.
According to the second aspect, the second blurring value, which is set for each of the pixels, is used in addition to the first blurring value, whereby the blurring processing can be performed in accordance with the blurring degree appropriate to each of the pixels.
According to the third and fourth aspects, the blurring processing is performed to various blurring degrees which vary on a pixel-by-pixel basis in accordance with a depth value of each of the pixels. Particularly, according to the fourth aspect, the blurring degree is small in the vicinity of the position of the focus point in the depth direction, and the blurring degree becomes large at a position far from the focus point in the depth direction. Accordingly, it is possible to generate an image such that the image is focused at the position of the focus point.
According to the fifth aspect, the position of the focus point of the virtual camera is used as the above-described predetermined position, whereby the predetermined area can be determined easily.
According to the sixth aspect, pixels corresponding to the object in the vicinity of the position of the focus point in the virtual space are within the predetermined area. Accordingly, the object in the vicinity of the position of the focus point, which tends to be a focusing target of the user, is displayed without being blurred, and the object in the position far from the focus point is displayed in a blurred manner. As a result, it is possible to determine the predetermined area such that the user's focusing target is certainly included in the predetermined area, and also possible to allow the user to recognize the focusing target unfailingly.
Further in the sixth aspect, in the case where the second blurring value is calculated based on the above-described difference, it is possible to perform, by using the difference, both of the processes of calculating the second blurring value and determining the predetermined area. Accordingly, it is possible to simplify the processes compared to a case where these two processes are performed by using two variables which are different from each other, and also possible to perform these two processes efficiently.
According to the seventh aspect, compared to an object existing within the predetermined space having the center of the rotation of the virtual camera included therein, an object existing outside the predetermined space is displayed in a further blurred manner. Accordingly, even in the case where the entire image changes due to the rotation of the virtual camera, the user can easily capture and recognize the focusing target within the predetermined space.
According to the eighth and ninth aspects, the first blurring value can be easily calculated so as to be a value corresponding to the movement amount of the virtual camera.
According to the tenth aspect, the blurring processing can be easily performed by synthesizing the reference image and the blurred image.
According to the eleventh aspect, the blurring degree can be changed depending on a change in the attitude of the virtual camera.
According to the twelfth aspect, the blurring degree can be changed depending on a change in the position of the virtual camera.
According to the thirteenth aspect, in the case where the virtual camera rotates by a rotation amount exceeding the threshold amount, the second drawing target object is displayed in a more blurred manner than the first drawing target object. That is, in the case where the entire image changes due to the rotation of the virtual camera, the first drawing target object in the first area is displayed more clearly than the second drawing target object in the second area (both areas being areas set within the virtual space). Accordingly, it is possible to bring the user's focus within the first area. Therefore, even in the case where the entire image changes, the user can easily capture and recognize the focusing target in the first area.
According to the fourteenth aspect, in the case where the rotation amount of the virtual camera does not exceed the threshold amount, the second drawing target object is displayed more clearly than in the case where the rotation amount exceeds the threshold amount. That is, in the case where the entire image hardly changes and can be continuously viewed clearly, the second drawing target object is also displayed so as to be viewed clearly, whereby the entire image can be displayed so as to be viewed clearly.
According to the fifteenth aspect, the blurring degree can be calculated easily in accordance with the rotation amount of the virtual camera. Further, according to the sixteenth aspect, the blurring degree is set so as to increase in accordance with the increase in the rotation amount of the virtual camera. Accordingly, the more the rotation amount increases, the more easily the user can focus on the first drawing target object.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.