1. Field of the Invention
The present invention relates to an image pickup apparatus such as a digital camera, and more particularly to an image pickup apparatus that incorporates object tracking processing technology.
2. Description of the Related Art
Digital cameras have been developed that can specify the area of an object to be captured on the basis of picked-up image data, adjust the brightness of that area to an appropriate level, and adjust the focus.
Among these cameras, there is a camera that can track the movement of a specific object by performing pattern matching between time-series images. For example, Japanese Unexamined Patent Application Publication No. 2006-184742 discloses a structure for tracking a main object. With this structure, the main object is specified using distance information and a face-detection result, and areas whose characteristics are similar to those of the area in which the main object was present are subsequently extracted from images obtained thereafter.
However, with the object tracking processing technology that has been proposed, when the camera is suddenly moved by a large amount or the main object suddenly and greatly changes its moving direction, the camera may fail to track the movement of the main object. Moreover, if pattern matching is continued using the area that was recognized at the moment the main object was last present even though tracking has failed, there is a problem in that an area that does not include the main object continues to be tracked.
For example, there is a known method in which a user specifies an object to be tracked by moving a tracking frame displayed on a liquid crystal display on the back of a digital camera, and the movement of the specified object is then tracked. When the movement of an object is tracked with this method, the following problem may occur.
For example, part (a) of FIG. 4 is a diagram illustrating that a tracking frame displayed in advance on a liquid crystal display has been moved to the position of the face of a person, and the face of the person has been selected as a tracking target through a user operation of the digital camera. By performing matching on the basis of a partial image within the tracking frame, an area in which the tracking target exists is extracted from each newly obtained image. The tracking frame is then moved to, superimposed on, and displayed at the extracted tracking target. Every time the tracking target is successfully extracted, the partial image serving as the matching standard is updated. Thus, even when the image pattern of the object changes gradually, for example because the facial expression of the tracking target changes or the angle of the tracking target changes, the movement of the same object can be continuously tracked.
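The match-then-update loop described above can be sketched as follows. This is a minimal illustration only, not the structure disclosed in the cited publication: the exhaustive search, the sum-of-squared-differences criterion, and the `Tracker` class and its names are all illustrative assumptions.

```python
import numpy as np

def match_template(frame, template):
    """Return the (row, col) offset in `frame` whose patch best matches
    `template` under a sum-of-squared-differences criterion."""
    th, tw = template.shape
    fh, fw = frame.shape
    best_score, best_pos = None, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            score = np.sum((patch - template) ** 2)
            if best_score is None or score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

class Tracker:
    """After each frame, the template (the 'partial image serving as the
    matching standard') is replaced by the best-matching patch, so that
    gradual appearance changes of the object can be followed."""
    def __init__(self, first_frame, top_left, size):
        r, c = top_left
        h, w = size
        self.template = first_frame[r:r + h, c:c + w].astype(float)

    def update(self, frame):
        r, c = match_template(frame.astype(float), self.template)
        h, w = self.template.shape
        # Update the matching standard with the newly extracted patch.
        self.template = frame[r:r + h, c:c + w].astype(float)
        return (r, c)  # new tracking-frame position
```

Note that this per-frame template update is exactly what makes the drift problem described below possible: once the extracted patch partially covers the background, background pixels enter the matching standard.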
Part (b) of FIG. 4 is a diagram illustrating that the movement of the object is being tracked by performing such processing. However, when a person who is the tracking target suddenly changes his/her direction and the image pattern of the tracking target suddenly and greatly changes, the difference between the previous image pattern, that is, the partial image extracted previously, and the image pattern of the object actually specified as the tracking target becomes large. Thus, even if the area having the highest correlation is extracted by performing matching, the position of the object specified as the tracking target cannot be precisely captured. As a result, as shown in part (c) of FIG. 4, the tracking frame may become separated from the tracking target, so that the tracking frame partially includes the background and the partial image serving as the base of matching also includes an image pattern of the background.
Moreover, part (d) of FIG. 4 is a diagram illustrating that the image pattern of the background included in the partial image serving as the base of matching affects the extraction result regarding the tracking target, whereby the tracking frame becomes completely separated from the object and becomes fixed to the background. Thereafter, as shown in part (e) of FIG. 4, pattern matching is performed on the basis of the background area, and thus the tracking frame continues to be displayed on the background.