1. Field of the Invention
The present invention generally relates to an apparatus and a method of motion detection, and more particularly, to an apparatus and a method for motion adaptive de-interlacing with a chroma up-sampling error remover.
2. Description of Related Art
Generally speaking, motion detection uses differences between corresponding pixels in fields of the same parity to determine whether the designated pixels are moving pixels. If the difference is larger than a preset threshold, a variation has occurred in the content of the video and the corresponding pixels are determined to be moving pixels; if the difference is smaller than the preset threshold, no variation has occurred in the content of the video and the corresponding pixels are determined to be static pixels.
Through the motion detection described above, motion information of the pixels of a moving object is obtained and referenced to determine whether a spatial interpolation method or a temporal interpolation method is used to generate the required field data, in which the temporal interpolation is used for calculating values of pixels in an area with no moving object and the spatial interpolation is used for calculating values of pixels in an area with the moving object.
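The selection between the two interpolation methods can be sketched as follows. This is a minimal illustrative sketch, not the claimed apparatus: the field layout (2D luminance arrays), the threshold value, and all function names are assumptions introduced for illustration.

```python
def is_moving(prev_same_parity, next_same_parity, y, x, threshold=10):
    """Compare corresponding pixels in two same-parity fields; a
    difference above the preset threshold marks the pixel as moving."""
    return abs(prev_same_parity[y][x] - next_same_parity[y][x]) > threshold

def interpolate_pixel(cur_field, opp_field, prev_field, next_field, y, x,
                      threshold=10):
    """Generate one missing pixel of the de-interlaced frame."""
    if is_moving(prev_field, next_field, y, x, threshold):
        # Moving area: spatial interpolation, here a simple average of the
        # lines above and below in the current field.
        return (cur_field[y - 1][x] + cur_field[y + 1][x]) // 2
    # Static area: temporal interpolation, i.e. take the pixel from the
    # opposite-parity field (field weaving).
    return opp_field[y][x]
```

In a real de-interlacer the spatial branch would typically use an edge-adaptive filter rather than a plain vertical average; the sketch only shows how the motion decision steers the choice.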
On the other hand, in order to reduce the size of a video, the color components of the video are usually compressed in the 4:2:0 sampling format. To be specific, because human eyes are relatively insensitive to color components, the color components of the video data are usually sub-sampled such that only a part of the color information is retained. Common sub-sampling formats include the 4:2:2 sampling format, the 4:1:1 sampling format, and the 4:2:0 sampling format.
For example, in the process of 4:2:2 sub-sampling for pixels containing R, G, and B color components, the R, G, and B components are equivalently transformed into a luminance component Y and chrominance components U and V, and the chrominance components of the pixels are sampled at a ratio of one out of every two. That is, for every two pixels, only the chrominance components of one pixel are sampled and used as the chrominance components of both pixels.
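The 4:2:2 process described above can be sketched as follows. The conversion coefficients used here are the common BT.601 ones, which are an assumption for illustration; the description above does not mandate a specific transform matrix, and the function names are likewise illustrative.

```python
def rgb_to_yuv(r, g, b):
    """Equivalently transform R, G, B into luminance Y and chrominance
    U, V (BT.601-style coefficients, assumed for illustration)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b
    v = 0.500 * r - 0.419 * g - 0.081 * b
    return y, u, v

def subsample_422(row_rgb):
    """4:2:2 sub-sampling of one scan line: keep Y for every pixel, but
    keep U and V for only one of each two horizontally adjacent pixels."""
    ys, us, vs = [], [], []
    for i, (r, g, b) in enumerate(row_rgb):
        y, u, v = rgb_to_yuv(r, g, b)
        ys.append(y)
        if i % 2 == 0:
            # The chrominance sampled here is shared by this pixel and
            # its right neighbor.
            us.append(u)
            vs.append(v)
    return ys, us, vs
```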
However, when performing three-dimensional de-interlacing on a video compressed in the 4:2:0 sampling format, if the color components of corresponding pixels of a static area in a front field and a rear field are combined for display, a chroma up-sampling error (CUE) may occur.
For example, FIG. 1 illustrates a chroma up-sampling error occurring during de-interlacing. Referring to FIG. 1, when performing interlacing on a progressive frame having a size of 8×8, the scan lines are separated into an odd field and an even field, in which the odd field contains the odd scan lines of the progressive frame, the even field contains the even scan lines of the progressive frame, and the pixels drawn as shaded blocks represent color pixels. Then, when performing 4:2:0 sub-sampling on both the odd field and the even field, the data of one of each two scan lines is retained and used as the data of both scan lines, so as to reduce the size of the original field. However, when performing de-interlacing on the sub-sampled odd field and even field, the scan lines of the odd field and the even field are up-sampled alternately to recover the frame. As shown in the third section of FIG. 1, the color pixels are mistakenly arranged such that a saw-tooth artifact may occur on the edge of the color area in the recovered frame.
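The misarrangement described above can be reproduced with a small sketch. Scan lines are labeled C0 through C7 here purely for illustration (C0 being the first line, so the odd field holds lines 1, 3, 5, 7 under 1-based numbering); this is not a rendering of FIG. 1 itself.

```python
# Chroma of the 8 scan lines of the progressive frame, top to bottom.
frame_chroma = [f"C{i}" for i in range(8)]

# Interlacing: the odd field takes lines 1,3,5,7 (0-based indices 0,2,4,6),
# the even field takes lines 2,4,6,8 (indices 1,3,5,7).
odd_field = frame_chroma[0::2]    # ['C0', 'C2', 'C4', 'C6']
even_field = frame_chroma[1::2]   # ['C1', 'C3', 'C5', 'C7']

# 4:2:0 sub-sampling within each field: one line's chroma is retained
# and reused for two lines of that field (then up-sampled back).
odd_up = [odd_field[(i // 2) * 2] for i in range(4)]    # C0,C0,C4,C4
even_up = [even_field[(i // 2) * 2] for i in range(4)]  # C1,C1,C5,C5

# De-interlacing: weave the up-sampled fields back line by line.
woven = []
for o, e in zip(odd_up, even_up):
    woven.extend([o, e])

# The recovered frame carries C0,C1,C0,C1,C4,C5,C4,C5 instead of C0..C7:
# the chroma of lines 3,4,7,8 is wrong, producing the saw-tooth edge.
print(woven)
```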
To resolve the CUE problem described above, the conventional technique uses a low-pass filter for up-sampling, but this method may blur outlines where the color changes sharply in the field. To resolve the problem thoroughly, the components of the pixels of the color area in the front field and in the rear field have to be swapped, as shown in the fourth section of FIG. 1. In detail, the components of the pixels in the odd field have to be swapped with those of the pixels in the next field, and the components of the pixels in the even field have to be swapped with those of the pixels in the previous field.
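The basic swap operation between two adjacent fields can be sketched as follows. This is a hedged illustration only: the representation of a field as a flat list of per-pixel chroma values, the mask marking static color-area pixels, and the function name are all assumptions, and the full correction per FIG. 1 additionally depends on which lines of each field are involved.

```python
def swap_static_chroma(field_a, field_b, static_mask):
    """Exchange chrominance components between corresponding pixels of
    two adjacent fields, but only at pixels flagged as static color
    area (moving pixels must keep their own chroma)."""
    a, b = list(field_a), list(field_b)
    for i, static in enumerate(static_mask):
        if static:
            a[i], b[i] = b[i], a[i]
    return a, b
```

Because the exchange is symmetric, a single pass realizes both directions described above: the odd field receives the next field's components while that field simultaneously receives the odd field's.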
However, motion adaptive de-interlacing requires data of at least three fields in order to precisely detect the static areas and the moving areas, because the motion detection has to compare fields of the same parity. If the CUE is to be removed from the three input fields before the motion detection is performed, data of at least five fields is required, which consumes a large amount of buffer memory.