High Efficiency Video Coding (HEVC, or H.265) is known as a method of coding videos. Cross component prediction is also known as a function that improves the compression rate by coding the correlation between luminance and color difference when a luminance image and a color-difference image are correlated.
In cross component prediction, coding efficiency is improved by coding the difference (residual) obtained by subtracting "luminance residual × residual scale α" from the color-difference residual. To increase coding efficiency, conventional methods obtain the residual scale α that minimizes the sum of squared differences S = Σ(yi − αxi)² between the color-difference residual and the luminance residual multiplied by the residual scale α. Here, yi is the color-difference residual, that is, the difference between the color-difference component of the i-th pixel of the original image and the color-difference component of the corresponding pixel of the predicted image. Similarly, xi is the luminance residual, that is, the difference between the luminance component of the i-th pixel of the original image and the luminance component of the corresponding pixel of the predicted image. The index i runs over the pixels of the image that is the process target of cross component prediction, i = 1 through n, where n is the number of pixels in that image.
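Setting dS/dα = 0 gives the minimizing scale in closed form: α = Σ(xi·yi) / Σ(xi²). The following sketch illustrates this computation; the function name and sample residuals are illustrative and not taken from any standard.

```python
def residual_scale(x, y):
    """Least-squares residual scale alpha minimizing
    S = sum((y_i - alpha * x_i)**2).

    x: luminance residuals, y: color-difference residuals.
    """
    sxx = sum(xi * xi for xi in x)
    if sxx == 0:
        # All-zero luminance residual: nothing to predict from.
        return 0.0
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    return sxy / sxx

# Example: the color-difference residual is about half the luminance residual.
x = [4, -2, 6, 0]
y = [2, -1, 3, 1]
print(residual_scale(x, y))  # 0.5
```

With these sample values, Σ(xi·yi) = 28 and Σ(xi²) = 56, so α = 0.5.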
However, when cross component prediction is performed in HEVC on the basis of the residual scale α that minimizes the above sum of squared differences S, the sum of squared differences actually computed in accordance with the HEVC standard sometimes does not become smaller, reducing the compression rate.
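One reason is that the standard does not signal a continuous α: in the HEVC Range Extensions, the scale is restricted to a small discrete set and applied with an integer shift, roughly (a · xi) >> 3 for a in {−8, −4, −2, −1, 0, 1, 2, 4, 8}. A sketch of selecting the discrete scale by directly evaluating the sum of squared differences, rather than relying on the continuous optimum; the candidate set is taken from the Range Extensions as stated, and the shift is simplified to a division for clarity:

```python
def best_discrete_scale(x, y, candidates=(-8, -4, -2, -1, 0, 1, 2, 4, 8)):
    """Choose the signaled scale a (applied as a/8, simplifying the
    (a * x_i) >> 3 integer shift of the standard) that minimizes the
    sum of squared differences actually incurred."""
    def cost(a):
        return sum((yi - (a * xi) / 8) ** 2 for xi, yi in zip(x, y))
    return min(candidates, key=cost)

x = [4, -2, 6, 0]
y = [2, -1, 3, 1]
print(best_discrete_scale(x, y))  # 4, i.e. an effective scale of 4/8 = 0.5
```

In real integer arithmetic the shift rounds toward negative infinity, so the cost actually incurred by the codec can differ from this simplified evaluation, which is the kind of discrepancy the passage above describes.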
Related techniques are disclosed in, for example, International Publication Pamphlet No. 2015/098562 and Japanese Laid-open Patent Publication No. 2014-131172.