H.264/MPEG-4 Advanced Video Coding (H.264/MPEG-4 AVC, hereinafter referred to as H.264) is known as an encoding method used for compression recording of moving pictures (refer to NPL 1). In H.264, a picture can be divided into multiple slices for encoding. Since there is little data dependency between the slices, the encoding process and the decoding process can be executed in parallel. The division into slices has the major advantage that the parallel processing can be executed by, for example, a multi-core central processing unit (CPU) to reduce the processing time.
Each slice is encoded by the binary arithmetic coding method adopted in H.264. Specifically, each syntax element is binarized to generate a binary signal. Each syntax element is given a table of occurrence probabilities (hereinafter referred to as an occurrence probability table) in advance, and the binary signal is arithmetically encoded on the basis of the occurrence probability table. In encoding, the occurrence probability table is used as encoding information in the encoding of subsequent codes; in decoding, it is used as decoding information in the decoding of subsequent codes. Each time encoding is performed, the occurrence probability table is updated on the basis of statistical information indicating whether the encoded binary signal is the symbol having the higher occurrence probability.
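The per-syntax-element update described above can be sketched as follows. This is an illustrative model only: the probability representation and the update constant are assumptions chosen for clarity, not the actual state machine defined in H.264.

```python
# Minimal sketch of a per-context occurrence-probability update.
# The floating-point estimate and the adaptation rate are illustrative
# assumptions, not the normative H.264 procedure.

class Context:
    """One occurrence-probability table entry: an estimate that the next
    binary symbol equals the most probable symbol (MPS)."""

    def __init__(self, p_mps=0.5, mps=0):
        self.p_mps = p_mps   # estimated probability of the MPS
        self.mps = mps       # current most probable symbol (0 or 1)

    def update(self, bin_value, rate=0.05):
        # After each encoded binary symbol, move the estimate toward the
        # observed statistics; swap the MPS if its probability falls
        # below one half.
        if bin_value == self.mps:
            self.p_mps += rate * (1.0 - self.p_mps)
        else:
            self.p_mps -= rate * self.p_mps
            if self.p_mps < 0.5:
                self.mps ^= 1
                self.p_mps = 1.0 - self.p_mps

ctx = Context()
for b in [0, 0, 0, 1, 0]:   # a short run of encoded binary symbols
    ctx.update(b)
print(ctx.mps, round(ctx.p_mps, 3))
```

Because the table is rewritten after every symbol, a second coder cannot start from the same table until the first has finished; this is the data dependency that the parallelization methods below must work around.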
In recent years, activities toward the international standardization of a more efficient encoding method succeeding H.264 have been started, and the Joint Collaborative Team on Video Coding (JCT-VC) has been established between the International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) and the International Telecommunication Union Telecommunication Standardization Sector (ITU-T). In JCT-VC, standardization of the High Efficiency Video Coding method (hereinafter referred to as HEVC) is being advanced.
In the standardization of HEVC, various encoding tools are being widely reviewed in terms of not only the improvement of encoding efficiency but also the ease of implementation and the reduction in processing time. To reduce the processing time, methods of improving parallelism are also examined, including a method called Wavefront for executing entropy encoding and entropy decoding in parallel (refer to NPL 2). Since each binary signal must be encoded using an occurrence probability table that is constantly updated, the processing cannot be executed in parallel without resetting the statistical information; however, resetting the statistical information has the problem of reducing the encoding efficiency. In contrast, in Wavefront, the occurrence probability table obtained at the time when the blocks at predetermined positions have been encoded is applied to the leftmost block on the next line, which enables the blocks to be encoded in parallel line by line while suppressing the reduction in encoding efficiency. Although Wavefront is described here with focus on the encoding process, the same applies to the decoding process.
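The state hand-off in Wavefront can be sketched as follows. The sketch is schematic, not the normative HEVC procedure: the synchronization position (here, after the second block of a line) and the stand-in "encoding" that merely accumulates statistics are assumptions made to keep the propagation visible.

```python
# Illustrative sketch of the Wavefront idea: each line of blocks starts
# from the occurrence probability table saved after a predetermined block
# position on the line above, instead of resetting the statistics.
import copy

SYNC_POSITION = 2  # assumed: save the table after this many blocks


def encode_block(state, block):
    # Stand-in for arithmetic coding: fold the block value into the
    # running statistics so that state propagation is observable.
    state["count"] += 1
    state["sum"] += block
    return state


def wavefront_encode(picture):
    """picture: list of block lines. Returns each line's starting table."""
    saved = {"count": 0, "sum": 0}       # initial table for the first line
    start_states = []
    for line in picture:
        state = copy.deepcopy(saved)     # leftmost block inherits the table
        start_states.append(copy.deepcopy(state))
        for i, block in enumerate(line):
            state = encode_block(state, block)
            if i + 1 == SYNC_POSITION:
                saved = copy.deepcopy(state)  # handed to the next line
    return start_states


states = wavefront_encode([[1, 2, 3, 4], [5, 6, 7, 8]])
print(states)
```

Once line n has passed the synchronization position, line n+1 can begin in parallel with the remainder of line n, since its starting table is already fixed; that is the source of the line-by-line parallelism without a statistics reset.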
In addition, tiling is also included in HEVC as a method of improving parallelism. Tiling is a technology that divides a picture into rectangles so that the rectangles can be processed separately. This technology allows the parallel encoding and decoding to be sped up and allows the memory capacities of an encoding apparatus and a decoding apparatus to be reduced.
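The division into rectangles can be sketched as follows; the function name and the choice of column/row boundary lists are assumptions for illustration, not HEVC syntax.

```python
# Hypothetical sketch of tiling: split a picture into rectangles along
# chosen column and row boundaries so that each rectangle can be handled
# by a separate processor.

def tile_rects(width, height, col_bounds, row_bounds):
    """Return (x, y, w, h) rectangles for the given tile boundaries."""
    xs = [0] + list(col_bounds) + [width]
    ys = [0] + list(row_bounds) + [height]
    return [(x0, y0, x1 - x0, y1 - y0)
            for y0, y1 in zip(ys, ys[1:])
            for x0, x1 in zip(xs, xs[1:])]

# A 1920x1080 picture split into 2x2 tiles:
rects = tile_rects(1920, 1080, [960], [540])
print(rects)
```

Each rectangle can then be assigned to its own encoder or decoder instance, which is what permits both the speed-up and the reduced per-apparatus memory footprint.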
In HEVC, a tiles_or_entropy_coding_sync_idc code has hitherto been used to perform processing tasks such as the tiling and Wavefront exclusively of one another. When the tiles_or_entropy_coding_sync_idc code has a value of zero, the picture includes only one tile and parallel processing such as Wavefront is not performed. When the code has a value of one, the picture includes multiple tiles but parallel processing such as Wavefront is not performed. When the code has a value of two, the picture includes only one tile and parallel processing such as Wavefront is performed. When the code has a value of three, the picture includes only one tile and entropy slices capable of being independently decoded are used, without performing parallel processing such as Wavefront. No other values can be used. This is because, when an image is sufficiently small, such as a high-definition image, combining multiple parallel processes complicates the control of the parallel processing and increases the complexity relative to the picture size; accordingly, the tasks are performed exclusively of one another in such a case.

However, when a large screen such as a super-high-definition screen is processed, it is necessary to divide the screen into segments each having a certain size, allocate the segments to nodes of a computer, and cause multiple processors to operate in each node. For example, when a tile is allocated to each node for processing, there is a problem in that parallel processing such as Wavefront or the entropy slices cannot be performed within the tiles.
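The four permitted values of the tiles_or_entropy_coding_sync_idc code, as enumerated above, can be expressed as a lookup. The descriptive field names are labels chosen here for illustration, not syntax elements from the HEVC draft.

```python
# The exclusive combinations selected by tiles_or_entropy_coding_sync_idc.
# Field names are illustrative labels, not HEVC syntax.

SYNC_IDC = {
    0: {"multiple_tiles": False, "wavefront": False, "entropy_slices": False},
    1: {"multiple_tiles": True,  "wavefront": False, "entropy_slices": False},
    2: {"multiple_tiles": False, "wavefront": True,  "entropy_slices": False},
    3: {"multiple_tiles": False, "wavefront": False, "entropy_slices": True},
}


def parallel_mode(idc):
    if idc not in SYNC_IDC:
        raise ValueError("tiles_or_entropy_coding_sync_idc must be 0..3")
    return SYNC_IDC[idc]


print(parallel_mode(2))
```

The table makes the limitation visible: no permitted value sets multiple_tiles together with wavefront or entropy_slices, which is exactly why Wavefront or entropy slices cannot be used within the tiles when tiles are allocated across nodes.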