Exemplary embodiments of the present invention relate to a system, a program, and a method for calculating a degree of gaze guidance of an image. Further, exemplary embodiments provide a gaze guidance degree calculation system, a gaze guidance degree calculation program, a storage medium, and a gaze guidance degree calculation method capable of being implemented as a compact, low-cost apparatus that obtains a proper eye-flow.
Visual documents are well-designed documents, such as catalogs of goods, in which layout elements such as titles, images, and texts are arranged so as to be easily recognized. Since considerable design know-how is needed to make visual documents, it is difficult for ordinary businesspeople to make them. For that reason, in many cases, a designer having specialized skills is entrusted with the task of making the visual documents.
When making a visual document, the designer arranges contextually related layout elements along the expected direction of the reader's eye movement (hereinafter referred to as an eye-flow) so as to achieve an easily readable layout. For example, if a news article comprises a title, images, and texts, it is preferable that the title, images, and texts be arranged along the eye-flow. Therefore, after arranging the layout elements, the designer rearranges them along the expected eye-flow by trial and error in order to obtain an easily readable layout. However, since the designer relies on his or her sense of design and experience to estimate the eye-flow, it is difficult to acquire the eye-flow quantitatively.
The related art includes techniques for acquiring the eye-flow and related techniques. These include a visual information analyzing apparatus disclosed in related-art document Japanese Unexamined Patent Application Publication No. Heisei 6-162, an attention region extraction apparatus disclosed in related-art document Japanese Unexamined Patent Application Publication No. 2001-126070, and an attractiveness estimation model disclosed in related-art document "A Figure Extraction Method Using the Color and Texture Contrasts of Image Regions as Feature Amount" by Shoji Tanaka, Seiji Inokuchi, Yuichi Iwadate, and Ryohei Nakatsu, Information Processing Society, Vol. 40, No. 8, 1999 (hereinafter "Tanaka").
In the invention described in related-art document Japanese Unexamined Patent Application Publication No. Heisei 6-162, time-series changes of eyeball movement detected by the analyzing apparatus are analyzed in the frequency domain, the contents of an image input from an image input unit are analyzed by a display-contents analyzing apparatus, and both results are integrated and processed by an integration analyzing unit, so that highly reliable data on the mental state of a test subject and an objective estimation of the image can be obtained.
The invention described in related-art document Japanese Unexamined Patent Application Publication No. 2001-126070 includes an attention region extraction apparatus, an image generation apparatus, and a figure-composition cutting apparatus. The image generation apparatus generates a current image, that is, a panorama image, from a picture taken by a camera. The attention region extraction apparatus extracts an attention region from an original image supplied from the image generation apparatus. Estimation is made according to human subjectivity based on physical features of the original image, and the attention region is extracted as a result of the estimation. In the figure-composition cutting apparatus, the extracted attention region and its adjacent image regions are cut out from the original image with reference to data on paintings drawn by painters or pictures taken by photographers, which is stored in a memory. As a result, an attention region having the same picture composition as that of the paintings and pictures can be cut out.
In Tanaka, the concept of attractiveness introduced in the invention described in related-art document Japanese Unexamined Patent Application Publication No. 2001-126070, along with a detailed method of calculating it, is described in detail.
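The general idea behind such contrast-based attractiveness measures can be sketched in simplified form. The following is an illustrative toy example only, not the actual model of Tanaka or of the above publication: it scores a region's conspicuousness as the distance between the mean color of the region and the mean color of its surround, which is one minimal instance of using color contrast of image regions as a feature amount. All function and variable names here are hypothetical.

```python
import numpy as np

def color_contrast_attractiveness(image, region_mask):
    """Toy attractiveness score: Euclidean distance between the mean
    color of a region and the mean color of the rest of the image.
    A higher score means the region stands out more from its surround.
    (Illustrative simplification; not the published model.)"""
    region = image[region_mask]       # pixels inside the region, shape (N, 3)
    surround = image[~region_mask]    # all remaining pixels
    return float(np.linalg.norm(region.mean(axis=0) - surround.mean(axis=0)))

# Synthetic example: a bright red patch on a uniform gray background
img = np.full((8, 8, 3), 128.0)
mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 2:5] = True
img[mask] = [255.0, 0.0, 0.0]

score = color_contrast_attractiveness(img, mask)
```

A region identical in color to its surround would score zero here, while the red patch scores highly; a full model would additionally account for texture contrast and other physical features, as the cited work does.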