The present invention relates to data playback equipment, such as an optical disk drive or a hard disk drive, for playing back data recorded on a disk medium.
The error rate of played-back data is conventionally used as a yardstick for the quality of a signal played back by data playback equipment, and reducing the error rate is important. However, when error correction technology such as an error correcting code (ECC) is used, no error appears at all as long as the number of errors stays within the correction limit, whereas system breakdown occurs once the limit is exceeded even slightly.
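This all-or-nothing behavior of ECC can be illustrated with a minimal sketch using a Hamming(7,4) code, which corrects exactly one flipped bit per codeword. (Disk systems use far stronger codes such as Reed-Solomon; the code here is only an illustrative stand-in for the threshold effect.)

```python
# Illustrative sketch: Hamming(7,4) single-error-correcting code.
# One bit error per codeword is fully corrected; two bit errors
# exceed the correction limit and decoding silently fails.

def encode(d):
    # Codeword layout (1-based positions): p1 p2 d1 p3 d2 d3 d4
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # The syndrome gives the 1-based position of a single error (0 = none).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if pos:
        c[pos - 1] ^= 1  # correct the indicated bit
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
cw = encode(data)

# One flipped bit: within the limit, the data is recovered exactly.
e1 = cw[:]
e1[3] ^= 1
assert decode(e1) == data

# Two flipped bits: beyond the limit, the decoder miscorrects.
e2 = cw[:]
e2[1] ^= 1
e2[5] ^= 1
assert decode(e2) != data
```

The same step-function behavior is why the raw error rate alone gives no warning of how close the system is to breakdown.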
In data playback equipment, it is therefore very difficult to execute learning for system optimization using the error rate of played-back data as a parameter. Conventionally, to secure a margin for the error rate, the jitter amount is instead detected from the analog signal obtained from the disk medium, and learning is executed using the jitter amount as the parameter (see Japanese Laid-Open Patent Publications No. 8-45081, No. 2000-173060 and No. 2001-23167).
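As a minimal sketch of what such a jitter measurement computes: one common definition of the jitter amount is the standard deviation of the played-back signal's zero-crossing times relative to the channel-clock edges, expressed as a fraction of the clock period. The signal model below (ideal clock edges perturbed by Gaussian timing noise) is an illustrative assumption, not the method of the cited publications.

```python
import math
import random

def jitter_fraction(crossing_times, period):
    # Deviation of each zero crossing from the nearest channel-clock edge.
    devs = []
    for t in crossing_times:
        nearest_edge = round(t / period) * period
        devs.append(t - nearest_edge)
    # Standard deviation of the deviations, normalized by the clock period.
    mean = sum(devs) / len(devs)
    var = sum((d - mean) ** 2 for d in devs) / len(devs)
    return math.sqrt(var) / period

# Simulated crossings: clock edges at k*T plus Gaussian timing noise
# with a standard deviation of 5% of the period (assumed noise model).
random.seed(0)
T = 1.0
crossings = [k * T + random.gauss(0.0, 0.05 * T) for k in range(10000)]
print(f"jitter = {jitter_fraction(crossings, T):.3f}")  # close to 0.050
```

Because this statistic varies smoothly as playback conditions change, it can serve as a learning parameter where the step-like error rate cannot.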
The jitter amount is a very useful parameter for determining the margin of the error rate. However, with the introduction of technologies such as partial response maximum likelihood (PRML) and adaptive equalization brought by recent advances in digital technology, the jitter amount is no longer a parameter that always correlates with the error rate. In this situation, a new parameter is required to replace the jitter amount used in the conventional analog-related techniques.