In a spark-ignition internal combustion engine, the ignition timing affects fuel consumption and engine output, and inappropriate ignition timing leads to knocking, ignition failures, and the like.
Normally, the ignition timing is set in such a manner that the in-cylinder pressure reaches a maximum at 10–15 degrees After Top Dead Center (deg. ATDC). However, the Minimum Spark Advance for Best Torque (hereinafter abbreviated to “MBT”), which is used as a basis for setting the ignition timing, varies according to the engine rotation speed, the engine load, the air/fuel ratio of the mixture supplied to the engine, the Exhaust Gas Recirculation (EGR) ratio, and other factors.
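The peak-pressure criterion above can be sketched as follows. This is a minimal illustrative check, not part of any particular control unit: the function names, the sampled data, and the exact 10–15 deg. ATDC window bounds are assumptions for demonstration.

```python
def peak_pressure_angle(angles_atdc, pressures):
    # Return the crank angle (deg. ATDC) at which the sampled
    # in-cylinder pressure is highest.
    p_max, a_max = max(zip(pressures, angles_atdc))
    return a_max

def timing_near_mbt(angles_atdc, pressures, lo=10.0, hi=15.0):
    # True if the pressure peak falls inside the commonly used
    # 10-15 deg. ATDC window (window bounds are illustrative).
    return lo <= peak_pressure_angle(angles_atdc, pressures) <= hi
```

In practice the pressure trace would come from an in-cylinder pressure sensor sampled against crank angle; here any paired lists of angles and pressures suffice.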
Therefore, in conventional ignition timing control, the ignition timing is determined in accordance with the operating conditions by using a basic ignition timing map indexed by the engine rotation speed and engine load, together with maps of correction values devised for various operational states. In this method, improving the accuracy of the control requires increasing both the grid density of each map and the number of maps. As a result, an enormous amount of preliminary experimentation is required in order to create the maps.
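The map-based lookup described above can be sketched as a bilinear interpolation over a speed/load grid. All axis breakpoints and map values below are hypothetical placeholders, not calibration data from any real engine:

```python
RPM_AXIS = [1000, 2000, 3000, 4000]   # engine speed breakpoints (rpm)
LOAD_AXIS = [20, 40, 60, 80]          # engine load breakpoints (%)

# Base spark advance in deg. BTDC (illustrative numbers only).
BASE_MAP = [
    [10, 14, 18, 22],
    [12, 16, 20, 24],
    [14, 18, 22, 26],
    [16, 20, 24, 28],
]

def _interp1(axis, x):
    # Return (lower index, fraction) for clamped 1-D linear interpolation.
    if x <= axis[0]:
        return 0, 0.0
    if x >= axis[-1]:
        return len(axis) - 2, 1.0
    i = 0
    while axis[i + 1] < x:
        i += 1
    return i, (x - axis[i]) / (axis[i + 1] - axis[i])

def base_timing(rpm, load):
    # Bilinear interpolation of the base ignition timing map.
    i, fr = _interp1(RPM_AXIS, rpm)
    j, fl = _interp1(LOAD_AXIS, load)
    top = BASE_MAP[i][j] * (1 - fl) + BASE_MAP[i][j + 1] * fl
    bot = BASE_MAP[i + 1][j] * (1 - fl) + BASE_MAP[i + 1][j + 1] * fl
    return top * (1 - fr) + bot * fr
```

A production controller would further apply additive correction maps (coolant temperature, intake air temperature, EGR ratio, and so on) to the base value; each such map requires its own calibration experiments, which is the cost the passage above points out.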
On the other hand, a method is also known in which knocking is detected by a knock sensor and, if knocking is detected, the ignition timing is retarded by means of feedback control. However, since the ignition timing is corrected only after knocking has actually occurred, this feedback control inherently involves a delay.
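One common shape for such knock feedback is a retard-and-recover loop: retard sharply when knock is detected, then advance back slowly when it is not. The sketch below illustrates this under assumed step sizes and limits; the parameter values are arbitrary and the function is not drawn from any specific controller:

```python
def knock_feedback(base_advance, knock_detected, retard,
                   retard_step=2.0, recover_step=0.25, max_retard=10.0):
    # One control cycle: increase the retard amount on a knock event,
    # otherwise recover slowly toward zero retard.
    if knock_detected:
        retard = min(retard + retard_step, max_retard)
    else:
        retard = max(retard - recover_step, 0.0)
    # Commanded spark advance is the base value minus the current retard.
    return base_advance - retard, retard
```

Because the retard is applied only in the cycle after a knock event is sensed, at least one knocking cycle occurs before the correction takes effect, which is the control delay the passage above refers to.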