The present invention relates to a method for shortening delay time of a synthesized logic circuit and also for reducing the number of gates employed therein, in a logic synthesis system in which a logic design of a digital logic system is performed by a computer.
In the conventional method for shortening the delay times of a logic circuit represented by Boolean expressions indicative of its functions, the logic circuit to be processed is subdivided into a plurality of portions or sections, slacks in the delay times are calculated for the subdivided sections, and the delay time is shortened by restructuring the logic circuit starting from a region having a small slack. This conventional method is described in detail in, for example, "Timing Optimization of Combinational Logic" (ICCAD 88, 1988, pages 282 to 285).
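The slack computation referred to above can be illustrated with a short sketch. This is not the method of the cited reference or of the present invention, but a minimal illustration of the underlying notion: modeling the circuit as a directed acyclic graph, propagating signal arrival times forward and required times backward, and taking the slack of each gate as the difference. All gate names, delay values, and the required output time below are hypothetical.

```python
# Illustrative sketch only: per-gate slack in a combinational circuit
# modeled as a DAG. Gate delays and the required time are hypothetical.
from collections import defaultdict

def compute_slacks(gates, edges, required_time):
    """gates: {name: delay}; edges: list of (src, dst) pairs.
    Gates with no incoming edge are driven by primary inputs;
    gates with no outgoing edge drive primary outputs."""
    preds, succs = defaultdict(list), defaultdict(list)
    for s, d in edges:
        preds[d].append(s)
        succs[s].append(d)

    # Topological order via Kahn's algorithm.
    indeg = {g: len(preds[g]) for g in gates}
    order, frontier = [], [g for g in gates if indeg[g] == 0]
    while frontier:
        g = frontier.pop()
        order.append(g)
        for n in succs[g]:
            indeg[n] -= 1
            if indeg[n] == 0:
                frontier.append(n)

    # Arrival time: latest time a signal appears at the gate output.
    arrival = {}
    for g in order:
        arrival[g] = gates[g] + max((arrival[p] for p in preds[g]), default=0)

    # Required time: latest permissible arrival, propagated backwards
    # from the required time at the primary outputs.
    required = {}
    for g in reversed(order):
        required[g] = min((required[s] - gates[s] for s in succs[g]),
                          default=required_time)

    # Slack: timing margin; a small slack marks a critical region.
    return {g: required[g] - arrival[g] for g in gates}
```

A region of the circuit whose gates exhibit small slack would, in the conventional method, be selected first for restructuring.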
In the conventional method, since the delay times are improved sequentially for each subdivided portion of the logic circuit to be processed, the achievable distribution of the delay times is restricted as compared with the case in which the overall logic circuit is handled as a single logic circuit when improving the delay times. As a consequence, a logic circuit containing redundant logic may be synthesized. On the other hand, there has been no proposal in which the overall logic circuit is handled as a single logic circuit to be processed and logic optimization is carried out to reduce both the delay time and the number of gates simultaneously.