A DC-to-DC buck converter converts an input voltage from a high-voltage power supply to a lower voltage supplied to the circuit or apparatus connected to the output node of the converter. In a converter, a controller typically controls a driver, which switches the high-side and low-side transistors to generate the desired output voltage and current, and the controller and driver are generally integrated in a single chip. However, the efficiency of the converter improves when the driver is closer to the load connected to the converter, while the controller suffers more noise interference when it is closer to the load.
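The step-down behavior described above follows the well-known ideal buck relation Vout = D × Vin, where D is the switching duty cycle commanded by the controller. A minimal sketch (the function name and parameters are illustrative, not from the disclosure; losses and transients are ignored):

```python
def buck_output_voltage(v_in: float, duty_cycle: float) -> float:
    """Ideal steady-state buck converter output (lossless,
    continuous-conduction assumption): Vout = D * Vin.

    The controller sets duty_cycle; the driver realizes it by
    switching the high-side and low-side transistors.
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must lie between 0 and 1")
    return v_in * duty_cycle


# Example: stepping a 9 V input down to about 3.3 V requires D ≈ 0.367.
print(buck_output_voltage(9.0, 3.3 / 9.0))
```

This idealized relation is only meant to show why the controller's switching commands determine the output voltage; a real converter also depends on inductor, capacitor, and load characteristics.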
To solve the above problem, another converter that separates the controller and driver into different chips has been proposed. Although this solves the original problem, separating the controller and driver introduces a new one. Because the driver operates only under high voltage, the controller provides no control signal to the driver before the input voltage of the converter reaches the predetermined high voltage, and as a result, the switching transistors of the converter cannot be controlled during this period. For example, as shown in FIG. 1, at power-on the input voltage Vin of the converter rises from 0 V to 9 V, and the controller and driver do not operate during this period. If the high-side circuit of the converter is shorted during this period, the input voltage Vin is connected directly to the output node of the converter and thus damages the load circuit or apparatus connected to the output node. In other words, the conventional converter lacks high-side short-circuit protection during this abnormal operation period.
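The unprotected start-up window can be sketched as a toy model (the function name, parameters, and the 9 V threshold taken from the FIG. 1 example are illustrative assumptions, not a circuit simulation):

```python
def output_during_startup(v_in: float, v_start: float,
                          high_side_shorted: bool) -> float:
    """Output-node voltage while Vin is still below the predetermined
    start-up threshold v_start, i.e. before the controller can drive
    the switching transistors.

    With no control signals, both switches should be off and the
    output should sit at 0 V; a shorted high-side path, however,
    couples the rising Vin directly to the output node.
    """
    if v_in >= v_start:
        raise ValueError("controller active; normal regulation applies")
    return v_in if high_side_shorted else 0.0


# Sweep the power-on ramp toward the assumed 9 V threshold: with a
# high-side short, the load sees the full (rising) input voltage.
for v in (0.0, 3.0, 6.0, 8.9):
    print(v, output_during_startup(v, v_start=9.0, high_side_shorted=True))
```

The model only illustrates the hazard: during the ramp, a high-side short exposes the load to whatever Vin has reached, which is exactly the condition the desired protection must detect.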
Therefore, a driver with high-side short-circuit protection for a voltage converter is desired.