In many applications a voltage regulator is required to provide a voltage within a predetermined range. Some circuits may function erratically or undesirably, or even suffer irreparable damage, if an input power supply falls outside a certain range.
Many power supplies include power sequencing circuits to control the initial stages of a power supply during turn-on. Power sequencers help control high inrush current and limit power converter turn-on noise. Conventional power sequencing circuits provide a converter enable signal for enabling power converter circuits included within the power supply. A disadvantage of conventional power sequencers is that the charging control signal is either fully on or fully off, so turning on does not limit the inrush of current experienced by power supply input circuits. The result is that the input current inrush is delayed, but its peak amplitude and spikes are not otherwise limited.
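The point above can be sketched numerically: when a discharged filter capacitor is switched directly onto the input bus by an on/off enable signal, the peak current is set by the input voltage divided by the series resistance in the loop, regardless of how long the enable was delayed. The component values below are illustrative assumptions, not taken from the text.

```python
# Illustrative sketch: an on/off enable only shifts the inrush spike in
# time; it does not reduce the peak amplitude. Values are hypothetical.
V_IN = 48.0      # assumed DC input voltage, volts
R_SERIES = 0.1   # assumed total series (wiring + ESR) resistance, ohms

def peak_inrush(v_in: float, r_series: float, delay_s: float = 0.0) -> float:
    """Peak current into a fully discharged capacitor at switch-on.

    The delay_s argument models a time-based sequencer delay: it changes
    *when* the spike occurs, but the peak v_in / r_series is unchanged.
    """
    return v_in / r_series

no_delay = peak_inrush(V_IN, R_SERIES, delay_s=0.0)
with_delay = peak_inrush(V_IN, R_SERIES, delay_s=0.05)
print(no_delay, with_delay)  # identical peaks: the delay limits nothing
```

A current-limiting sequencer would instead ramp the charging path impedance or the applied voltage, reducing the peak rather than merely postponing it.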
Conventional power sequencers for DC input power supplies do not provide a way to delay the power converter turn-on until the charging of the upstream filter capacitor(s) is actually complete. This is because the conventional sequencers provide only a time-based delay function that does not account for the effect of different input voltages on the charging time of the upstream filter capacitor(s).
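The dependence of charging time on input voltage can be made concrete with the standard RC charging relation t = -RC·ln(1 - V_th/V_in): the same capacitor reaches the same "charged" threshold much faster at a high input voltage than at a low one, so a single fixed delay cannot be correct across the input range. The component values and threshold below are hypothetical, chosen only to illustrate the effect.

```python
import math

# Hypothetical upstream filter values; not from the text.
R = 10.0       # assumed charging resistance, ohms
C = 470e-6     # assumed filter capacitance, farads
V_TH = 20.0    # assumed "charging complete" threshold, volts

def charge_time(v_in: float, v_th: float = V_TH, r: float = R, c: float = C) -> float:
    """Time for an RC-charged capacitor to rise from 0 V to v_th:
    t = -R*C*ln(1 - v_th/v_in)."""
    if v_in <= v_th:
        return float("inf")  # the threshold is never reached
    return -r * c * math.log(1.0 - v_th / v_in)

for v_in in (24.0, 36.0, 48.0):
    print(f"V_in = {v_in:4.0f} V -> charge time = {charge_time(v_in) * 1e3:.2f} ms")
```

With these assumed values the charge time at 24 V input is more than three times the charge time at 48 V, so a time delay sized for the high end of the input range would release the converter before charging is complete at the low end.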
Conventional power sequencers also do not limit inrush current over a wide range of input voltages, and function optimally only within a narrow range of input voltages. In addition, conventional sequencers typically use costly power sequencing integrated circuits (ICs). IC sequencers are difficult to debug and require trial-and-error work in the lab to find an optimal solution.
Accordingly, it is desirable to have a simple discrete analog control circuit that would provide a power sequencing solution that is more economical, more efficient, and better suited for limiting inrush current to DC input power converters.