A power voltage detector is utilized in an integrated circuit ("IC") for detecting when a supply voltage reaches or exceeds a trip-point voltage. If the supply voltage quickly ramps up and/or down, the integrated circuit can be significantly damaged by the varying supply voltage. For instance, the gate oxide of a metal-oxide-semiconductor field-effect transistor ("MOSFET") can be broken down if the supply voltage ramps up and down. Moreover, once the supply voltage is turned on, it requires a certain amount of time to become stable. Another concern is high-amplitude glitches in the supply voltage, which can damage transistors of the IC. Even very small parametric variations in the IC can cause transistors of the IC to not work properly and to fail to meet expected specifications. Therefore, it is important that the supply voltage be sensed, so that it is not applied to components of the IC until it is stable and within specification.
A conventional voltage detector typically uses a comparator to compare a supply voltage with a reference voltage to determine whether the supply voltage has reached or exceeded the reference voltage. Subsequent action can then be taken based on this result. Thus, the comparator acts as a voltage sensor. The drawbacks of this approach are that a separate comparator is needed for each supply voltage (which consumes chip area on the IC) and that a reference voltage must be generated in order for the comparators to operate.
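For illustration only, the conventional comparator-based detection described above can be sketched as a simple behavioral model in Python. The function name, trip-point value, and ramp sequence below are hypothetical examples, not part of any disclosed circuit:

```python
def comparator_trip(supply_v: float, reference_v: float) -> bool:
    """Behavioral model of a conventional comparator acting as a
    voltage sensor: returns True once the supply voltage reaches
    or exceeds the reference (trip-point) voltage."""
    return supply_v >= reference_v

# A supply ramping up from 0 V; the detector output asserts only
# after the ramp crosses the hypothetical 1.8 V trip point.
ramp = [0.0, 0.6, 1.2, 1.7, 1.8, 2.0]
flags = [comparator_trip(v, 1.8) for v in ramp]
```

Note that this model assumes an ideal comparator; a physical comparator also exhibits input offset, propagation delay, and possibly hysteresis, and, as noted above, one such comparator plus a generated reference voltage would be required per supply rail.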
Therefore, it is desirable to provide new methods, systems, and apparatuses for power voltage detection that do not rely on the use of traditional comparators and a reference voltage.