Computer systems typically comprise a combination of hardware (such as semiconductors, transistors, chips, and circuit boards) and computer programs that the hardware stores and executes. Computer systems provide electrical power to the hardware via power supplies, which are often implemented as AC (Alternating Current) to DC (Direct Current) converters.
Computer systems typically have several power supplies connected in parallel or in series to provide capacity and redundancy. Together, the power supplies must deliver at least the power the computer consumes at its maximum workload, which draws more power than a lighter workload. Additionally, for fault tolerance and high availability, computer systems often include additional or redundant power supplies, so that if one or more power supplies fail, sufficient power capacity remains to meet the needs of the computer system. Some computer systems have full redundancy, with one set of power converters connected to one AC source and another set of converters connected to a separate AC source, so that a failure in one of the AC systems does not compromise operation of the computer system.
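The capacity-plus-redundancy sizing described above can be sketched as a short calculation. This is a minimal illustration, not a method from the text; the wattage figures and the N+1 redundancy policy are hypothetical assumptions chosen only to show the arithmetic.

```python
import math

def supplies_needed(max_load_w: float, supply_capacity_w: float,
                    redundant: int = 1) -> int:
    """Number of supplies required to carry the maximum workload,
    plus `redundant` spares for fault tolerance (N+redundant sizing)."""
    return math.ceil(max_load_w / supply_capacity_w) + redundant

# Hypothetical example: a 2000 W maximum workload served by 800 W
# supplies needs 3 supplies to carry the load, plus 1 spare.
print(supplies_needed(2000, 800))  # 4
```

A fully redundant system of the kind described would instead duplicate the entire non-redundant set on each AC source, so either source alone can carry the maximum workload.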
The power supplies have an efficiency rating, which describes how much input power a power supply needs to deliver a specific output power. For example, a power supply with an 80% efficiency rating at 100 W (Watts) of output power requires 100 W / 0.80 = 125 W of input power to attain that output. Efficiency ratings are typically neither linear nor uniform. Instead, efficiency typically follows a curve that rises from low output power, peaks at a high output power, and then declines as output approaches the maximum rated output.
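The input-power calculation and the non-linear efficiency curve can be sketched as follows. The 80%-at-100 W figure matches the example above; the other curve points are hypothetical, chosen only to illustrate the rise-peak-decline shape.

```python
def input_power(output_w: float, efficiency: float) -> float:
    """Input power required to deliver output_w at the given efficiency
    (efficiency expressed as a fraction, e.g. 0.80 for 80%)."""
    return output_w / efficiency

# The example from the text: 100 W out at 80% efficiency needs 125 W in.
print(input_power(100, 0.80))  # 125.0

# Illustrative (hypothetical) efficiency curve, output watts -> efficiency:
# efficiency rises from low output, peaks, then declines toward maximum output.
curve = {50: 0.70, 100: 0.80, 200: 0.88, 300: 0.85, 400: 0.80}
for out_w, eff in curve.items():
    print(f"{out_w:4d} W out at {eff:.0%} efficiency -> "
          f"{input_power(out_w, eff):.1f} W in")
```

Because the curve is not uniform, the same power supply wastes proportionally more input power when operated well below or near its maximum rated output than near its peak-efficiency point.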