Power Supply Design Basics: The Power Factor

By Nuvation | Jul 16, 2013

Most products include a power supply that converts AC mains power to the operating voltages the electronics require. Power supply design is a crucial step in architecture development, as it can make or break a product. Modern power supplies must meet stringent requirements for safety, EMI compliance, and efficiency.

One common requirement is that the power supply present a “real” load to the AC mains. For example, PG&E industrial customers with a peak demand over 400 kW are penalized if their power factor falls below 85%.

Power Factor

Power factor is the ratio between the real power (watts) used by an electrical load and the volt-amperes (VA) it draws from the AC line. The VA product is called the apparent power. The AC generator supplying the wall must deliver the full apparent power drawn, whether or not that power does useful work.
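As a rough numerical illustration (the figures below are assumed for the example, not measured values), the power factor can be computed directly from the real power and the RMS line voltage and current:

```python
# Illustrative only: compute power factor from measured quantities.
real_power_w = 850.0      # real (working) power drawn by the load, in watts
v_rms = 120.0             # RMS line voltage, in volts
i_rms = 8.5               # RMS line current, in amps

apparent_power_va = v_rms * i_rms          # VA the generator must supply
power_factor = real_power_w / apparent_power_va

print(f"Apparent power: {apparent_power_va:.0f} VA")
print(f"Power factor:   {power_factor:.2%}")   # ~83% in this example
```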

For the load to use all of the energy provided, the voltage across the load and the current through it must be in phase. An AC voltage source connected to a resistor has its voltage and current in phase, i.e. the current in the resistor rises and falls in step with the voltage.

For a DC circuit, or an AC source driving a pure resistance, the power delivered is simply P = V·I. For an AC circuit driving a general load, we use the RMS values of voltage and current to calculate the “real” (DC-equivalent) power delivered to the load, and the expression becomes P = V·I·cos(φ), where V and I are the RMS values and φ is the phase angle between them. cos(φ) is the power factor and is expressed either as a number between 0 and 1 or as a percentage from 0 to 100%. If φ = 0, then P = V·I, the same as for a resistor or a DC load. If φ = 90°, then the power delivered to the load is P = 0, even though there is voltage across the load and current through it. The generator must still supply V·I of apparent power even though none of it is doing anything useful.
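The relationship can be checked numerically: averaging the instantaneous product v(t)·i(t) over whole cycles recovers V·I·cos(φ). A minimal sketch, with assumed RMS values and a 60° phase angle:

```python
# Sketch: averaging instantaneous power v(t)*i(t) over one full cycle
# recovers P = Vrms * Irms * cos(phi). Values here are illustrative.
import numpy as np

f = 60.0                      # line frequency, Hz
v_rms, i_rms = 120.0, 5.0     # assumed RMS voltage and current
phi = np.deg2rad(60)          # phase angle between voltage and current

t = np.linspace(0, 1 / f, 10_000, endpoint=False)   # one full cycle
v = np.sqrt(2) * v_rms * np.sin(2 * np.pi * f * t)
i = np.sqrt(2) * i_rms * np.sin(2 * np.pi * f * t - phi)

p_avg = np.mean(v * i)                     # real power actually delivered
print(p_avg)                               # ~300 W
print(v_rms * i_rms * np.cos(phi))         # 120 * 5 * 0.5 = 300 W
```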

Unfortunately, the input network of an ordinary power supply without power factor correction is a diode bridge driving a large capacitor, and as a result it presents a non-real load to the generator. Depending on the circuit components, the power factor will be between 55% and 75%, which means the generator must supply roughly 1.3 to 1.8 times the apparent power actually converted to useful work, i.e. about 33% to 82% more. The difference either appears as heat in the distribution system or is reflected back onto the AC mains.
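To make that arithmetic concrete, here is a small sketch comparing the apparent power a source must supply at those power factors against a unity-power-factor load; the 500 W load value is assumed for illustration:

```python
# Sketch: extra apparent power (VA) the source must supply for a given
# power factor, relative to a unity-power-factor load. Illustrative values.
real_power_w = 500.0                      # real power drawn by the supply (assumed)

for pf in (1.00, 0.75, 0.55):
    apparent_va = real_power_w / pf       # VA = W / power factor
    extra = apparent_va / real_power_w - 1.0
    print(f"PF {pf:.2f}: {apparent_va:6.0f} VA ({extra:4.0%} more than needed)")
```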

Power Factor Controller

The most common means of correcting this condition is to employ a power factor controller.  In the case of a power supply, the poor power factor is caused not by a reactive (i.e. capacitive or inductive) load but by a highly non-linear one.  A purely reactive load can be corrected by adding either an inductor, in the case of a capacitive load, or a capacitor, in the case of an inductive load.  In fact, this is what is done when the load is an electric motor, which looks like an inductor (see the sketch below).  When the load is non-linear, as in the case of a power supply, the solution is not quite so straightforward.
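For the motor case above, the standard sizing calculation picks a capacitor whose reactive power cancels most of the motor's lagging VARs. A minimal sketch, with assumed load and line values:

```python
# Sketch: sizing a power-factor-correction capacitor for an inductive (motor) load.
# Standard textbook calculation; the load values are assumed for illustration.
import math

f = 60.0                 # line frequency, Hz
v_rms = 240.0            # RMS line voltage, V
p_w = 2000.0             # real power drawn by the motor, W
pf_initial = 0.70        # uncorrected power factor
pf_target = 0.95         # desired power factor

phi1 = math.acos(pf_initial)
phi2 = math.acos(pf_target)

# Reactive power the capacitor must supply to cancel part of the motor's VARs
q_c = p_w * (math.tan(phi1) - math.tan(phi2))        # in VAR
c = q_c / (2 * math.pi * f * v_rms ** 2)             # capacitance in farads

print(f"Correction: {q_c:.0f} VAR -> {c * 1e6:.0f} uF across the line")
```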

The non-linear load, in addition to reducing the power factor, presents a considerable amount of “noise” to the AC mains in the form of harmonics of the line frequency.  While a purely reactive load can be compensated for simply, correcting the poor power factor caused by a distorted current waveform requires a change in equipment design or expensive harmonic filters to gain an appreciable improvement.
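One way to see why the distorted waveform is harder to fix: the overall power factor is the product of the displacement factor, cos(φ) of the fundamental, and a distortion factor that falls as harmonic content rises. A sketch with an assumed 100% current THD, typical of a bridge-rectifier/capacitor input:

```python
# Sketch: how current distortion drags down power factor even when the
# fundamental current is nearly in phase with the voltage. THD value assumed.
import math

displacement_pf = math.cos(math.radians(10))     # fundamental nearly in phase
thd = 1.0                                        # 100% current THD (assumed)

distortion_factor = 1 / math.sqrt(1 + thd ** 2)  # I_fundamental / I_rms
total_pf = displacement_pf * distortion_factor

print(f"Distortion factor:  {distortion_factor:.2f}")
print(f"Total power factor: {total_pf:.2f}")     # ~0.70 despite the small phase shift
```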

Keep reading to learn how to implement passive Power Factor Correction (PFC) and when to use active PFC.


References: Pacific Gas and Electric Company: Economics of Power Factor Correction in Large Facilities