POWER FACTOR is the ratio of the useful (true) power (kW) to the total (apparent) power (kVA) consumed by an item of a.c. electrical equipment or by a complete electrical installation.
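The ratio above can be sketched in a few lines of Python; the 80 kW / 100 kVA figures are illustrative assumptions, not values from the article:

```python
# Minimal sketch: power factor as the ratio of true power (kW)
# to apparent power (kVA).

def power_factor(true_power_kw: float, apparent_power_kva: float) -> float:
    """Return the power factor of a load or installation."""
    return true_power_kw / apparent_power_kva

# A load drawing 80 kW of true power while taking 100 kVA of apparent power:
pf = power_factor(80.0, 100.0)
print(pf)  # 0.8
```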

POWER FACTOR is a measure of how efficiently electrical power is converted into useful work output. The ideal power factor is unity, or one. Anything less than one means that extra power is required to achieve the actual task at hand.
All current flow causes losses, both in the supply and in the distribution system. A load with a power factor of 1.0 results in the most efficient loading of the supply. A load with a power factor of, say, 0.8 results in much higher losses in the supply system and a higher bill for the consumer. A comparatively small improvement in power factor can bring about a significant reduction in losses, since losses are proportional to the square of the current.
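The square-law point above can be made concrete. For the same true power at the same voltage, the current drawn scales as 1/pf, so resistive supply losses (I²R) scale as 1/pf². This is a hedged sketch with illustrative figures, not a billing calculation:

```python
# Supply losses relative to a reference power factor, for the same true
# power at the same voltage: current ~ 1/pf, so I^2*R losses ~ (1/pf)^2.

def relative_losses(pf: float, reference_pf: float = 1.0) -> float:
    """Losses at power factor `pf`, relative to losses at `reference_pf`."""
    return (reference_pf / pf) ** 2

print(relative_losses(0.8))  # 1.5625 -> about 56% more loss than at unity
print(relative_losses(0.8, reference_pf=0.9))  # correcting 0.8 -> 0.9 cuts losses
```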
When the power factor is less than one, the ‘missing’ power is known as reactive power, which, unfortunately, is necessary to provide the magnetising field required by motors and other inductive loads to perform their desired functions. Reactive power is also described as wattless, magnetising or wasted power, and it represents an extra burden on the electricity supply system and on the consumer’s bill.
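The relationship between true, reactive and apparent power is the familiar power triangle: apparent power (kVA) is the vector sum of true power (kW) and reactive power (kVAr), so kVAr = √(kVA² − kW²). A minimal sketch, with illustrative figures:

```python
import math

# Power triangle: kVA^2 = kW^2 + kVAr^2, so the reactive component is
# kVAr = sqrt(kVA^2 - kW^2).

def reactive_power_kvar(apparent_kva: float, true_kw: float) -> float:
    """Return the reactive power carried alongside the true power."""
    return math.sqrt(apparent_kva**2 - true_kw**2)

# 100 kVA supplying 80 kW (power factor 0.8) carries 60 kVAr of reactive power:
print(reactive_power_kvar(100.0, 80.0))  # 60.0
```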
A poor power factor is usually the result of a significant phase difference between the voltage and current at the load terminals, or it can be due to a high harmonic content or a distorted current waveform. A poor power factor is generally the result of an inductive load such as an induction motor, a power transformer, a ballast in a luminaire, a welding set or an induction furnace. A distorted current waveform can be the result of a rectifier, an inverter, a variable speed drive, a switched mode power supply, discharge lighting or other electronic loads.
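The two causes named above can be separated numerically. For sinusoidal quantities the (displacement) power factor is cos φ, where φ is the phase angle between voltage and current; a distorted current waveform reduces the true power factor further by a distortion factor (fundamental r.m.s. current over total r.m.s. current). The function names and figures below are illustrative assumptions:

```python
import math

# Displacement power factor: cos(phi) for a purely sinusoidal load.
def displacement_pf(phase_angle_deg: float) -> float:
    return math.cos(math.radians(phase_angle_deg))

# True power factor when the current waveform is also distorted:
# true pf = displacement factor * distortion factor (I1_rms / I_rms).
def true_pf(phase_angle_deg: float, distortion_factor: float) -> float:
    return displacement_pf(phase_angle_deg) * distortion_factor

# An induction motor with current lagging voltage by 36.87 degrees:
print(round(displacement_pf(36.87), 2))  # 0.8

# The same phase shift plus a distorted waveform (distortion factor 0.9):
print(round(true_pf(36.87, 0.9), 2))  # 0.72
```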