Approximately 10% of the energy produced by generating stations is lost during transmission and distribution to customers [53]. An estimated 60% of this loss occurs on distribution networks [54], [13]. As the demand for electric energy increases, new power stations need to be built and power delivery systems need to be expanded and/or strengthened to meet growing demand. Peak demand in a system usually lasts for less than 5% of the time [53]. This means that some power plants may be needed only during peak load hours, so their productive capacity is utilised only occasionally. Similarly, the power delivery system is stressed only during peak periods, in some instances beyond the technical operating specifications of equipment. Through active demand control and technical loss minimisation strategies on distribution systems, the generation requirements for the grid can be reduced. Further, by freeing up capacity on the delivery system, additional load can be connected to the grid at reduced or no additional capital strengthening cost. This implies reductions in carbon emissions and the associated environmental impact, as well as reductions in the cost of electricity to end-use customers [53].
The purpose of Volt/VAr control is to improve the efficiency of the power delivery system. This essentially involves the control of network equipment such as transformer on-load tap changers (OLTCs), capacitor banks and voltage regulators, as shown in Figure 4-1 [55], in order to control the reactive power flow and manipulate the voltage regulation of feeders or subsystems, resulting in technical loss minimisation and the ability to actively control demand.
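As a purely illustrative sketch of such coordinated device control (not a description of any utility's actual scheme), the Python fragment below applies simple rules to a substation OLTC and a switched capacitor bank on one feeder; the device interface, measurement names and threshold values are assumptions made for the example.

```python
# Illustrative rule-based Volt/VAr coordination for one feeder: the OLTC biases
# the voltage down for CVR while a switched shunt capacitor bank limits the
# reactive power drawn from the source. All interfaces and thresholds are
# hypothetical placeholders for SCADA/DMS control points.

V_MIN = 0.95        # statutory lower voltage limit at the feeder tail (p.u.)
DEADBAND = 0.02     # hysteresis to prevent tap/capacitor hunting
Q_HIGH = 0.10       # reactive draw (p.u.) above which the capacitor is closed
Q_LOW = -0.02       # reactive draw below which it is opened (leading flow)

class FeederControls:
    """Stub actuator layer; a real system would drive SCADA/DMS points."""
    def __init__(self):
        self.tap_position = 0
        self.cap_closed = False
    def tap_raise(self): self.tap_position += 1
    def tap_lower(self): self.tap_position -= 1
    def cap_on(self): self.cap_closed = True
    def cap_off(self): self.cap_closed = False

def control_step(v_tail_pu, q_source_pu, ctrl):
    # VAr control: close the capacitor bank while the source supplies a large
    # lagging reactive power; open it again if the reactive flow reverses.
    if q_source_pu > Q_HIGH and not ctrl.cap_closed:
        ctrl.cap_on()
    elif ctrl.cap_closed and q_source_pu < Q_LOW:
        ctrl.cap_off()
    # Volt control (CVR bias): hold the tail-end voltage just above the
    # statutory minimum, tapping down whenever there is headroom.
    if v_tail_pu < V_MIN:
        ctrl.tap_raise()
    elif v_tail_pu > V_MIN + DEADBAND:
        ctrl.tap_lower()

ctrl = FeederControls()
control_step(v_tail_pu=0.99, q_source_pu=0.15, ctrl=ctrl)
print(ctrl.tap_position, ctrl.cap_closed)   # -1 True: tap down, capacitor in
```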
By controlling and limiting the reactive power flow on the delivery system, losses can be reduced, thereby improving the efficiency of distribution feeders. Optimising the power factor on a network means the utility has to generate and transport less apparent power to meet the same demand. This is achieved by the strategic placement and control of shunt capacitor banks on the system, where an additional benefit is an improvement in voltage regulation.
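As a numerical illustration of this point (the feeder values below are assumed purely for the example), the following sketch sizes a shunt capacitor bank from the standard relationship Qc = P(tan φ1 − tan φ2) and estimates the accompanying reduction in series I²R losses, which scale roughly with 1/pf² at constant real power and voltage.

```python
import math

def capacitor_kvar(p_kw, pf_initial, pf_target):
    """Shunt capacitor rating needed to raise the power factor of a feeder
    section: Qc = P * (tan(phi1) - tan(phi2))."""
    phi1 = math.acos(pf_initial)
    phi2 = math.acos(pf_target)
    return p_kw * (math.tan(phi1) - math.tan(phi2))

def loss_reduction_percent(pf_initial, pf_target):
    """Approximate reduction in series (I^2 R) losses at constant real power
    and voltage: losses scale with 1/pf^2."""
    return (1.0 - (pf_initial / pf_target) ** 2) * 100.0

# Illustrative values only: a 2 MW feeder section at 0.85 power factor,
# corrected to 0.98.
p_kw = 2000.0
print(f"Required capacitor bank: {capacitor_kvar(p_kw, 0.85, 0.98):.0f} kvar")
print(f"Approximate series loss reduction: {loss_reduction_percent(0.85, 0.98):.1f} %")
```

With these assumed figures the sketch indicates a bank of roughly 830 kvar and a series loss reduction of around 25%, purely to show the order of magnitude involved.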
By improving the voltage regulation of feeders, demand can be reduced by purposefully lowering source voltages and exploiting the voltage-dependent characteristics of load, whilst maintaining statutory voltage limits. This technique is termed Conservation Voltage Reduction (CVR). A general rule of thumb is that demand reductions of between 0.7% and 1.5% are obtainable for every 1% reduction in voltage [56].
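This rule of thumb can be expressed as a CVR factor, CVRf = %ΔP / %ΔV, or equivalently through the voltage exponent of a simple exponential load model P = P0(V/V0)^np. The short sketch below, with illustrative values only, compares both views for a given voltage reduction.

```python
def demand_reduction_cvr(delta_v_percent, cvr_factor):
    """Demand reduction (%) from the linear CVR rule of thumb:
    %dP = CVRf * %dV, with CVRf typically 0.7 to 1.5 [56]."""
    return cvr_factor * delta_v_percent

def demand_reduction_exponential(v_new_pu, n_p=1.0, v0_pu=1.0):
    """Demand reduction (%) from an exponential voltage-dependent load model
    P = P0 * (V/V0)**n_p, where n_p is the real-power voltage exponent."""
    return (1.0 - (v_new_pu / v0_pu) ** n_p) * 100.0

# Example: a 3 % reduction in source voltage.
print(demand_reduction_cvr(3.0, cvr_factor=0.7))                  # ~2.1 % (conservative)
print(demand_reduction_cvr(3.0, cvr_factor=1.5))                  # ~4.5 % (optimistic)
print(round(demand_reduction_exponential(0.97, n_p=1.0), 2))      # ~3.0 %
```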
Figure 4-1: Equipment for Voltage Support and Reactive Power Control [55]
Volt/VAr optimisation has been made possible through recent improvements in sensors, communications, control algorithms, and information processing technologies that monitor voltage levels throughout the distribution system [55]. Most of the devices required for Volt/VAr optimisation (e.g. capacitors, on-load tap changers, voltage regulators) are already in place in most utilities, where they serve local purposes; however, further enhancement and optimal use of these devices requires communications and software solutions, which become the enablers for more system-wide control and optimisation.
Figure 4-2 illustrates the concept of Volt/VAr control. The baseline voltage regulation is set by the substation on-load tap changer, the line voltage regulator and the capacitor bank placed towards the end of the feeder. In order to reduce the voltage at the substation for the purposes of CVR, the line voltage regulator must ensure that the tail end of the feeder remains within the statutory voltage requirement, in this case 114 V, after the substation voltage has been reduced. The capacitor bank has a twofold benefit: it reduces the reactive power flow that contributes to the technical losses of the feeder, and it raises the voltage at its point of connection, which has a flattening effect on the regulation profile. This then allows the voltage to be reduced even further at the source for a more “aggressive” demand reduction by CVR.
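A simplified numerical sketch of this behaviour is given below; the feeder impedances, loads and capacitor size are invented values chosen only to reproduce the qualitative effect described above, using a linearised voltage-drop approximation rather than a full load-flow.

```python
# Invented per-unit values illustrating the voltage-profile behaviour of
# Figure 4-2 with a linearised voltage-drop approximation (not a load-flow).
N_SEG = 10                      # feeder modelled as equal series segments
R, X = 0.010, 0.020             # per-segment series impedance (p.u.)
P_SEG, Q_SEG = 0.040, 0.020     # load tapped off at each segment end (p.u.)
V_MIN = 0.95                    # statutory lower limit (114 V on a 120 V base)

def profile(v_source, q_cap=0.0, cap_node=N_SEG):
    """Voltage at each node using the approximation dV ~ (P*R + Q*X) / V."""
    volts = [v_source]
    for k in range(1, N_SEG + 1):
        # Flow through segment k feeds all downstream loads; the capacitor
        # injection offsets the reactive flow on every segment upstream of it.
        p_flow = P_SEG * (N_SEG - k + 1)
        q_flow = Q_SEG * (N_SEG - k + 1) - (q_cap if k <= cap_node else 0.0)
        volts.append(volts[-1] - (p_flow * R + q_flow * X) / volts[-1])
    return volts

print("tail end, nominal setpoint, no cap :", round(profile(1.02)[-1], 3))
print("tail end, reduced setpoint, no cap :", round(profile(0.99)[-1], 3))
print("tail end, reduced setpoint, with cap:",
      round(profile(0.99, q_cap=0.15, cap_node=8)[-1], 3), ">=", V_MIN)
```

With these made-up values, lowering the source setpoint by 3% without the capacitor pushes the tail-end voltage just below 0.95 p.u., whereas with the capacitor in service near the feeder end the tail remains at roughly 0.97 p.u., which is the flattening effect that permits the more aggressive CVR setpoint.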
Figure 4-2: Feeder Voltage Profile showing nominal, normal and statutory voltage limit for a feeder with an OLTC, Voltage Regulator and Capacitor Bank [55]
With the application of Volt/VAr optimisation, the following benefits can be achieved:
Improvements in the efficiency of the delivery system
Improvements in voltage regulation of feeders
Reductions in peak demand and the ability to selectively control demand
Deferral of capital expenditure and improved capital asset utilization
Reductions in electricity generation and environmental impacts
Provision for greater operational flexibility to reduce the impacts of local network abnormalities such as line contingencies
The potential benefits depend on the specific applicability to local network conditions and characteristics. Uncertainties facing utilities that wish to incorporate Volt/VAr optimisation in their systems are summarised in the following questions [56]:
What selection criteria should be applied to determine which feeders to implement on?
How will communications be achieved?
What are appropriate levels of implementation?
What is the return on investment?
The next sections explore the levels of implementation as well as implementation experiences at various utilities, which will guide the author towards appropriate solutions for Volt/VAr management on Eskom distribution networks.