Improving the Power Grid IQ

The need for affordable, plentiful, and environmentally friendly energy is being addressed through numerous interrelated initiatives. Some aim to influence regulations to ensure growth. Others pursue alternative energy sources from solar and wind to cold fusion. Still others are directed at modernizing the electric utility distribution grid.

Removing or minimizing regulations can help to extend energy generation based on fossil fuel by making more of the known reserves accessible. Regulations in the form of incentives can accelerate development of alternative energy sources, the favorable tax status of the nuclear industry being an example.

However, regardless of how electricity is generated, it must be distributed, and here two major themes have emerged. Recent distribution grid failures have highlighted the need for modernization, summarized in the term smart grid. At the same time, distributed generation, often in the form of solar installations or wind farms, has become very popular with even residential-scale generation tied to the grid.

Both the tremendous growth in the number of distributed generators and smart grid development have highlighted the importance of flexible control mechanisms. These, in turn, rely on accurate, timely, and geographically detailed data.

The Smart Grid

These are the seven main characteristics of the smart grid as listed in a 2010 DOE report:1

  • Enables active participation by consumers. 
  • Accommodates all generation and storage options for more efficient, cleaner power production. 
  • Enables new products, services, and markets such as green power products and a new generation of electric vehicles.  
  • Provides power quality for the digital economy. 
  • Optimizes asset utilization and operates efficiently with fewer equipment failures and safer operations. 
  • Anticipates and responds to system disturbances (self-heals). 
  • Operates resiliently against attack and natural disaster to improve public safety.

These benefits can’t be enjoyed until the grid actually becomes smarter in several ways. A basic problem is the lack of actionable information, and remedying it requires faster and more accurate measurement data.

Synchrophasors

The diverging phase diagram shown in Figure 1 has been featured in numerous post-mortem analyses of the 2003 eastern U.S. rolling blackout. It shows how the phase between two physically distant network nodes continued to increase for more than an hour. After the phase difference reached a critical point, generating stations could no longer cope and dropped their connections to the grid. As they did so, the transient effect aggravated the situation, causing more to drop off, which created the rolling blackout.

Figure 1 Diverging Phase Difference between Ohio and West Michigan Before August 2003 Blackout
Courtesy of Consortium for Electric Reliability Technology Solutions

At the time, synchrophasor equipment was becoming more widely accepted, but relatively few units were in place. Traditional power control measured amplitude with a 2-second sample period and could only compare phases locally.

Without time-aligned phase information, the power grid has been controlled through an iterative state estimation process. Supervisory control and data acquisition (SCADA) information including voltage, current, and power measurements at various points in the system is combined to estimate the state.

Mathematically, the process of relating actual measurements to state variables within a system model is equivalent to a change of variables, which requires calculation of a large matrix of partial derivatives: the Jacobian. The Jacobian matrix “always has (2N-1) columns, where N is equal to the number of buses. The number of rows is equal to the number of measurements available. For a full measurement set, the number of rows will be equal to (3N+4B), where B is the number of lines. The elements of the matrix represent the partial derivatives of bus voltages, bus powers, and line flows with respect to state variables δ and V.”2
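The dimension counts quoted from reference 2 can be sketched in a few lines of Python. The 41-line count for the IEEE 30-bus test system is a commonly cited figure, not one taken from this article:

```python
# Dimensions of the full-measurement Jacobian as given in reference 2:
# columns = 2N - 1 (the state variables: N - 1 bus angles plus N magnitudes),
# rows = 3N + 4B for a full measurement set, where B is the number of lines.
def jacobian_shape(n_buses: int, n_lines: int) -> tuple[int, int]:
    """Return (rows, cols) of the state-estimation Jacobian."""
    rows = 3 * n_buses + 4 * n_lines
    cols = 2 * n_buses - 1
    return rows, cols

# The IEEE 30-bus test system has 41 lines.
print(jacobian_shape(30, 41))  # (254, 59)
```

Even at 30 buses the matrix has thousands of entries, which is why the computation time grows so quickly with network size.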

The amount of time required to determine the state of a large network depends on the amount of calculation needed. The usual scheme takes about 0.5 s for a 30-bus network. Reference 2 presents an approach to partitioning the overall Jacobian matrix into 16 submatrices as well as a method for their computation. Although this improves the time taken for the estimations to converge on large matrices, a 300-bus network still takes about two minutes.

In contrast, synchrophasors measure the phase and magnitude of current and voltage up to 60 times per second with GPS-accurate timing. Simultaneously sampled values are available for significant nodes throughout the network, which support impedance calculations. Of the hundreds of synchrophasors in use today, very few have been included in real-time control loops. Instead, they are providing accurate and timely inputs to the existing control systems. The ultimate goal is to develop real-time distributed control software that uses synchrophasor vector data to minimize grid disturbances.
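As a rough illustration of what a PMU computes internally, a one-cycle DFT can extract the magnitude and phase of a sampled waveform. This is a minimal sketch, not SEL’s or any other vendor’s actual algorithm, and it assumes ideal, noise-free sampling at exactly the nominal frequency:

```python
import cmath
import math

def estimate_phasor(samples, ):
    """One-cycle DFT phasor estimate: returns (rms_magnitude, angle_rad).

    `samples` must span exactly one nominal cycle. Convention assumed
    here: x(t) = sqrt(2) * X * cos(2*pi*f*t + phi).
    """
    n = len(samples)
    # Correlate against the fundamental-frequency complex exponential
    acc = sum(x * cmath.exp(-2j * math.pi * k / n) for k, x in enumerate(samples))
    phasor = (math.sqrt(2) / n) * acc
    return abs(phasor), cmath.phase(phasor)

# Example: 120 V RMS at the nominal frequency with a 30-degree phase,
# sampled 16 times per cycle
n = 16
samples = [math.sqrt(2) * 120 * math.cos(2 * math.pi * k / n + math.radians(30))
           for k in range(n)]
mag, ang = estimate_phasor(samples)
print(round(mag, 3), round(math.degrees(ang), 3))  # 120.0 30.0
```

Two PMUs sampling against the same GPS-disciplined clock can subtract their angle results directly, which is exactly the cross-network phase comparison that traditional local measurements could not provide.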

Compared with those of conventional DAQ applications, the timing and data-volume requirements of a power application are straightforward. The associated signal conditioning and post-processing, however, are not.

Specialized equipment is mandatory when working with long-distance transmission lines at 230 kV, subtransmission lines at 69 kV or 26 kV, or even the 4-kV primary customer parts of the network. Of course, it’s the medium and high voltage sections that carry the bulk of the power that must be controlled over large distances.

Schweitzer Engineering Laboratories (SEL) develops and produces digital protective relays used worldwide for voltages from 5 kV to 500 kV. Synchrophasor capabilities have been integrated into the company’s relays, and special-purpose phasor data concentrators (PDCs) have been developed to combine the streaming vector data from multiple sources. A centrally located synchronous vector processor can perform control functions based on data received from PDCs throughout the network.

Real-time control is the goal, and it has been achieved in varying degrees on a small scale. As stated in a recent paper, “Direct state measurement is better than state estimation because it is simpler, costs less, requires less processing, has no convergence issues, is less dependent on system data, and is faster.”3

Security

Coincident with the need to measure simultaneously and more often are the requirements to efficiently concentrate and quickly transmit the resulting data. Ethernet often is used to communicate the data, with the IEEE 1588 Precision Time Protocol supplementing GPS timing. Yet, as the 2007 Aurora experiment proved, even communications within a relatively small network are vulnerable to attack.

Department of Energy (DOE) researchers hacked into a replica of a power plant’s control system, overriding the reconnection synchronism enforced by the generator’s protective relay. Aurora is the name associated with the experiment and more generally with power generator communications vulnerability.

Attempting to asynchronously reconnect a generator to the grid can cause huge transient currents if the phases are sufficiently misaligned. In the experiment, the researchers demonstrated that malicious cyber activity can indeed destroy a generator.4
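The severity of a misaligned reconnection can be estimated from the voltage that appears across the open breaker. For two equal-magnitude sources of RMS voltage V separated by phase angle δ, that voltage is 2·V·sin(δ/2), reaching twice the source voltage at 180 degrees. A small sketch, with the 13.8-kV level chosen purely for illustration:

```python
import math

def breaker_voltage(v_rms: float, delta_deg: float) -> float:
    """RMS voltage across an open breaker between two equal-magnitude
    sources separated by delta_deg degrees: |V1 - V2| = 2*V*sin(delta/2)."""
    return 2 * v_rms * math.sin(math.radians(delta_deg) / 2)

for delta in (10, 60, 120, 180):
    print(f"{delta:3d} deg -> {breaker_voltage(13800, delta):8.0f} V")
```

Driving current through that voltage difference into the generator’s low source impedance is what produces the damaging transient torque.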

Adding a noncommunicating protective relay in series with the original communicating relay could eliminate the threat. Both relays would measure the same quantities and reach the same conclusions regarding reconnection. If the communicating relay were hacked, the autonomous relay would prevent asynchronous reconnection.5

Today’s State of Play

A North American Synchrophasor Initiative (NASPI) presentation by the organization’s project manager Alison Silverstein stated that more than 250 synchrophasors were in use by early 2011 with about 1,300 planned by the end of 2013 (Figure 2).6 So, it’s relatively early days for this technology and not surprising that divergent views are held by the various organizations involved in the adoption of synchrophasors.

Figure 2 Phasor Measurement Units in North American Power Grid
Courtesy of North American Synchrophasor Initiative; with information available as of March 16, 2011

NASPI’s job is to proselytize for synchrophasor technology as well as facilitate changes that enable its adoption. Standards are an important aspect of this work, and among NASPI successes are the following:

  • Adoption of the NASPInet communications architecture framework for phasor data. 
  • Assisting recipients of DOE Smart Grid Investment Grants (SGIGs). 
  • Development of IEEE and IEC phasor technology interoperability standards. 

MISO, the Midwest Independent Transmission System Operator, is one of the SGIG recipients and in its December 2010 report stated, “…the application of synchrophasor technology and the associated uses are still early in development. While joint collaboration and investment by DOE and the industry at large will accelerate the development and deployment of new applications to improve reliable operation of the electrical grid, the timeframe required to develop applications into ‘production grade’ is a major challenge for all of the SGIG projects, with the Midwest ISO’s Synchrophasor Project being no exception.”7

The report suggests an order in which functionality could be developed, starting with event analysis, model validation, and wide-area situational awareness completed during the 2012 to 2013 timeframe. Wide-area control and protection and special protection including islanding are projected for 2018 to 2020.

In contrast, reference 3 states, “Synchrophasors have left the laboratory, moved beyond demonstration projects, and are used around the world in diverse applications: testing, commissioning, automatic station-level self-checks, disturbance recording, and wide-area protection and control systems.”

The reality is that a small but growing number of installations are benefiting from synchrophasor capabilities. Event analysis often is cited as a major advantage because the lack of synchronized voltage and phase measurements in a network’s traditional SCADA system makes post-event cause-and-effect analysis very difficult. So, this could be a good place to start.

Both references 3 and 4 comment that IEEE C37.118, which covers phasor measurement and communications, does not include real-time control. One reason real-time control is not yet practical is that current SCADA systems may not have the bandwidth to carry real-time phasor information sampled at up to 60 S/s. More importantly, the necessary algorithms are not available, nor are the system models sufficiently accurate to make use of the data.

Distributed Generation

While smart grid initiatives are steadily progressing, the number and size of solar and wind alternative energy installations have rapidly increased. Islanding used to be an undesirable and potentially dangerous condition; with the large number of distributed generators now connected to the grid, it has become a desirable operating mode.

Within a larger network, islanding describes the state of one or more generators that are not sharing power with the rest of the network. When there were very few distributed generators, they were treated as exceptions. However, today’s solar and wind installations have their own local control that supports autonomous operation in the event of grid problems.

A solar array’s local inverter handles synchronization and reconnection to the grid. It also manages power regulation so that, within the maximum available power, the array can support local loads. If the array is coupled to an energy storage system, the full local load may be supplied for some period of time.

The detection of islanding has been approached from many directions. If passive measurements show significant differences from nominal set points in voltage, current, or frequency, it is easy to conclude that the generator should be disconnected from the grid. The more difficult problem is to detect the point at which the grid could be disconnected because the entire load is supplied locally.

An active perturb and observe algorithm similar to the kind used to maintain solar inverter operation at the maximum power point can determine islanding. While the grid is connected, changing the generator output current will not significantly affect the voltage at the point of common connection (PCC) between the generator and the grid. If the grid has disconnected from the PCC, however, the current perturbation produces a corresponding change in the generator’s output voltage.
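A minimal simulation of this perturb-and-observe check might look like the following. The interface names, the 5% current perturbation, and the 2% voltage threshold are illustrative assumptions, not values from the article or from any standard:

```python
def detect_islanding(measure_pcc_voltage, set_current, i_nom, dv_threshold=0.02):
    """Active perturb-and-observe islanding check (hypothetical interface).

    Perturb the inverter output current briefly and watch the PCC voltage.
    A stiff grid holds the voltage nearly constant; in an island the
    voltage tracks the perturbation.
    """
    v0 = measure_pcc_voltage()
    set_current(i_nom * 1.05)   # +5% current perturbation
    v1 = measure_pcc_voltage()
    set_current(i_nom)          # restore nominal output
    return abs(v1 - v0) / v0 > dv_threshold

# Toy models: a grid-connected PCC is stiff; an islanded PCC sees V = I * Z_load
class Sim:
    def __init__(self, islanded, z_load=4.0, v_grid=240.0):
        self.islanded, self.z, self.v_grid = islanded, z_load, v_grid
        self.i = 60.0
    def set_current(self, i):
        self.i = i
    def measure(self):
        return self.i * self.z if self.islanded else self.v_grid

for islanded in (False, True):
    s = Sim(islanded)
    print(islanded, detect_islanding(s.measure, s.set_current, 60.0))
```

The toy load model is purely resistive for clarity; the tradeoff mentioned next, that the perturbation itself degrades power quality, is visible even here, since the check momentarily pushes the output away from its set point.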

Similar active islanding detection schemes can be applied to other parameters, but these methods all degrade output power quality. A recent paper describes the use of wavelet-based islanding detection, which features high sensitivity to islanding but does not affect output power quality.8

The growth of distributed generation has several implications. At the local level, it should boost power availability by providing backup to the grid. However, for the grid operators, it requires control algorithms with a higher degree of granularity. Anticipating the effect of thousands of residential generators randomly switching kilowatts to or from the grid is different than dropping or adding one or two large generating plants.

Of interest to instrument vendors, the increased manufacturing and installation activity requires more test equipment. And, far back in the food chain, the growth of solar and wind power fosters further development from basic materials to better semiconductor devices and circuits. Communications also are involved, and with the need to share data comes further research into network security.

Instrumentation

Modern, digital protective relays are available from Siemens, GE Energy, ABB (ASEA), SEL, Basler, Eaton, and several other manufacturers. They are designed to be part of a larger control system and have application-specific characteristics. For example, they communicate using the power industry’s standard protocols. Nevertheless, there are many requirements for more general-purpose test instruments and software within the power generation and control environment.

A key part of modern control strategies is relatively high-speed and secure communications. Although the basic 50- or 60-Hz power frequency isn’t high, the numbers do add up. For Level 1 operation as defined in the IEEE C37.118 synchrophasor standard, synchrophasor measurements must have less than a 1% total vector error for voltages with up to 10% THD, ±5-Hz frequency deviation from nominal, and 10% out-of-band interference signal distortion.9
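Total vector error combines magnitude and phase error into a single figure: per IEEE C37.118, it is the magnitude of the complex difference between the estimated and true phasors, relative to the true magnitude. A short sketch showing how little error the 1% limit allows:

```python
import cmath
import math

def total_vector_error(x_est: complex, x_true: complex) -> float:
    """Total vector error (TVE) per IEEE C37.118: the magnitude of the
    complex error relative to the magnitude of the true phasor."""
    return abs(x_est - x_true) / abs(x_true)

x_true = cmath.rect(120.0, math.radians(30))

# A 1% magnitude error alone just meets the 1% TVE limit...
print(total_vector_error(cmath.rect(121.2, math.radians(30)), x_true))  # ≈ 0.01
# ...as does a phase error of only about 0.57 degrees with perfect magnitude.
print(total_vector_error(cmath.rect(120.0, math.radians(30.573)), x_true))
```

A phase error of roughly half a degree corresponds to about 26 µs at 60 Hz, which is why GPS-grade time alignment is essential to meeting the specification.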

A consequence of a stringent performance specification is the need for a high sample rate—reporting rate in synchrophasor terminology—to avoid aliasing. The reporting rate must divide evenly into the power frequency. So, for 50-Hz systems, 10 and 25 Hz are recommended rates and for 60-Hz, 10, 12, 15, 20 and 30 Hz are specified, but 60 Hz also is allowed.
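The divide-evenly rule is easy to check programmatically. This sketch simply enumerates the integer rates that divide the nominal power frequency:

```python
def valid_reporting_rates(f_nom: int, candidates=range(1, 121)):
    """Reporting rates (frames/s) that divide evenly into the nominal
    power frequency."""
    return [r for r in candidates if f_nom % r == 0]

print(valid_reporting_rates(50))  # [1, 2, 5, 10, 25, 50]
print(valid_reporting_rates(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```

The standard recommends only a subset of these divisors, but every permitted rate appears in the lists above.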

The measurements taken at the reporting intervals represent how far the actual signal has deviated from an ideal reference cosine wave. Higher reporting rates are needed depending on how much the generated voltage deviates from the nominal and at what rate. It’s a question of how fast the reporting rate must be to handle both amplitude and frequency modulation of the power frequency.

In an example discussed in reference 9, a reporting rate of 60 phasors/s for a voltage, five currents, five watt measurements (real power), five VAR measurements (reactive power), frequency, and rate of change of frequency, all reported as floating-point values, requires a 64,000-b/s bandwidth.
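A back-of-envelope version of that calculation, assuming 4-byte floats with each phasor carried as a magnitude-angle pair (the exact framing in reference 9 may differ), shows the raw payload alone accounts for roughly 46 kb/s; message framing, time stamps, and status words would make up the remainder of the 64,000-b/s figure:

```python
# Measurement set from the reference 9 example
PHASORS = 1 + 5        # one voltage phasor, five current phasors
ANALOGS = 5 + 5 + 2    # watts, VARs, frequency, rate of change of frequency

# Assumption: each phasor is two 4-byte floats, each analog one 4-byte float
payload_bytes = PHASORS * 8 + ANALOGS * 4
bits_per_second = payload_bytes * 8 * 60   # 60 frames/s

print(payload_bytes, bits_per_second)  # prints: 96 46080
```

Modest as it looks, multiplying this stream by hundreds of PMUs feeding a phasor data concentrator is what drives the communications requirements discussed next.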

Ethernet communications test equipment is used to commission and troubleshoot secure fiber-optic networks throughout the grid. In addition to being secure, fiber-optic cables are discussed in many articles because of their immunity to the high EMI levels typically found in substations. Decoding actual values requires further application-specific capabilities.

Any direct measurements must be isolated to ensure that off-ground voltages do not damage the measuring equipment. The 3.5-kV galvanic isolation featured in Data Translation’s MEASUREpoint™ instruments may address applications such as temperature measurement of powered conductors.

Beyond hardware instruments, there also is a need for simulation capabilities. According to information on the MathWorks website, engineers and scientists in energy production use MATLAB® and Simulink® to design and develop renewable power sources and analyze and improve energy transmission and distribution. These tools support detailed models, which more closely track the actual network performance.

Hardware-in-the-loop simulation also is popular given the high degree of complexity in many of the applications. Siemens described a steam turbine simulator the company developed using National Instruments’ modules and LabVIEW to facilitate turbine controller testing:

“We test the automatic controller under different operating conditions to detect disturbances in the system…. For the computation, we provided a dynamic processing model consisting of a steam generator and turbine, rerouting station, condenser, generator, and electrical net. With this model, we can test the different operating modes including the turbine starting within a specified revolution rate, the turbine operating under load or pressure control, load decreases under idle operation, load removal, and turbine shutdown.”10

Finally, because of the diverse implementations of phasor measurement units (PMUs) developed by different companies, there is a need to characterize them through a rigorous testing procedure. Readily available tools including a GPS time reference, a signal generator, a data acquisition system, and analysis software are needed for this activity.

The PMU Testing Guide is discussed in detail in reference 11, which includes extensive background material. When this paper was written in 2008, there were about 140 synchrophasors monitoring the North American grid. As the authors commented, “The Aug. 14, 2003, blackout reinforced the value of synchronized phasor measurements for enhancing situational awareness.”

References

1. “Understanding the Benefits of the Smart Grid,” U.S. Department of Energy, National Energy Technology Laboratory, DOE/NETL-2010/1413, June 2010, p. 3.

2. Nor, N. M. et al., “Newton-Raphson State Estimation Solution Employing Systematically Constructed Jacobian Matrix,” Advanced Technologies, 2009, pp. 19-45.

3. Schweitzer, E. O. III and Whitehead, D. E., “Real-World Synchrophasor Solutions,” Journal of Reliable Power, Vol. 2, No. 2, May 2011, pp. 4-15.

4. Meserve, J., “Sources: Staged Cyber Attack Reveals Vulnerability in Power Grid,” CNN, September 2007.

5. Johnson, G. F., “Mitigating Aurora Vulnerability with Protective Relaying,” NETAWORLD, Fall 2011, pp. 109-113.

6. Silverstein, A., “Digital Power Transmission: The Future of Synchrophasor Measurement Technology,” Slides for Session TS4660 at NIWeek, NASPI/National Instruments, August 2011.

7. Report of the Midwest ISO Synchrophasor Deployment Project, “Synchrophasor Integration into Planning and Operational Reliability Processes,” December 2010.

8. Hanif, M. M. et al., “A Discussion of Anti-Islanding Protection Schemes Incorporated in an Inverter Based DG,” International Conference on Environment and Electrical Engineering 2011, May 2011.

9. Adamiak, M. et al., “Synchrophasors: Definition, Measurement, and Application,” Protection & Control Journal, September 2006.

10. Schmidt, E. and Brackenhammer, E., “Siemens Develops a Real-Time HIL Simulator for Electric Power Generation Turbines,” National Instruments Case Study.

11. Huang, Z. et al., “Performance Evaluation of Phasor Measurement Systems,” IEEE Power Engineering Society, 2008.
