May 24, 2018
Feature

Moving Megawatts Securely and Economically

PNNL's adaptive tool streamlines complex model validation and parameter calibration for power plants

Image labeled "power plant model" showing a power plant miniature inside of a circle. The outside of the circle is divided into numbered ribbons labeled "1. Validation, 2. Calibration, 3. Verification."

Thirteen years ago, PNNL engineer Henry Huang began pursuing an idea to make power grid planning tools more efficient and save money in the process. Now, the results of that concept are in the hands of utilities on the West Coast, helping them determine where investments will lead to a more resilient, reliable and economical electric grid for their customers.

PNNL and Bonneville Power Administration have developed an open-source tool.

In February, General Electric shared a new version of its Positive Sequence Load Flow (PSLF) software package with hundreds of utilities in the Western Electricity Coordinating Council service area. Integrated into this package is PNNL’s Power Plant Model Validation (PPMV) tool, which streamlines the process of power system model validation and parameter calibration.

This feature will not only help large power plant owners meet federal testing requirements for power generation planning purposes, but also aid major Independent System Operators (ISOs) in making investment decisions to plan and operate today’s power grid.

“It’s a great example of moving a new idea from concept to commercial adoption,” said Ruisheng Diao, a staff research engineer and program manager at PNNL and Huang’s colleague, who took the final software over the finish line with the project team. That team included the Bonneville Power Administration (BPA), which also funded the work, along with GE Energy Consulting, GE Grid Solutions, and Peak Reliability.

“It was a long journey, but that’s the nature of the beast.”

From concept to commercialization

The road to commercial acceptance of any new technology is lengthy and rigorous. Diao says this is especially true in the power industry, which tends to stick with tried-and-true technology.

“In the development phase, software prototypes coming out of new concepts are usually too fancy and risky for commercial power applications. Nobody wants to spend the time and money on lengthy training for engineers and operators unless they have to,” he explains.

Complex problems sorted out in the research lab need to be simplified and adapted to work within existing commercial tools that have been widely used by the power industry for decades. That can take many iterations and lead to a totally different, but much more useful, end product.

In this case, the path from geeky lab prototype script to a sleek new commercial software tool for GE evolved in two key phases.

A streamlined model validation process. To run one validation study, it can easily take weeks for a plant engineer to collect all the necessary information, such as operational snapshots, event information, sensor data, and monitored channels. After that, the plant’s generators are taken offline, at up to $35,000 a pop, to test a scenario through a series of calibration and validation steps. This is a very labor- and time-intensive process, and industry desperately needed an automated tool that was both fast and easy to use.
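
To make the data-collection burden concrete, here is a minimal Python sketch of the kind of inputs one validation study pulls together. The structure and field names are illustrative assumptions only; they do not mirror the PSLF or PPMV data formats.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ValidationStudyInputs:
    """Hypothetical bundle of inputs assembled for one validation study.

    All field names are illustrative; they are not the PSLF/PPMV schema.
    """
    operational_snapshot: dict  # dispatch and commitment state at event time
    event_info: dict            # what happened on the grid, and when
    pmu_channels: dict = field(default_factory=dict)  # channel name -> samples


study = ValidationStudyInputs(
    operational_snapshot={"unit": "G7", "mw_output": 412.0, "online": True},
    event_info={"type": "generator trip", "utc_time": "2017-06-14T18:02:55Z"},
    pmu_channels={
        "freq_hz": np.array([60.00, 59.98, 59.97]),
        "volt_pu": np.array([1.00, 0.99, 0.99]),
    },
)
print(f"{len(study.pmu_channels)} monitored channels for a {study.event_info['type']}")
```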

The concept for the PPMV tool originated in the early 2000s with BPA, GE, and PNNL. The concept incorporated real-time frequency and voltage signals from phasor measurement units, or PMUs, placed throughout the electrical grid. With these “play-in” PMU signals, the prototype concept and algorithms ran well, but only as developer scripts written in Visual Basic.

“This software environment is popular for research and users who know exactly where to grab information, but it’s anathema to industry,” said Diao.

Integrating the automatic “play-in” function into PSLF streamlines the process of validating hundreds of power plants whenever a system event occurs. The new tool can cut the entire process from thousands of labor hours down to just a few minutes.
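
As a rough illustration of the play-in idea, the sketch below replays recorded frequency and voltage signals through a toy one-machine model and scores the mismatch against measured output. Everything here is synthesized and simplified; the model, the threshold, and the signal shapes are assumptions for illustration, not the PPMV algorithm.

```python
import numpy as np

# Synthesized 10-second PMU record at 30 samples per second (a common PMU rate).
t = np.arange(0, 10, 1 / 30)
freq_meas = 60.0 - 0.3 * np.exp(-t / 2.0) * np.sin(2 * np.pi * 0.4 * t)  # Hz
volt_meas = 1.0 - 0.02 * np.exp(-t / 1.5)                                # per unit


def simulate_plant_response(freq, volt, droop=0.05, d_gain=1.0, p0=0.8):
    """Toy play-in model: drive a governor-droop-plus-damping response with
    the recorded boundary signals and return simulated active power (pu)."""
    dev = (freq - 60.0) / 60.0  # per-unit frequency deviation
    return (p0 - dev / droop - d_gain * np.gradient(dev, t)) * volt


# "Measured" plant output for the same event; its true droop differs from
# the value carried in the planning model.
rng = np.random.default_rng(0)
p_meas = simulate_plant_response(freq_meas, volt_meas, droop=0.04) \
         + rng.normal(0, 0.002, t.size)

# Validation: play the recorded signals into the planning-model parameters.
p_sim = simulate_plant_response(freq_meas, volt_meas, droop=0.05)

rmse = np.sqrt(np.mean((p_sim - p_meas) ** 2))
print(f"RMSE, simulated vs. measured output: {rmse:.4f} pu")
print("flag for calibration" if rmse > 0.005 else "model validated")
```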

An enhanced calibration algorithm. For secure and economical operation of the power grid, the integrity of the power plant planning model is essential. When a model shows deficiencies compared to field measurements, any decision based on it will be less than optimal.

“To plan around the model, it has to be right or it’s a waste of money and time,” said Diao.

The early version of PNNL’s model calibration algorithm was written in MATLAB research scripts, and its performance was slow: calibrating the model parameters of a single generator unit took more than 24 hours. It couldn’t handle nonlinear dynamics very well either.

Through a 3-year (2011-2014) project funded by DOE’s Advanced Scientific Computing Research program, Huang and Diao’s team developed a promising new method to calibrate flawed parameters for a power plant in a more efficient and accurate way. The algorithms showed great potential, but the models being used were over-simplified; they couldn’t represent the realistic behavior of a power plant.
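
To show what calibrating flawed parameters means in practice, here is a minimal sketch that fits the toy play-in model from the previous example to measured output with an off-the-shelf least-squares solver. This is a generic illustration of trajectory-matching calibration under assumed parameter names and values, not the team’s actual algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

# Same synthesized PMU record and toy model as in the validation sketch.
t = np.arange(0, 10, 1 / 30)
freq_meas = 60.0 - 0.3 * np.exp(-t / 2.0) * np.sin(2 * np.pi * 0.4 * t)
volt_meas = 1.0 - 0.02 * np.exp(-t / 1.5)


def simulate_plant_response(freq, volt, droop, d_gain, p0):
    dev = (freq - 60.0) / 60.0
    return (p0 - dev / droop - d_gain * np.gradient(dev, t)) * volt


# "Field measurements" generated from true-but-unknown parameter values.
true_params = (0.04, 1.2, 0.8)  # droop, damping gain, base power (pu)
p_meas = simulate_plant_response(freq_meas, volt_meas, *true_params) \
         + np.random.default_rng(1).normal(0, 0.002, t.size)


def residuals(params):
    """Mismatch between the played-in simulation and the measured output."""
    return simulate_plant_response(freq_meas, volt_meas, *params) - p_meas


# Calibrate, starting from the deficient planning-model values.
guess = [0.05, 1.0, 0.75]
fit = least_squares(residuals, guess,
                    bounds=([0.01, 0.0, 0.0], [0.10, 5.0, 1.0]))
print("calibrated parameters:", np.round(fit.x, 4))  # approaches true_params
```

The commercial workflow is of course far more sophisticated, but the loop is the same: simulate with played-in signals, measure the mismatch, and adjust parameters until it shrinks.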

An integrated solution for the power industry

During the next phase of development, between 2015 and 2017, Diao’s team and their BPA counterparts collaborated to enhance and adapt the validation and calibration algorithms for commercial adoption. Their research reduced the time for model calibration from more than 24 hours to less than 1 minute. That work also earned the team a best conference paper award at the 2017 Institute of Electrical and Electronics Engineers Power and Energy Society General Meeting.

The different planning and operation pieces are now connected and integrated into GE’s commercial products: eterra-phasoranalytics and PSLF. This not only makes GE’s software more efficient, but also reduces engineers’ training burden for one-off tools.

Over the next few years, testing data from both regional utilities and ISOs will provide valuable performance feedback on the new software tool suite. That data will inform improvements to the next version of the software.

From concept to commerce

Technology Readiness Levels are a measurement scale used to assess the maturity of a particular technology. These levels help management make decisions concerning product development and readiness to transition to industry.

Technology Readiness Levels

  1. Basic principles observed
  2. Technology concept formulated
  3. Experimental proof of concept
  4. Component validation in lab setting
  5. Component validation in working environment
  6. Component prototype demonstration
  7. System prototype demonstration
  8. Actual system completed and qualified
  9. Actual system proven in operational environment

###

About PNNL

Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in sustainable energy and national security. Founded in 1965, PNNL is operated by Battelle for the Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. For more information on PNNL, visit PNNL's News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.

Published: May 24, 2018