
Digital technology providers have long been frustrated with what they see as utilities’ innovation-impeding focus on reliability. But rapid load growth and potential artificial intelligence solutions are inspiring collaboration, particularly on ways to use and share power system data. 

Utilities have seen the data on significant projected load growth from transportation, manufacturing, building electrification and data centers. And tech companies like NVIDIA, Microsoft, IBM and Schneider Electric are beginning to understand the regulatory barriers holding utilities back from transitioning to advanced AI computing strategies, executives said.

Digital technologies’ skyrocketing computational power and hyperscale cloud resources are transforming previous assumptions about AI’s potential to learn, execute and optimize system operations, starting utilities on what some are calling the “technology transition.”

“The utility industry is conservative, but it faces clean energy and emissions reduction mandates,” said Marc Spieler, senior managing director, energy, for microprocessor market leader NVIDIA. Advanced computing’s “real time predictions can optimize” decision making on things like bulk system dispatch and maintenance and NVIDIA’s “dedicated modules will apply learning from other industries to the energy transition,” he added.

Utilities have used advanced computing in wildfire mitigation, vegetation management and predictive maintenance, and are beginning to consider its potential to optimize system dispatch, analysts said.

“The technology business model is ‘move fast and break things’ and they haven’t always understood why regulated utilities don’t move as fast,” said Edison Electric Institute General Counsel, Corporate Secretary and Executive Vice President, Clean Energy, Emily Sanford Fisher. “But there seems to be a new spirit of cooperation on solving utility challenges.”

One obstacle may slow this technology transition. Tech companies have seen how innovations in advanced computing from other sectors can maximize AI capabilities. But utilities still must verify that potential and convince regulators that investments in implementing advanced computing capabilities are justified.

The challenge and the potential

The rapid rise in projected load is clear.

“Over the past year, grid planners nearly doubled the 5-year load growth forecast” from “2.6% to 4.7% growth,” a December 2023 Grid Strategies study reported. The 2024 forecast “is likely to show an even higher nationwide growth rate,” driven by investment in new manufacturing, industrial and data center facilities, it added.

Broad use of advanced computing is now providing “some decision support to bulk system operators” for managing the new load, said Jeremy Renshaw, senior technical executive, AI, quantum, and innovation, with the Electric Power Research Institute.

But fully optimizing distribution system operations and dispatch “could be one breakthrough or years away,” Renshaw added.

Many tech companies are putting advanced computing to work in hopes of finding that breakthrough.

Chart: Permission granted by Grid Strategies.

Tech companies at work

Advanced computing is starting to serve utilities.

BrightNight’s PowerAlpha platform can help design, operate and optimize clean energy projects to significantly increase load factors and reduce costs, said BrightNight Chief Technology Officer Kiran Kumaraswamy. Its machine learning and AI-based algorithms are focused on utility-scale assets, but it does not have “the level of granularity to optimize the distribution system,” he said.

Weather forecasts using machine learning have allowed Amperon clients to buy lower-priced power ahead of extreme events instead of facing elevated scarcity pricing, said Amperon CEO and co-founder Sean Kelly. “A person can’t compute hourly updates on 15-day forecasts and weekly longer term forecasts for up to 28 weather variables at 30,000 locations,” he added.
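As a rough illustration of the pattern Kelly describes, the sketch below regresses hourly demand on a large grid of weather features and then forecasts a 15-day hourly horizon. It is a minimal example with synthetic data and an assumed model choice, not Amperon’s platform.

```python
# Rough illustrative sketch only -- not Amperon's platform. It shows the general
# pattern: regress hourly demand on many weather variables across many locations,
# then forecast an hourly horizon ahead of an extreme-weather event.
# Feature counts, model choice and data are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

n_hours, n_locations, n_weather_vars = 24 * 90, 10, 28        # toy sizes
X = rng.normal(size=(n_hours, n_locations * n_weather_vars))  # hourly weather features
weights = rng.normal(size=X.shape[1]) * 10
y = 40_000 + X @ weights + rng.normal(scale=500, size=n_hours)  # system demand, MW

horizon = 24 * 15                               # 15-day hourly forecast horizon
model = GradientBoostingRegressor(n_estimators=100, max_depth=3)
model.fit(X[:-horizon], y[:-horizon])           # train on history
forecast = model.predict(X[-horizon:])          # forecast from forward weather features

print(f"Peak forecast demand over the next 15 days: {forecast.max():,.0f} MW")
```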

“Call it machine learning or AI,” but the Neara-built “computerized replica” of the Southern California Edison system will “apply more variables than a human can assimilate,” said Rob Brook, senior vice president and managing director, Americas, at advanced computing provider Neara. It “identifies ways to improve” wildfire mitigation and “removes human error and human cost,” he added.

But while much of the current focus on AI in the power sector is on the bulk system and maintenance, one company appears to be on the verge of a breakthrough in applying advanced computing at the distribution system level.

“Utilidata is deploying the first distribution system AI platform,” using a “customized NVIDIA module,” said Utilidata President and Chief Operating Officer Jess Melanson. It is now being installed with Aclara meters, but it will “eventually be used in other system hardware like transformers,” he added. 

Both NVIDIA and Utilidata see “huge opportunity” in power system applications, Melanson said. “Until now, Utilidata analyses have used incomplete, old, or bad data,” but the world-standard NVIDIA chip allows Utilidata’s Karman platform analysis to detail “what is happening on the system, what will likely happen next, and what the best responses are,” he added. 

That level of intelligence at the distribution system level could allow customer-owned resources to play a greater role in reliability and reduce overall customer costs.

A key potential obstruction to realizing the benefits of advanced computing is limited access to utility and private customer data. It is a challenge those in the advanced computing world, including NVIDIA and IBM, take seriously.

Chart: DOE. (2024). “AI for Energy” [pdf]. Retrieved from DOE.

Foundation modeling and federated learning

NVIDIA’s software allows advanced computing “to anticipate patterns in the data and identify the next best action,” according to NVIDIA’s Spieler. It could, for instance, allow utilities to have a greater understanding of where outages might occur and do proactive maintenance, several analysts said.
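A minimal sketch of that kind of proactive-maintenance workflow appears below: a classifier trained on historical outages scores assets so crews can inspect the riskiest equipment first. The features, labels and model are assumptions for illustration, not NVIDIA’s software.

```python
# Illustrative sketch only -- assumed features, labels and model, not NVIDIA's
# software. A classifier trained on historical outages scores assets by risk so
# maintenance crews can inspect the most failure-prone equipment first.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_assets = 5_000

# Hypothetical asset features: age (years), peak loading (percent), vegetation score
X = np.column_stack([
    rng.uniform(0, 60, n_assets),
    rng.uniform(20, 120, n_assets),
    rng.uniform(0, 1, n_assets),
])
# Synthetic outage history, loosely tied to age and vegetation exposure
p_fail = 1 / (1 + np.exp(-(0.05 * X[:, 0] + 2.0 * X[:, 2] - 4.0)))
y = rng.binomial(1, p_fail)

model = RandomForestClassifier(n_estimators=200).fit(X, y)
risk = model.predict_proba(X)[:, 1]
print("Ten highest-risk assets:", np.argsort(risk)[::-1][:10])
```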

The challenge of utilities not sharing the proprietary data needed to develop a greater and more granular understanding of the power system is real, “but federated learning, which is used in healthcare to protect patient data, can be a solution,” Spieler said. “With federated learning, collaborators can build models of their data and share it at a centralized location,” he added.

NVIDIA FLARE, a federated learning software tool, “builds additional synthetic data to solve new problems,” which answers privacy concerns, Spieler said.
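The sketch below shows the basic federated-averaging idea Spieler describes, in plain Python: each participant trains a local model on data that never leaves its site, and only model parameters are pooled centrally. It is a toy example, not the NVIDIA FLARE API.

```python
# Toy federated-averaging sketch in plain Python/NumPy -- illustrative only, not
# the NVIDIA FLARE API. Each "utility" trains a local model on data that never
# leaves its site; only model parameters are sent to a central server and averaged.
import numpy as np

rng = np.random.default_rng(2)
true_w = np.array([1.5, -0.7, 0.3])            # shared underlying relationship

def local_update(w_global, n=200, lr=0.1, steps=50):
    """One utility's training round on its private, on-premises data."""
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    w = w_global.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / n       # least-squares gradient
        w -= lr * grad
    return w

w_global = np.zeros(3)
for _ in range(5):                             # federated rounds
    local_models = [local_update(w_global) for _ in range(4)]   # 4 participants
    w_global = np.mean(local_models, axis=0)   # central averaging step

print("Federated model weights:", np.round(w_global, 3))
```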

“Utilities take data security seriously,” and must be sure it will be shared “in the right way,” EEI’s Fisher said. “There must be protocols to protect critical energy infrastructure information,” though “it is good news that there are constructive conversations about how to work together to do that,” she added.

Some think federated learning may be too limited a model for the complexities of the power system’s distinct regional characteristics and varying resource mixes.

“Foundation models are emerging to expand advanced computing capabilities,” said IBM Global Chief Technology Officer and Solution Leader, Energy, Environment and Utilities, Bryan Sacks. 

Instead of AI components being applied to individual problems, “orders of magnitude more data are pre-trained as a foundation model for multiple problems,” Sacks said. A foundation model could capture the power system’s diversity without divulging any operator’s proprietary specifics. But “the metering and monitoring system data used by utilities for operational decisions is not in current large language models and is protected from being shared externally,” he added.

“For a foundation model to understand power system operations, it needs specific time and place data for each connected asset, but that data must be anonymized,” Sacks said. “IBM has started a working group to build a foundation model to be trained from anonymized data for power system real-time and day ahead operations and long-term planning,” he added.

Different power system stakeholders would be able “to fine tune that foundation model to solve their different problems,” Sacks said. But “regulatory barriers or restrictions on market participants obtaining access to the data needed to train the model is a real concern,” he added.
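The pre-train-then-fine-tune pattern Sacks describes can be sketched simply: a shared model is first fit on pooled, anonymized data, and an individual stakeholder then continues training on its own data to solve its own problem. The model, data and tooling below are assumptions for illustration; the working group’s actual approach is not detailed here.

```python
# Illustrative pre-train / fine-tune pattern only -- the working group's actual
# foundation model, data and tooling are not described in detail here. A shared
# model is first fit on pooled, anonymized data; one stakeholder then continues
# training on its own local data.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(3)

# Stage 1: "pre-training" on broad, anonymized data pooled from many operators (assumed)
X_pool = rng.normal(size=(10_000, 8))
y_pool = X_pool @ rng.normal(size=8) + rng.normal(scale=0.5, size=10_000)
foundation = SGDRegressor(max_iter=50).fit(X_pool, y_pool)

# Stage 2: one utility fine-tunes the shared weights on its own, local data
X_local = rng.normal(size=(500, 8)) + 0.3      # locally shifted distribution
y_local = X_local @ rng.normal(size=8)
for _ in range(10):
    foundation.partial_fit(X_local, y_local)   # incremental local updates

print("Fine-tuned coefficients:", np.round(foundation.coef_, 2))
```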

IBM is engaging global stakeholders to contribute primary research, according to Sacks. It will also engage regulators “to help establish a governance system that will facilitate data sharing,” and build effective “guardrails” to protect the system and the data, he said.

IBM’s recognition of regulatory issues puts it on common ground with what utilities say is their foremost concern.

Chart: DOE. (2024). “AI and Utilities” [pdf]. Retrieved from DOE.

Utilities and regulators

Utilities seem more realistic than tech companies about the regulatory barriers to advanced computing.

Using advanced computing to optimize the distribution system in real time will require utilities and regulators to have “enough confidence” in it, said Steve Smith, National Grid group head of strategy, innovation and market analytics and president of National Grid Partners Corporate Venture Capital Fund. “We could be there in 10 years or 15 years,” he added.

“Tech companies and utilities have radically different business models,” and “tech companies don’t understand why it takes 10 years to bring new transmission online,” added EEI’s Fisher. It actually only takes regulated utilities “18 months to 24 months to build transmission, but it takes them eight years to site, permit and litigate it,” she said.

Introducing advanced computing will require “concrete evidence” for utilities and regulators concerned about rising rates that “the needed grid modernization expenditures will reduce customer costs,” Fisher said.

“Foundation models have a lot of potential for the energy industry,” and data federation is “absolutely required to derive and harmonize data from disparate siloed utility systems,” said Scott Harden, chief technology officer for global innovation with electric technology provider Schneider Electric.

An ideal power system architecture would be built on a power sector foundation model that captures the key features of its diversity and challenges, Harden said. It would also have much more extensive deployment of phasor measurement units — hardware devices that can record and transmit transmission and distribution system data — as well as “full deployment of smart meters at the system edge, and all data would be federated,” he added.
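The records Harden would federate from PMUs and smart meters are, at their simplest, time- and place-stamped measurements. The data structures below are a simplified illustration; real synchrophasor and meter telemetry formats are considerably richer.

```python
# Simplified, illustrative data structures only -- real synchrophasor and smart
# meter telemetry formats are considerably richer. The point is the time- and
# place-stamped records that would be federated under the architecture Harden
# describes.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PhasorMeasurement:
    substation_id: str
    timestamp: datetime        # GPS-synchronized, many samples per second
    voltage_kv: float          # voltage magnitude
    angle_deg: float           # phase angle
    frequency_hz: float

@dataclass
class MeterReading:
    meter_id: str              # anonymized before leaving the utility
    timestamp: datetime
    interval_kwh: float        # energy used in the metering interval

sample = PhasorMeasurement(
    substation_id="SUB-001",
    timestamp=datetime.now(timezone.utc),
    voltage_kv=230.1,
    angle_deg=-12.4,
    frequency_hz=59.98,
)
print(sample)
```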

Building that architecture could begin with regulatory support of “the new computing power and the technologies needed to make it work,” Harden said. It is not yet clear how burdensome the cost and time for deployment would be, “but the more important question is what the cost would be for not deploying it,” he added.

“It is early days for everyone in a big AI space, and the question now is how to navigate that space,” Commissioner Allison Clements of the Federal Energy Regulatory Commission told Utility Dive. Advanced computing applications now in use “can create a positive feedback loop if federal and state regulators drive it,” she added.

“Regulators must have a growth mindset in this moment of change because this is the early part of the messy middle of grid modernization,” Clements said. Utilities “are working rate case by rate case to understand how to make this transition while protecting reliability and affordability,” she added.

“Federal and state regulators need to lean in because AI capabilities are coming,” Clements added. “Whether or not it benefits society or causes problems is up to utilities, policymakers, legislators and other leaders.”