
How quantum computing is poised to support sustainable power grids

As climate change propels severe weather threats to grids, classical computing is increasingly challenged.

(Updated on July 24, 2024)

It is well-documented that severe weather events are the leading cause of power outages in the United States. And these events are increasing in frequency and scale as climate change fuels more extreme weather.

This summer, extreme heat in California led to rolling blackouts. Wildfires, aided by heat waves and drought conditions, are putting pressure on grid operators to maintain service. In August, a derecho (a widespread, fast-moving line of thunderstorms with hurricane-force winds) caused large-scale utility disruption in the Midwest; more than a week later, tens of thousands of people were still without power. In 2017, Hurricane Harvey caused 10 gigawatts of forced outages, and in 2012, Superstorm Sandy knocked out power to 8.6 million people in the Northeast.

The challenge of maintaining everyday reliability and resiliency in the face of extreme weather is compounded by increasingly complex grid management strategies and the growing penetration of renewable energy resources.

Electric utilities continue to devote significant resources to predicting, responding to and recovering from extreme events, including extreme weather, equipment outages and sudden loss of renewable generation.

To date, classical computing has supported all three steps. However, an increasingly complex grid, facing more frequent and intense weather events, is testing the limits of classical computing.

It is reasonable to assume that as customers become more dependent on a reliable and resilient supply of electricity, utilities will need to move beyond classical computing for monitoring and control, on blue-sky days as well as during severe weather.

Forecasting amid uncertainties

Forecasting the path, intensity and impact of a severe weather event on the grid is rife with uncertainty. There is also a great deal of uncertainty in the output of renewable resources and in the availability of supply and delivery infrastructure. Classical computing uses predictive models that evolve as new data becomes available, and the response timeframes for different environmental events vary significantly.

Grid operators must respond by assessing the likely impact on the grid and determining where and when to stage equipment and personnel to mitigate damage, or how to redistribute resources. The recovery phase can be supported by computations that assign priorities and aid logistics. These are enormously complex tasks that must be accomplished in real time to maintain service for the greatest number of customers and for the most vulnerable.

The major benefit of quantum computing over classical computing is the substantial increase in computational speed. Running thousands, even millions of scenarios in real time to aid in decision-making would take classical computers too much time, even if multiple computer systems ran in parallel.

The good news is that proof-of-concept testing of quantum computing for the increasingly complex disaster-preparedness needs of power utilities is well underway and shows great potential.

Proof-of-concept established

Last year, ComEd, the electric utility in northern Illinois and Chicago, initiated a collaborative group to study quantum computing for power systems. This effort included our team at the University of Denver along with representatives from the University of Chicago, Argonne National Laboratory and others.

We set out to study how quantum computing could be applied to analyze and model DC power flow, a useful proxy for understanding power system behavior to aid timely decision-making. (DC power flow is the most widely used power system analysis technique as a stand-alone application or embedded in other applications.)
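
To make the underlying computation concrete, here is a minimal classical sketch of DC power flow on a small hypothetical three-bus network. The reactances and injections below are illustrative assumptions, not data from our study; the point is that DC power flow reduces to solving a linear system relating bus power injections to voltage angles, the kind of problem a quantum linear-system routine can also target.

```python
import numpy as np

# Minimal classical DC power flow on a hypothetical three-bus network.
# Bus 1 is the slack (reference) bus; buses 2 and 3 carry net loads.
# Reactances and injections are illustrative, not data from the study.

x12, x13, x23 = 0.1, 0.2, 0.25   # line reactances in per unit

# Reduced susceptance matrix B' for the non-slack buses (2 and 3).
B = np.array([
    [1/x12 + 1/x23, -1/x23],
    [-1/x23, 1/x13 + 1/x23],
])

P = np.array([-0.6, -0.4])       # net injections at buses 2 and 3 (loads)

# DC power flow reduces to the linear system  B' * theta = P.
theta = np.linalg.solve(B, P)    # voltage angles in radians (slack = 0)

# Line flows follow directly from the angle differences.
flows = {
    "1-2": (0.0 - theta[0]) / x12,
    "1-3": (0.0 - theta[1]) / x13,
    "2-3": (theta[0] - theta[1]) / x23,
}
print("bus angles (rad):", theta)
print("line flows (p.u.):", flows)
```

On a real grid with thousands of buses and many scenarios to evaluate, this same calculation must be repeated over and over, which is where faster computation becomes so valuable.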

Our DC power flow model was based on a three-node system (one virtual generation station and three virtual substations) using only four qubits on the IBM cloud-based quantum computing platform.

“Qubit” is a combination of the words “quantum” and “bit.” A qubit is the basic unit of information in quantum computing, as a bit is in classical computing. Using “only” four qubits for a DC power flow analysis matters because it reduces hardware requirements and cost. The fewer qubits used, the less expensive and more accurate this technology will be. Using the smallest number of qubits will also allow us to more easily scale up our approach to larger test systems as we move to pilot tests.
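
For a sense of what working with a handful of qubits looks like in practice, the sketch below uses Qiskit, IBM's open-source toolkit for its cloud-based quantum computers (assuming it is installed), to build and simulate a simple four-qubit circuit. It is purely illustrative and is not the power flow circuit from our study.

```python
# Illustrative only: a simple four-qubit circuit built with Qiskit.
# It is not the power flow circuit from the study; it shows how four
# qubits are created, entangled and simulated.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(4)   # four qubits, all initialized to |0>
qc.h(0)                  # put qubit 0 into an equal superposition
qc.cx(0, 1)              # chain of CNOTs entangles all four qubits
qc.cx(1, 2)
qc.cx(2, 3)

# Four qubits describe a state with 2^4 = 16 amplitudes; here only
# |0000> and |1111> have nonzero probability (roughly 50/50).
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())
```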

We established proof of concept that quantum computing can indeed be used to analyze a practical and fundamental power system problem, albeit on a small-scale test system.

Our research on quantum computing for grid resilience (how to prepare the grid for extreme events) is ongoing and holds great promise for restoring service, minimizing the human and economic consequences of electricity disruption, and streamlining greater deployment of renewables.

Collaboration = faster implementation

Only through collaboration will we be able to determine how quantum computing can best be applied to power grids. A multi-stakeholder approach is required because every power grid has a unique network topology and a different geographic setting, and hence faces specific environmental threats. A multi-disciplinary approach is needed because applying quantum computing to power systems runs the gamut of scientific, economic and other disciplines.

Organizations such as the IEEE Power & Energy Society (PES) have the broad credibility and convening power to support such collaborative efforts. Government agencies also play a critical role by funding team building and research and development, and by directing those efforts to ensure national importance and impact.

Electric utilities largely have been successful in reliably supplying electricity to their customers for over a century. This has been possible only by leveraging the latest technological advances to upgrade the grid and update decision-making processes. Now we are at yet another critical juncture that calls for advances in computing, and luckily, the solution is already here.

As quantum technology moves closer to commercialization, we must boost efforts in proof-of-concept and collaborative pilot studies, such as the one conducted by our team, to make sure we achieve the clean, secure, reliable and resilient grid of the future.  
