Coming out of the ESIG fall technical workshop in Providence, RI, the state of capacity expansion planning in utility applications is on my mind. The workshop offered plenty of food for thought. There were several sessions on how utilities are integrating more comprehensive capacity expansion approaches into their planning processes. These have been driven by three major waves of change, each expanding what planning needs to consider. More than a decade ago, the increasing share of variable renewables started to disrupt the traditional load curve–based planning approaches. Then, a couple of years ago, battery storage triggered another disruption, forcing planning processes to take time dependency seriously. Finally, the ESIG workshop made it evident that new loads, like heat pumps, data centers, and electric vehicles, are challenging the planning processes yet again. It is not just about load profiles anymore. There is potential flexibility on the load side that needs to be captured, since these loads, and their flexibility, are starting to influence planning solutions that aim for cost efficiency and resiliency.
While there are modelling capabilities to incorporate these changes, in both commercial and open-source domains, it is not easy for utility planners to keep up with what is available. Furthermore, there are substantial challenges in adopting new methods, tools, and data to support the complex and weighty modelling processes where capacity expansion is usually just the first step. New tools and methods have a learning curve, there are risks in choosing the wrong tools, and everybody is busy with day-to-day business, partly due to the rapid pace of change that already demands more from the workforce.
Dealing with Uncertainty
When it comes to the data, there is substantial uncertainty about how soon these new large loads will need connection and where successful projects will end up being located. In many cases, projects will not succeed in permitting or financing, or the developer will simply choose an alternative location, especially when it comes to data centers or electrolysers. The potential load flexibility is also hard to gauge. With data centers, some operations are apparently easier to curtail than others, and it can also be possible to move some types of operations geographically. Computation that is not immediately serving customers could be more flexible in responding to prices (or the CO2 content of electricity) than computation where the customer is expecting a near-immediate answer from artificial intelligence.
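To make the idea of price-responsive deferrable compute concrete, here is a minimal sketch, not from the workshop material: it greedily places unit blocks of deferrable compute into the cheapest hours of a day-ahead price curve. The function name, prices, and block sizes are all illustrative assumptions.

```python
# Hypothetical sketch of shifting deferrable data-center compute to cheap hours.
# All names and numbers are illustrative, not from the article.

def schedule_deferrable(prices, energy_blocks, max_per_hour=1):
    """Greedily place `energy_blocks` unit blocks of deferrable compute
    into the cheapest hours, up to `max_per_hour` blocks per hour."""
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    schedule = {h: 0 for h in range(len(prices))}
    remaining = energy_blocks
    for h in order:
        if remaining == 0:
            break
        take = min(max_per_hour, remaining)
        schedule[h] = take
        remaining -= take
    return schedule

prices = [42, 35, 30, 28, 33, 50, 80, 95]  # EUR/MWh, illustrative
plan = schedule_deferrable(prices, energy_blocks=3)
# The three blocks land in the three cheapest hours: indices 3, 2, and 4
```

A real scheduler would add deadlines, ramp limits, and geographic shifting between sites, but the core price-responsiveness looks like this.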
On the electrolyser front, it should be possible to perform energy balancing, but it might be difficult for electrolysers to provide fast reserves due to their response characteristics – at least without extra hardware like supercapacitors. We also heard about the challenges of attracting demand response from residential customers (lack of trust, small benefits per customer, missing incentives), and about how the high prices following the Russian invasion of Ukraine incentivized Danes to move to day-ahead pricing en masse. Uncertainty in loads will also cause uncertainty in generation investments – it is easy to overbuild or underbuild when there is a lot of uncertainty about load growth. All of this uncertainty should be reflected in the planning process, driving the need for a rather wide scenario space.
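One simple way to build such a scenario space is Monte Carlo sampling over which candidate large-load projects actually materialize. The sketch below is a hypothetical illustration: the project names, sizes, and success probabilities are made up, and a real study would also sample connection timing and location.

```python
import random

# Illustrative sketch (not from the article): sample which candidate large-load
# projects materialize, producing a distribution of added load for planning.
projects = [
    {"name": "data_center_A", "mw": 300, "p_success": 0.5},
    {"name": "electrolyser_B", "mw": 150, "p_success": 0.3},
    {"name": "data_center_C", "mw": 500, "p_success": 0.4},
]

def sample_load_scenarios(projects, n_scenarios, seed=0):
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n_scenarios):
        # Each project independently succeeds or fails in this scenario
        added_mw = sum(p["mw"] for p in projects if rng.random() < p["p_success"])
        scenarios.append(added_mw)
    return scenarios

scenarios = sample_load_scenarios(projects, n_scenarios=1000)
# Expected added load is 0.5*300 + 0.3*150 + 0.4*500 = 395 MW,
# but individual scenarios range from 0 MW to 950 MW
```

Feeding such a scenario set into a capacity expansion model is one way to see how robust an investment plan is to loads that never show up.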
Advances in Tools and Methods
It was also evident at the workshop that tools and methods have kept evolving. As a co-lead for the Global Power System Transformation Consortium (G-PST) Pillar 5 on open-source tools and data, I found it satisfying to see how far open-source tools have advanced and to hear about cases where open-source tools have been part of the planning process. The capabilities are there, and evolving further, with an increasing focus on usability and support. Those are a big focus for the development of Spine tools (https://github.com/spine-tools), which I am involved in. On the new methods front, keep an eye on how chronological planning models are getting better at representing full-year (or multi-year) time series. The computational burden can be reduced with little degradation in solution quality. Finally, there was a lot of interest in better management of data and more automated model couplings. Integrated planning processes are very time-consuming, and it is often necessary to iterate between the different stages. The workshop had some examples of how utilities have started to work on this. In G-PST Pillar 5, we have also started to work towards interoperability “standards” – or at least trying to align ongoing efforts to that end.
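One common way to cut the computational burden mentioned above is to cluster the daily (or weekly) profiles of a full-year time series into a handful of representative periods, each weighted by the number of days it stands for. The following is a minimal, dependency-free sketch of that idea with a synthetic year of load data; the k-means routine and the synthetic profiles are my own illustrative assumptions, not a specific planning tool's implementation.

```python
import math
import random

def kmeans(profiles, k, iters=50, seed=0):
    """Tiny k-means: cluster daily profiles into k representative days."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(profiles, k)]
    assign = [0] * len(profiles)
    for _ in range(iters):
        # Assign each day to its nearest representative (squared distance)
        for i, p in enumerate(profiles):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centers[c])))
        # Move each representative to the mean of its members
        for c in range(k):
            members = [profiles[i] for i in range(len(profiles)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    weights = [assign.count(c) for c in range(k)]  # days each representative covers
    return centers, weights

# Synthetic year: 365 daily load profiles of 24 hourly values (illustrative)
days = [[50 + 20 * math.sin(2 * math.pi * h / 24) + 10 * math.sin(2 * math.pi * d / 365)
         for h in range(24)] for d in range(365)]
reps, weights = kmeans(days, k=4)
# sum(weights) == 365: four weighted representative days cover the whole year
```

A chronological planning model can then run over the four representative days instead of 8,760 hours, which is where most of the speedup comes from; the art is in choosing periods (and linking storage state between them) so that solution quality barely degrades.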
Consider Collaborating Directly on Open Tools and Data
The pre-workshop integrated planning event co-organized by Breakthrough Energy and ESIG/G-PST highlighted the need for better tools and standards. To move forward, it is great to continue sharing ideas and best practices in future workshops, but there is the potential for even bigger benefits if we can collaborate directly through open tools and data whenever feasible. That is what G-PST Pillar 5 is working towards, and we welcome your involvement. For more information, please visit https://globalpst.org/what-we-do/open-data-tools/.
Juha Kiviluoma
Principal Scientist
VTT, Finland
Co-lead for the Global Power System Transformation Consortium Pillar 5