Intermittent renewable sources (wind and solar), electric vehicles, and smart loads will vastly change the behavior of the world's biggest machine, the electric power grid, introducing stochastics and dynamics that the grid has never seen and was never designed for.
Optimizing such a stochastic and dynamic grid with sufficient reliability and efficiency is a monumental challenge. Failing to solve this problem accurately could result in significantly higher energy costs, decreased reliability (including more blackouts), or both. Power grid data already show a clear trend toward dynamics that cannot be ignored and that would invalidate the quasi-steady-state assumption used today for both emergency and normal operation.
The increased uncertainty and dynamics severely strain the analytical workflow currently used to obtain the cheapest energy mix at a given level of reliability. Current practice keeps the uncertainty, dynamics, and optimization analyses separate, and compensates for the resulting error by allowing larger operating margins.
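To make the margin argument concrete, the following toy sketch contrasts two ways of sizing an operating reserve. All numbers (forecast errors, peak load, the 20% blanket margin, the 90th-percentile target) are hypothetical illustrations, not values from this project: a blanket margin applied regardless of the actual uncertainty is compared against a margin sized from the uncertainty scenarios themselves.

```python
# Toy illustration of operating-margin sizing; all numbers are hypothetical.

# Assumed hourly net-load forecast errors (MW) observed over past scenarios.
errors = [-12, -5, 0, 3, 7, 10, 15, 22, -8, 4]

peak_load = 100.0  # MW, assumed system peak

# Decoupled practice: hold a blanket margin, e.g. 20% of peak load,
# regardless of how uncertain the forecast actually is.
blanket_margin = 0.20 * peak_load  # 20 MW

# Scenario-informed sizing: hold just enough reserve to cover the
# 90th-percentile forecast error seen in the scenarios.
sorted_err = sorted(errors)
idx = int(0.9 * (len(sorted_err) - 1))
scenario_margin = max(0.0, sorted_err[idx])

# Any excess margin is capacity procured but rarely needed; at an assumed
# holding cost of $10/MW-h it translates directly into avoidable cost.
holding_cost = 10.0  # $/MW-h, hypothetical
hourly_savings = (blanket_margin - scenario_margin) * holding_cost

print(f"blanket margin:   {blanket_margin:.1f} MW")
print(f"scenario margin:  {scenario_margin:.1f} MW")
print(f"avoidable cost:   ${hourly_savings:.0f}/h")
```

In this contrived example the scenario-informed margin (15 MW) is smaller than the blanket one (20 MW), and the gap is pure holding cost. Coupling the uncertainty analysis with the optimization, as this project proposes, is what allows margins to be tightened without sacrificing reliability.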
The cost of these margins is estimated by various sources at $5-15B per year for the United States as a whole. The situation will become far worse once the increased dynamical content and uncertainty associated with renewable penetration levels that meet or exceed the US 2030 targets are accounted for. The objective of this project is to create software tools that, by leveraging exascale computing, achieve the best attainable bounds on these errors, enabling better planning and potentially saving billions of dollars per year.