Managing a power grid is like trying to solve a huge puzzle.
Network operators must ensure that the right amount of energy flows to the right areas exactly when it is needed, and they must do so in a way that minimizes costs without burdening the physical infrastructure. Moreover, to meet constantly changing demand, they must solve this complicated problem repeatedly and as quickly as possible.
To help solve this persistent puzzle, MIT researchers have developed a problem solver that finds the optimal solution much faster than classical approaches while guaranteeing that the solution does not violate any of the system's constraints. In a power grid, those constraints might include the physical limits of generators and transmission lines, for example.
The new tool incorporates a feasibility-seeking step into a powerful machine-learning model trained to solve the problem. The feasibility step uses the model's prediction as a starting point, iteratively refining the solution until it finds the best achievable answer.
The MIT system can solve complicated problems several times faster than traditional solvers while providing a strong guarantee of feasibility. For some extremely complex problems, it can even find better solutions than tried-and-tested tools. The technique also outperforms pure machine-learning approaches, which are fast but cannot always find feasible solutions.
In addition to helping plan energy production on the power grid, the new tool could be applied to many types of complicated problems, such as designing new products, managing investment portfolios, and planning production to meet consumer demand.
“Solving these particularly thorny problems well requires us to combine tools from machine learning, optimization, and electrical engineering to develop methods that make the right trade-offs in terms of providing value to the field while still meeting its requirements. You need to look at the needs of the application and design methods in a way that actually meets those needs,” says Priya Donti, the Silverman Family Career Development Professor in the Department of Electrical Engineering and Computer Science (EECS) and a principal investigator in the Laboratory for Information and Decision Systems (LIDS).
Donti, the senior author of an open-access paper on the new tool, called FSNet, is joined by lead author Hoang Nguyen, an EECS graduate student. The paper will be presented at the Conference on Neural Information Processing Systems.
Combining approaches
Ensuring optimal power flow in the grid is an extremely complex problem that is becoming increasingly difficult for operators to solve quickly.
“As we try to integrate more renewable energy sources into the grid, operators must deal with the fact that the amount of power being generated is going to vary from moment to moment. At the same time, there are many more distributed devices that need to be coordinated,” Donti explains.
Grid operators often rely on traditional solvers, which provide a mathematical guarantee that the optimal solution does not violate any of the problem's constraints. But when a problem is especially complicated, these tools can take hours, or even days, to reach a solution.
Deep-learning models, on the other hand, can solve even very complex problems in a fraction of the time, but the solution may ignore some critical constraints. For a power grid operator, this could lead to issues like unsafe voltage levels, or even grid outages.
“It is hard for machine-learning models to satisfy all the constraints, because of the many errors that arise during the training process,” Nguyen explains.
For FSNet, the researchers combined the best of both approaches into a two-step problem-solving scheme.
Focusing on feasibility
In the first step, the neural network predicts the solution to the optimization problem. Neural networks, loosely inspired by neurons in the human brain, are deep learning models that excel at recognizing patterns in data.
Then a traditional solver incorporated into FSNet performs a feasibility-seeking step. This optimization algorithm iteratively refines the initial prediction while ensuring the solution does not violate any constraints.
Because the feasibility stage is based on a mathematical model of the problem, it can ensure that the solution is implementable.
“This step is very important. With FSNet, we can get the rigorous guarantees we need in practice,” says Nguyen.
The researchers designed FSNet to address both main types of constraints, equality and inequality, at the same time. This makes it easier to use than other approaches, which may require tailoring the neural network or handling each type of constraint separately.
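To make the two-step idea concrete, here is a minimal toy sketch of the general predict-then-refine pattern the article describes, not FSNet's actual algorithm: a hypothetical network "prediction" (hard-coded here, since the real model is not part of this sketch) is iteratively corrected by gradient descent on its constraint violation, handling an equality constraint and an inequality constraint together. The optimality-refinement side of the feasibility step is omitted for brevity.

```python
import numpy as np

# Toy stand-in for the kind of constrained problem FSNet targets:
#   minimize  distance to a desired operating point
#   subject to  A x = b  (equality)  and  x >= 0  (inequality).
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])

# Hypothetical neural-network output: close to a good answer, but it
# violates both constraint types (the second pair sums to 0.6, and x[2] < 0).
prediction = np.array([0.9, 0.3, -0.2, 0.8])

def violation(x):
    """Squared equality residual plus squared negative-part penalty."""
    eq = A @ x - b
    neg = np.minimum(x, 0.0)
    return eq @ eq + neg @ neg

# Feasibility-seeking step: gradient descent on the violation alone,
# starting from the prediction rather than from scratch.
x = prediction.copy()
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b) + 2 * np.minimum(x, 0.0)
    x -= 0.1 * grad

print(violation(x))  # effectively zero: both constraint types satisfied
```

Because both constraint types are folded into one violation measure, the same refinement loop handles them simultaneously, which is the "plug and play" convenience the next quote alludes to.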
“Here you can just plug and play with different optimization tools,” says Donti.
By thinking differently about how a neural network solves complicated optimization problems, she adds, the researchers were able to unlock a new technique that performs better.
They compared FSNet to traditional solvers and pure machine-learning approaches on a range of challenging problems, including power grid optimization. Their system cut solution times by an order of magnitude compared to the baseline approaches, while respecting all problem constraints.
FSNet also found better solutions to some of the most complex problems.
“Although this was surprising to us, it makes sense. Our neural network can itself find additional structure in the data that the original optimization tool was not designed to use,” explains Donti.
In the future, the researchers want to reduce FSNet's memory usage, incorporate more efficient optimization algorithms, and scale it up to tackle more realistic problems.
“Finding feasible solutions to difficult optimization problems is critical to finding near-optimal ones. Especially for physical systems such as power grids, proximity to optimality means nothing if the solution is not feasible. This work is an important step toward ensuring that deep-learning models can generate constraint-satisfying predictions, with explicit constraint-enforcement guarantees,” says Kyri Baker, an associate professor at the University of Colorado Boulder, who was not involved in this work.
