It is estimated that about 70 percent of the energy generated worldwide ends up as waste heat.
If scientists could better predict how heat moves through semiconductors and insulators, they could design more capable power-generating systems. But the thermal properties of materials can be notoriously hard to model.
The problem involves phonons, quasiparticles that carry heat through a material. Some of a material's thermal properties depend on its phonon dispersion relation, a quantity that is notoriously hard to obtain, let alone exploit in system design.
A team of researchers from MIT and elsewhere took on this challenge by rethinking the problem from the ground up. The result is a recent machine learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or better accuracy. Compared to more classic non-AI approaches, it can be 1 million times faster.
The method could help engineers design power-generation systems that produce more electricity more efficiently. It could also be used to develop more capable microelectronics, since thermal management remains a major bottleneck in speeding up electronics.
“Phonons are responsible for heat loss, but obtaining their properties is extremely difficult, both computationally and experimentally,” says Mingda Li, assistant professor of nuclear science and engineering and senior author of the paper on the technique.
Li was joined on the paper by co-authors Ryotaro Okabe, a chemistry graduate student; Abhijatmedhi Chotrattanapituk, a graduate student in electrical engineering and computer science; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; and others from MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory.
Phonon prediction
It is hard to predict which phonons carry heat because they span an extremely wide frequency range, and the atoms they travel through interact with one another and move at different speeds.
The phonon dispersion relation of a material is the relationship between the energy and momentum of the phonons in its crystal structure. For years, scientists have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations that the models get bogged down.
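To make the concept concrete, here is the textbook dispersion relation for the simplest possible crystal: a one-dimensional chain of identical atoms connected by springs, for which the angular frequency is ω(k) = 2√(K/m)·|sin(ka/2)|. This is a standard solid-state physics result used purely as an illustration, not the researchers' method, and the parameter values below are arbitrary.

```python
import math

def monatomic_chain_dispersion(k, spring_k=1.0, mass=1.0, a=1.0):
    """Angular frequency omega(k) for a 1D chain of identical atoms.

    Textbook result: omega(k) = 2 * sqrt(K/m) * |sin(k*a/2)|, where K is
    the interatomic spring constant, m the atomic mass, and a the lattice
    spacing. All parameter values here are arbitrary illustrations.
    """
    return 2.0 * math.sqrt(spring_k / mass) * abs(math.sin(k * a / 2.0))

# Sample one full band across the first Brillouin zone, k in [-pi/a, pi/a].
ks = [-math.pi + 2.0 * math.pi * i / 100 for i in range(101)]
band = [monatomic_chain_dispersion(k) for k in ks]
```

Even this toy case shows why the full problem is hard: a real three-dimensional crystal has many atoms per unit cell, each adding phonon branches, and the band must be evaluated over a three-dimensional momentum space rather than a line.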
“If you have 100 processors and a few weeks, you can probably calculate the phonon dispersion relation for a single material. The whole community really wants a more efficient way to do this,” Okabe says.
The machine learning models that scientists often use for these calculations are known as graph neural networks (GNNs). A GNN transforms a material's atomic structure into a crystal graph consisting of many nodes, which represent atoms, connected by edges, which represent the interatomic bonds.
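A minimal sketch of that graph construction, assuming a simple distance cutoff decides which atoms are bonded (real GNN pipelines use more careful neighbor lists and periodic boundary conditions; the atom positions and cutoff below are hypothetical):

```python
import math

def build_crystal_graph(positions, species, cutoff=3.0):
    """Toy crystal graph: nodes are atoms, edges join atoms within `cutoff`.

    positions: list of (x, y, z) coordinates (hypothetical units)
    species:   list of element symbols, one per atom
    Returns (nodes, edges), where nodes pair each index with its species
    and edges are index pairs (i, j) with i < j.
    """
    nodes = list(enumerate(species))
    edges = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if math.dist(positions[i], positions[j]) <= cutoff:
                edges.append((i, j))
    return nodes, edges

# Hypothetical two-atom cell: one edge connects the pair.
nodes, edges = build_crystal_graph(
    [(0.0, 0.0, 0.0), (1.5, 1.5, 1.5)], ["Ga", "As"]
)
```

The GNN then repeatedly passes learned messages along these edges so each node's features come to encode its chemical environment.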
Although GNNs are good at computing many quantities, such as magnetization or electrical polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation. Because phonons can move around atoms along the x, y, and z axes, their momentum space is hard to model with a fixed graph structure.
To achieve the needed flexibility, Li and his colleagues developed virtual nodes.
Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. The virtual nodes are updated as the model updates the real nodes during computation, but because no information flows back from them, they do not affect the accuracy of the model.
“The way we do it is very efficient in coding. You just generate a few more nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know the virtual nodes are there,” Chotrattanapituk says.
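A toy illustration of that one-way connectivity, using simple mean aggregation in place of the learned update rules an actual VGNN would use (the function and its update scheme are illustrative assumptions, not the authors' implementation):

```python
def message_pass(real_feats, edges, virt_sources):
    """One round of mean-aggregation message passing with one-way virtual nodes.

    real_feats:   scalar feature per real node
    edges:        undirected pairs (i, j) between real nodes
    virt_sources: for each virtual node, the real nodes it listens to

    Real-node updates never read virtual features, so adding virtual nodes
    cannot change the real nodes' output -- a toy version of the one-way
    design described above.
    """
    n = len(real_feats)
    neighbors = [[] for _ in range(n)]
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)
    # Each real node averages itself with its real neighbors only.
    new_real = [
        (real_feats[i] + sum(real_feats[j] for j in neighbors[i]))
        / (1 + len(neighbors[i]))
        for i in range(n)
    ]
    # Virtual nodes aggregate from real nodes; they send nothing back.
    new_virt = [
        sum(real_feats[j] for j in srcs) / len(srcs) for srcs in virt_sources
    ]
    return new_real, new_virt

# Two bonded real nodes and one virtual node listening to both.
new_real, new_virt = message_pass([1.0, 3.0], [(0, 1)], [[0, 1]])
```

Because the real-node update above never touches `new_virt`, deleting the virtual node would leave `new_real` unchanged, which is the property Chotrattanapituk describes.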
Cutting out the complexity
Since this virtual node graph neural network (VGNN) has virtual nodes representing phonons, it can skip many complicated calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.
The researchers proposed three different versions of VGNNs of increasing complexity. Each can be used to predict phonons directly from the atomic coordinates of the material.
Because their approach has the flexibility to rapidly model high-dimensional quantities, they can use it to estimate phonon dispersion relations in alloy systems. These complex combinations of metals and nonmetals are especially challenging for traditional modeling approaches.
The researchers also found that VGNNs offered slightly better accuracy when predicting a material’s heat capacity. In some cases, prediction errors were two orders of magnitude lower with their technique.
Li says that VGNNs can calculate phonon dispersion relations for several thousand materials in a matter of seconds using a personal computer.
This efficiency could allow scientists to search a much larger space of materials for those with specific thermal properties, such as superior heat storage, energy conversion, or superconductivity.
Moreover, the virtual node technique is not limited to phonons; it could also be used to predict challenging optical and magnetic properties.
In the future, the researchers want to refine the technique so the virtual nodes have greater sensitivity and can capture small changes that affect phonon structure.
“Scientists have been too comfortable using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very general approach that can be used to predict many high-dimensional quantities,” Li says.
“The authors’ innovative approach significantly extends the description of the solid-state neural network graph by incorporating key physics-based elements via virtual nodes, such as wavevector-dependent band structures and dynamical matrices,” says Olivier Delaire, assistant professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved in this work. “I believe the speedup in predicting complex phonon properties is incredible, several orders of magnitude faster than the state-of-the-art machine learning universal interatomic potential. Impressively, the advanced neural network captures fine-grained features and obeys physical rules. There is great potential to extend the model to describe other important material properties: electronic, optical, and magnetic spectra and band structures come to mind.”
This work is supported by the U.S. Department of Energy, the National Science Foundation, a Mathworks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and Oak Ridge National Laboratory.