Yet as researchers push the limits of plasma performance, they face new challenges in keeping plasmas under control, including one that involves bursts of energy escaping from the edge of a very hot plasma. These edge bursts sap overall performance and can even damage a reactor's plasma-facing components over time.
Now, a team of fusion researchers led by engineers from Princeton and the U.S. Department of Energy's Princeton Plasma Physics Laboratory (PPPL) has successfully deployed machine learning methods to suppress these harmful edge instabilities without sacrificing plasma performance.
Using their approach, which optimizes the system's suppression response in real time, the research team demonstrated the highest fusion performance without the presence of edge bursts at two different fusion facilities, each with its own set of operating parameters. The researchers reported their results on May 11 in Nature Communications, highlighting the vast potential of machine learning and other artificial intelligence systems to rapidly quash plasma instabilities.
“Not only did we show that our approach could maintain a high-performance plasma free of instabilities, but we also showed that it can work at two different facilities,” said research leader Egemen Kolemen, associate professor of mechanical and aerospace engineering and the Andlinger Center for Energy and the Environment. “We demonstrated that our approach is not only effective, but versatile as well.”
The costs of high confinement
Researchers have long experimented with different ways to operate fusion reactors to achieve the conditions necessary for fusion. One of the most promising approaches is to operate a reactor in high confinement mode, a regime characterized by the formation of a strong pressure gradient at the plasma edge that provides enhanced plasma confinement.

However, high confinement mode has historically been accompanied by instabilities at plasma boundaries, a challenge that has forced fusion researchers to find creative workarounds.
One solution is to use the magnetic coils that surround a fusion reactor to apply magnetic fields to the edge of the plasma, breaking up structures that might otherwise grow into full-blown edge instabilities. However, this solution is imperfect: even when it succeeds in stabilizing the plasma, applying these magnetic perturbations typically degrades overall performance.
“We have a way to control these instabilities, but in return, we have had to sacrifice performance, which is one of the main motivations for operating in high confinement mode in the first place,” said Kolemen, who is also a research physicist at PPPL.
The performance loss stems in part from the difficulty of optimizing the shape and amplitude of the applied magnetic perturbations, which in turn arises from the computational intensity of existing physics-based optimization approaches. These conventional methods involve a complex set of equations and can take tens of seconds to optimize for a single point in time, which is far from ideal when plasma behavior can change in just a few milliseconds. As a result, fusion researchers have had to predetermine the shape and amplitude of the magnetic perturbations before each fusion run, losing the ability to make adjustments in real time.
“In the past, everything had to be pre-programmed,” said co-first author SangKyeun Kim, a research scientist at PPPL and former postdoctoral researcher in Kolemen’s group. “That limitation has made it difficult to truly optimize the system, because it means the parameters cannot be adjusted in real time based on evolving plasma conditions.”
Boosting performance by cutting computation time
The Princeton-led team’s machine learning approach slashes computation time from tens of seconds to the millisecond scale, opening the door to real-time optimization. The machine learning model, which is a far more efficient surrogate for existing physics-based models, can monitor the plasma’s state from one millisecond to the next and alter the amplitude and shape of the magnetic perturbations as needed. This allows the controller to strike a balance between edge burst suppression and high fusion performance, without sacrificing one for the other.
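To make that contrast concrete, the following minimal Python sketch shows the general shape of such a surrogate-based control loop: a fast learned model stands in for the slow physics code, and each control cycle scores candidate coil settings for predicted burst risk and confinement before picking the best trade-off. Every name, signature, and number below is a hypothetical placeholder, not the team's actual controller.

```python
# Minimal, hypothetical sketch of a surrogate-based control loop.
# All names (Surrogate, choose_coil_setting, etc.) and numbers are
# illustrative placeholders, not the team's actual code.
import numpy as np


class Surrogate:
    """Stand-in for a fast ML surrogate of a physics code.

    Maps (plasma state, candidate coil setting) to a predicted
    edge-burst risk and a predicted confinement quality in well under
    a millisecond, replacing a physics computation that would take
    tens of seconds.
    """

    def predict(self, state: np.ndarray, coils: np.ndarray) -> tuple[float, float]:
        # A trained network would run here; these formulas are dummies.
        burst_risk = float(np.clip(np.tanh(state.mean() - coils.sum()), 0.0, 1.0))
        confinement = float(1.0 - 0.3 * np.abs(coils).mean())
        return burst_risk, confinement


def choose_coil_setting(state, candidates, surrogate, risk_limit=0.1):
    """Among candidate coil settings, pick the one that keeps predicted
    burst risk below a threshold while maximizing predicted confinement."""
    best, best_conf = None, -np.inf
    for coils in candidates:
        risk, conf = surrogate.predict(state, coils)
        if risk <= risk_limit and conf > best_conf:
            best, best_conf = coils, conf
    if best is None:
        # No candidate is safe enough: fall back to the lowest-risk one.
        best = min(candidates, key=lambda c: surrogate.predict(state, c)[0])
    return best


surrogate = Surrogate()
# A coarse grid of candidate amplitudes for two (hypothetical) coil groups.
candidates = [np.array([a, b]) for a in (0.0, 0.5, 1.0) for b in (0.0, 0.5, 1.0)]

for cycle in range(5):  # each iteration stands in for one ~ms control cycle
    state = np.random.rand(8)  # placeholder for real-time diagnostic signals
    coils = choose_coil_setting(state, candidates, surrogate)
    # apply_coil_currents(coils)  # hardware actuation, omitted in this sketch
    print(f"cycle {cycle}: coil amplitudes -> {coils}")
```

The design point the sketch illustrates is speed: because a surrogate's forward pass takes milliseconds rather than the tens of seconds needed by the full physics code, the selection step can run inside the control loop itself instead of being fixed before the experiment begins.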

“With our machine learning surrogate model, we reduced the computation time of a code we wanted to use by orders of magnitude,” said co-first author Ricardo Shousha, a postdoctoral researcher at PPPL and former graduate student in Kolemen’s group.
Because their approach is ultimately grounded in physics, the researchers said it would be straightforward to apply to different fusion devices around the world. In their paper, for example, they demonstrated the success of their approach at both the KSTAR tokamak in South Korea and the DIII-D tokamak in San Diego. At both facilities, each of which has a unique set of magnetic coils, the method achieved strong confinement and high fusion performance without harmful plasma edge bursts.
“Some machine learning approaches have been criticized for being purely data-driven, meaning they are only as good as the quantity and quality of the data they are trained on,” Shousha said. “But because our model is a surrogate for a physics code, and the principles of physics apply equally everywhere, it is easier to extrapolate our work to other contexts.”
The team is already working to refine their model for compatibility with other fusion devices, including planned future reactors such as ITER, which is currently under construction.
One of the Kolemen group’s active areas of work is improving the predictive capabilities of their model. For example, the current model still relies on encountering several edge bursts during the optimization process before it can operate effectively, which poses undesirable risks for future reactors. If researchers can instead improve the model’s ability to recognize the precursors of these harmful instabilities, it might be possible to optimize the system without encountering a single edge burst.
Kolemen said the ongoing work is another example of AI’s potential to overcome long-standing bottlenecks in the development of fusion energy as a clean energy resource. Previously, researchers led by Kolemen successfully deployed a separate AI controller to predict and avoid a different type of plasma instability in real time on the DIII-D tokamak.
“For many of the challenges we’ve faced in fusion, we’ve gotten to the point where we know how to approach a solution, but our ability to implement those solutions is limited by the computational complexity of our traditional tools,” Kolemen said. “These machine learning approaches have unlocked new ways of approaching these well-known fusion challenges.”
The paper, “Highest fusion performance without harmful edge energy bursts in tokamak,” was published on May 11 in Nature Communications. In addition to Kolemen, Kim, and Shousha, co-authors include SM Yang, Q. Hu, A. Bortolon, and J. Snipes of PPPL; A. Jalalvand of Princeton University; SH Han, YM Jeon, MW Kim, WH Ko, and JH Lee of the Korea Institute of Fusion Energy; J.-K. Park and Y.-S. Na of Seoul National University; NC Logan, AO Nelson, C. Paz-Soldan, and A. Battey of Columbia University; R. Nazikian of General Atomics; R. Wilcox of Oak Ridge National Laboratory; R. Hong and T. Rhodes of the University of California, Los Angeles; and G. Yu of the University of California, Davis. The work was supported by the U.S. Department of Energy, the National Research Foundation of Korea, and the Korea Institute of Fusion Energy.