February 2, 2022
Sixth in a series of articles describing aspects of the National Ignition Facility’s record-breaking 1.3-megajoule experiment.
You wouldn’t set out on a cross-country road trip without a current map or a functioning GPS. You might eventually reach your destination, but you could take a lot of wrong turns before you got there.
Likewise, inertial confinement fusion (ICF) researchers don’t design and carry out experiments on the National Ignition Facility (NIF), the world’s largest and highest-energy laser system, without the guidance of increasingly sophisticated computational models and simulations informed by diagnostic data and analysis of previous experiments.
Simulations allow the researchers to design implosion platforms and conduct trial-and-error tests of their theories before trying them out in time-consuming and costly experiments.
Computationally intensive 3D models, 2D integrated simulations that model cross-beam-energy transfer (CBET) inline, large computational ensembles, and simplified mathematical and data-based models played important roles over the decade of research building to the record-setting Hybrid-E experiment on Aug. 8 of last year that produced more than 1.3 megajoules of fusion energy.
The required precision for simulating these experiments, coupled with the world-class high-performance computing resources at Lawrence Livermore National Laboratory (LLNL), can also improve the fidelity of simulations of interest to the National Nuclear Security Administration's (NNSA) Stockpile Stewardship Program.
Nuclear weapon scientists and engineers use sophisticated simulations informed by NIF experimental results to assess the performance of nuclear weapon systems and perform weapon science and engineering calculations. NIF experimental data and simulations are combined with deep learning methods to improve understanding of the elements of ICF as well as the effects of aging on nuclear weapons in lieu of underground testing (see “Fusion Supports the Stockpile”).
NIF fires 192 high-energy laser beams into a fusion target the size of a pencil eraser and is capable of producing temperatures of more than 100 million kelvins and pressures of hundreds of billions of Earth atmospheres (see “How NIF Targets Work”).
NIF’s progress toward ignition—formally defined as when the energy produced by fusion reactions meets or exceeds the laser energy delivered to the target—was largely driven by improvements in the design of experiments fielded on NIF that were shaped by the results of previous shots and by enhanced experiment-based modeling and simulation.
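By this definition, the question is simply whether the target's "gain"—fusion energy out divided by laser energy in—reaches one. As a minimal illustration (using approximate, publicly reported figures for the Aug. 8 shot, not official values):

```python
# Ignition, as formally defined above: fusion energy produced meets or
# exceeds the laser energy delivered to the target.
# The numbers below are approximate public figures, for illustration only.
laser_energy_mj = 1.9    # ~laser energy delivered to the target, in megajoules
fusion_yield_mj = 1.35   # ~fusion energy produced, in megajoules

gain = fusion_yield_mj / laser_energy_mj
print(f"target gain ~ {gain:.2f}; ignition threshold reached: "
      f"{fusion_yield_mj >= laser_energy_mj}")
```

With these figures the gain is about 0.7—at the threshold of ignition but not yet past it, which is exactly where the article places the Aug. 8 experiment.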
Computer codes were used to design, optimize, construct, and field the experiments while continually advancing understanding of key multi-physics mechanisms.
The hybrid experiments have increased the scale of the implosion to couple more energy to the hot spot compared to previous high-yield baseline designs and to achieve high levels of hot-spot pressure—moving NIF closer to ignition at the same level of laser energy and power. The record-breaking shot also benefited from a high-quality diamond target capsule and other aspects of new target design.
“We had to improve the efficiency of the hohlraum and balance many other design parameters at the same time to achieve the high pressures,” said LLNL physicist Annie Kritcher, lead designer for the Hybrid-E experiment that brought NIF to the threshold of ignition.
“Doing this in an integrated design sense, where we considered the detailed capsule physics together with what we could achieve with the hohlraum, was a key aspect” of the experiment’s success, Kritcher said. “We rely on the models for a lot of things: shock timing, energetics, symmetry during most of the (laser) pulse, implosion velocity, and so forth.”
While “modeling these extreme plasma conditions is very difficult,” she added, “they do extremely well in calculating the early-time radiation drive symmetry for these experiments thanks to the development of the inline CBET modeling capability.
“These experiments will continue to improve our modeling capability.”
Simulations are also used to infer certain properties of NIF experiments that can’t be measured directly, such as the extent of alpha heating in the cold fuel surrounding the hot spot, referred to as yield amplification, or Yamp. In the Aug. 8 experiment, energy yield was amplified by about 30 times over what it would have been without alpha heating—about six times more than the best previous Yamp.
“That sudden jump is what we’re looking for,” said ICF Chief Scientist Omar Hurricane. “That’s one of the indications that ignition is about to happen. You make a relatively small change in the inputs and suddenly you see this jump—that’s the threshold, or cliff-like, behavior we’ve been talking about for years.
“It’s super exciting because we’re starting to see it’s not a fantasy anymore. It’s actually happening.”
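The yield amplification figure quoted above is, at bottom, a simple ratio: the measured yield divided by the yield a simulation predicts with alpha-particle self-heating switched off. A minimal sketch, with illustrative numbers rather than actual NIF data:

```python
# Yield amplification (Yamp): measured fusion yield divided by the yield
# a simulation predicts when alpha-particle self-heating is turned off.
# Both numbers below are illustrative, not actual NIF data.

def yield_amplification(measured_yield, simulated_no_alpha_yield):
    """Yamp = Y_measured / Y_no-alpha (dimensionless)."""
    return measured_yield / simulated_no_alpha_yield

measured = 1.35   # MJ, roughly the Aug. 8 result
no_alpha = 0.045  # MJ, hypothetical simulated yield without alpha heating
print(f"Yamp ~ {yield_amplification(measured, no_alpha):.0f}")
```

Because the no-alpha yield cannot be measured directly, the denominator must come from simulation—which is why Yamp is an inferred quantity rather than a direct measurement.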
ICF experiments are modeled primarily with LLNL’s HYDRA multi-physics radiation-hydrodynamics code and the legacy LASNEX ICF code, which have been extensively validated against experiments over several decades, as documented in hundreds of publications.
HYDRA, with more than one million lines of code, has packages to simulate radiation transfer, atomic physics, hydrodynamics, laser propagation, and a number of other physics effects so that it can simulate a broad range of NIF experiments.
“We know the (standard multi-physics) models have difficulty, especially in the late-time beam propagation, and that is crucial for maintaining symmetry and implosion performance,” Kritcher said. “So in that regime, we use data-driven models to guide design choices. Knowing where the models can be trusted and where semi-analytical models were needed to make progress was important in this research.”
A Change in Awareness
The reliability of the models was an important issue in early NIF ignition experiments that produced fusion yields of only a few kilojoules—far less than the models predicted. A 2015 National Nuclear Security Administration review of the ignition program found that models and codes predicting that NIF would attain ignition conditions “are not capturing the necessary physics to make such predictions with confidence.”
The review also said experimental efforts were “frustrated by the inability to distinguish key differences” between laser shots, with similar experimental designs producing inconsistent results.
“The models we were using previously were very optimistic in their predictions,” Hurricane acknowledged. “Knowing where you can trust the model and where you can’t is not really an issue for the model, it’s more about a change in your own awareness. This skill is what we call ‘designer judgment,’ which is particularly important for the people who are responsible for nuclear weapons codes” (see “Climbing the Mountain of Fusion Ignition”).
In addition, he said, the degradations in implosion performance caused by such factors as the “tent” that suspends the target capsule in the hohlraum, the tube used to fill the capsule with deuterium-tritium (DT) fuel, and contamination of the fuel by capsule material, “weren’t in the models from years back, either because we didn’t have the computational power to include them, or we didn’t think they mattered, or we just didn’t know they were there.
“Those are the key aspects of higher-fidelity representations of what we are actually doing in the experiments—more capsule detail, more fill-tube detail, more tent detail, more hohlraum detail—we now have much higher-fidelity representations than we had before” (see “New NIF Implosion Models Reduce Discrepancies”).
In particular, advances in computer architecture have enabled high-resolution 3D modeling and simulations by Dan Clark, Chris Weber, and their colleagues that have opened the door to identifying several key effects degrading implosion performance.
“We’ve spent the last several years evolving our capsule modeling capabilities to include all of the various degradations that we’re aware of that are impacting NIF implosions in a single ‘integrated model,’” Clark said. “It’s challenging given the disparate scales at play, from sub-micron to millimeters, and the three-dimensionality of our implosions. That means you need a model that is both high-resolution and three-dimensional, and that’s computationally very demanding.
“We’re not there quite yet,” Clark said, “but we’ve made a lot of progress combining these two requirements—high resolution and 3D—to the point that our simulations are beginning to capture many of the features we see in experiment. And the more data you’re able to match on any given experiment, the more confidence you can have in applying that model to extrapolating and predicting future experiments, which is ultimately what we’re aiming for—a tool that can help guide us from 100 kilojoules to 1 megajoule and ultimately to 10 or 100 megajoules.”
In addition, 3D hohlraum modeling has been invaluable in assessing the impact of non-azimuthally symmetric target features, deepening researchers’ understanding of the delicate interplay among the different components of the NIF target as well as suggesting further improvements.
“There is also an effort afoot to improve the predictive capability of hohlraum modeling,” said Denise Hinkel, team lead for focused hohlraum modeling. “Typically, an integrated hohlraum-capsule simulation overpredicts the radiation drive that implodes the capsule as well as the amount of drive at the hohlraum waist, which impacts implosion symmetry,” she said. “Hypothesis-driven, focused experiments are underway at NIF to enhance both scientific understanding and predictive capability for hohlraum modeling.”
Keeping It Simple
Some issues, such as the multiple sources of asymmetries that degrade implosion performance, are so complex that comprehensive simulations become difficult to analyze. When that happens, researchers have found ways to simplify aspects of the implosion to reveal hidden relationships.
An example is the piston model, a mathematical abstraction of an implosion developed by Hurricane, Hybrid-E team member Dan Casey, and their colleagues.
“The ICF implosion is a very complicated thing, and that’s why we rely on the computer simulations so heavily,” Hurricane said. “But for the particular problem of asymmetry, you can abstract an implosion to a simple arrangement of pistons that looks a lot like a piston-driven rotary engine that you see in old biplanes from the 1920s.
“You can make a mathematical model that’s actually tractable—you can solve the math, and you see a number of simple but deep relationships between things we can measure and see in the experiments that were not obvious before.
“It’s an enormous simplification of a very complicated system that in this case was actually very, very productive and useful,” Hurricane said. “You get excited when you can explain some really complicated systems with a handful of simple equations.”
Simplification was also the goal of a data-based model, developed by LLNL physicist Debbie Callahan, a co-founder of the hybrid series of experiments, for the late-time radiation drive at the hohlraum waist, which affects implosion symmetry. Simple models can be used to quickly scope out the design parameter space under different assumptions and then identify interesting parts of that space for further study with more sophisticated tools.
“A key aspect of the hybrid strategy is to use some empirical models to guide us through the huge design space,” Kritcher said. “Being able to narrow down to something that looks like it’s going to meet all the criteria in just a handful of shots is very difficult. So, we have the famous ‘Debbie curve’ for hohlraum designers to figure out how to balance all the hohlraum parameters against what you’re trying to achieve for the capsule implosion.”
Large computational ensembles are another way to look for patterns in simulated data. An ensemble, a technique also used in machine learning for applications such as weather forecasting, is a suite of sometimes thousands of simulations in which the input parameters are varied slightly from run to run to create a large database of inputs and outputs. Ensembles are used to develop quantitative, detailed estimates of the relative impact of physics phenomena and degradation mechanisms in ICF experiments.
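The structure of an ensemble can be sketched in a few lines: sweep a handful of input parameters, run the simulation for each combination, and store inputs and outputs together. In this toy version the "simulation" is a stand-in formula, and the parameter names and values are invented for illustration:

```python
# Toy ensemble: vary input parameters, run a stand-in "simulation" for each
# combination, and collect all inputs and outputs into a database that can
# later be mined statistically or used to train machine-learning models.
import itertools

def simulate(velocity, asymmetry, fill_tube_scale):
    """Placeholder for an expensive radiation-hydrodynamics run (not a real code)."""
    # Invented response: yield rises steeply with velocity, falls with degradations.
    return velocity ** 3 * (1.0 - asymmetry) * (1.0 - 0.1 * fill_tube_scale)

velocities = [0.9, 1.0, 1.1]      # hypothetical implosion-velocity scale factors
asymmetries = [0.0, 0.05, 0.10]   # hypothetical drive-asymmetry levels
fill_tubes = [0.5, 1.0]           # hypothetical fill-tube perturbation scales

database = [
    {"velocity": v, "asymmetry": a, "fill_tube": f, "yield": simulate(v, a, f)}
    for v, a, f in itertools.product(velocities, asymmetries, fill_tubes)
]

best = max(database, key=lambda row: row["yield"])
print(len(database), "ensemble members; best inputs:",
      best["velocity"], best["asymmetry"], best["fill_tube"])
```

Real ensembles replace the grid with thousands of cleverly sampled points and the placeholder with full radiation-hydrodynamics runs, but the bookkeeping is the same.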
Researchers are also building improved predictive models that can be used to foresee and design new NIF experiments. These models combine simulation prediction with the totality of existing NIF implosion data to give more accurate results (see “New Machine-Learning Tactic Sharpens NIF Shot Predictions”).
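As a cartoon of that idea—the real NIF work uses deep learning and far richer data, and every number here is invented—one can learn a single correction factor that maps simulated yields onto measured ones and then apply it to new simulation predictions:

```python
# Minimal sketch of a data-calibrated predictive model: learn one
# least-squares correction factor from past (simulated, measured) pairs,
# then apply it to a new simulation prediction. All data is invented.
past_sim = [0.8, 1.0, 1.2]    # simulated yields for past shots (arbitrary units)
past_meas = [0.4, 0.5, 0.6]   # measured yields for the same shots

# Single-factor least-squares fit: c = sum(s*m) / sum(s*s)
c = sum(s * m for s, m in zip(past_sim, past_meas)) / sum(s * s for s in past_sim)

def calibrated_prediction(sim_yield):
    """Scale a raw simulation prediction by the learned correction factor."""
    return c * sim_yield

print(f"correction factor c = {c:.2f}")
```

For this made-up data the simulations overpredict by a factor of two, so the fit recovers c = 0.5; the actual machine-learning models combine many features of each shot, not just one scalar.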
“There’s this complicated back and forth between diagnostics—a physical picture of what the implosion should be doing—and our models,” Hurricane said. “It’s an involved, iterative process. It takes time to understand what you’re doing and what you’re seeing.
“It’s a process of learning,” he said, “because you can’t always anticipate all the problems. Sometimes your guesses at what the problems will be are wrong—Mother Nature tells you different. Usually, she finds more problems for you than what you thought of on your own.”