Supercomputer Code Can Help Capture Carbon, Reduce Global Warming

Software allows designers to model liquid and gas flows before building expensive facilities

By Lawrence Bernard

Jordan Musser, a scientist at the National Energy Technology Laboratory, leads ECP’s MFIX-Exa subproject.

More than a third of the carbon dioxide (CO2) emissions that contribute to global warming in the United States come from power plants. What if scientists could capture the CO2 and store it where it would not contribute to climate change? Several technologies aim to do just that, and exascale computing can help.

Carbon dioxide is a greenhouse gas, produced by burning fossil fuels, that contributes to global warming. As with many multiphase flow devices, one of the biggest challenges in deploying carbon capture technologies lies in scaling up from laboratory designs to industrial plants. The MFIX-Exa software subproject of the US Department of Energy’s (DOE’s) Exascale Computing Project (ECP) is helping achieve that scale-up.

“We are developing the tools to allow scientists and engineers to impact large-scale chemical reactors for a variety of industries,” said Jordan Musser, the principal investigator of MFIX-Exa. “This is a toolset that allows computer modeling to determine what’s going on inside gas–solid flow reactors. But we need this next-generation computational capability” because of the complexity of the calculations required.

Tracking Billions of Particles

This type of modeling tracks billions of individual particles simultaneously to simulate gas–solid flows typical of a power plant. Standard computing cannot ordinarily do that, Musser said. With MFIX-Exa, scientists can provide insight to engineers when, for example, a chemical looping reactor is not performing well. And, he added, using exascale computing, they can recognize when things go wrong far more rapidly than in a lab or experimental setting.
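The particle side of this kind of gas–solid flow simulation can be sketched in miniature: each particle feels gravity plus drag toward the local gas velocity, and the solver marches all particles forward in small time steps. The snippet below is an illustrative toy, not MFIX-Exa's actual numerics; the linear drag law, the `tau_p` relaxation time, and the uniform gas field are all simplifying assumptions, and a production code couples billions of such particles to a full fluid solver.

```python
import numpy as np

def advance_particles(pos, vel, gas_vel, dt, tau_p=0.05, g=-9.81):
    """One explicit time step for n particles (SI units).

    tau_p is a particle relaxation time (s) standing in for a real
    drag law: it sets how quickly particles adjust to the gas velocity.
    """
    drag = (gas_vel - vel) / tau_p          # linear (Stokes-like) drag
    accel = drag + np.array([0.0, 0.0, g])  # drag plus gravity
    vel = vel + dt * accel
    pos = pos + dt * vel
    return pos, vel

# 1,000 particles released at rest in an upward 2 m/s gas stream
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=(1000, 3))
vel = np.zeros((1000, 3))
gas = np.array([0.0, 0.0, 2.0])

for _ in range(2000):  # 2 s of simulated time, i.e. many relaxation times
    pos, vel = advance_particles(pos, vel, gas, dt=0.001)

# Particles settle toward the terminal balance v_z = u_gas + tau_p * g
print(round(float(vel[:, 2].mean()), 2))  # → 1.51
```

Even this toy shows why the real problem needs exascale machines: each step touches every particle, and an industrial reactor simulation replaces 1,000 particles with billions, a nonuniform gas field, collisions, and chemistry.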

Musser, a scientist at DOE’s National Energy Technology Laboratory (NETL), is a mechanical engineer and applied mathematician who has always been interested in modeling physical processes. He began his career at NETL as a fellow of the Oak Ridge Institute for Science and Education in 2009. A winner of the Presidential Early Career Award for Scientists and Engineers—the highest distinction bestowed by the US government on early-career researchers—Musser combines his engineering know-how with computational mathematics to solve real-world problems. Also crucial to developing this code are Weiqun Zhang and Andrew Myers, scientists at DOE’s Lawrence Berkeley National Laboratory in California, and William Fullmer, a NETL research engineer.

Uneven Distribution Challenge Met

The MFIX-Exa team faced several challenges in optimizing the code for exascale, but one of the biggest is that local particle concentrations can vary greatly in space and time, making efficient use of HPC resources difficult. The team addressed this challenge with a dual-grid approach that separates the fluid and particle computations, allowing the particle work to be periodically rebalanced across the system. The team rewrote the physical models from the legacy MFIX code and ported them to run on modern GPUs, successfully testing the code on supercomputers at DOE’s Oak Ridge National Laboratory (ORNL).
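The rebalancing idea can be illustrated with a small sketch: boxes of particle work are periodically reassigned to compute ranks based on their current particle counts, so no rank sits idle while another grinds through a dense region. The greedy heuristic, the box names, and the counts below are all illustrative assumptions, not MFIX-Exa's actual algorithm (which builds on the AMReX framework's load balancing).

```python
import heapq

def rebalance(particle_counts, n_ranks):
    """Greedily assign boxes (largest particle count first) to the
    currently least-loaded rank; returns a box -> rank mapping."""
    heap = [(0, rank) for rank in range(n_ranks)]  # (load, rank) pairs
    heapq.heapify(heap)
    assignment = {}
    for box in sorted(particle_counts, key=particle_counts.get, reverse=True):
        load, rank = heapq.heappop(heap)       # least-loaded rank so far
        assignment[box] = rank
        heapq.heappush(heap, (load + particle_counts[box], rank))
    return assignment

# Particle concentrations vary wildly across boxes, as in a real reactor
counts = {"box0": 900_000, "box1": 50_000, "box2": 40_000,
          "box3": 850_000, "box4": 60_000, "box5": 100_000}
ranks = rebalance(counts, n_ranks=2)
loads = [sum(c for b, c in counts.items() if ranks[b] == r) for r in range(2)]
print(loads)  # the two ranks end up with nearly equal particle work
```

Because particles migrate as the flow evolves, a real solver would rerun this assignment every so many time steps, while the fluid grid's decomposition stays fixed; that separation is the point of the dual-grid design described above.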

The code is geared toward simulating commercial chemical looping reactors, a technology that reduces CO2 emissions through capture and storage. The MFIX-Exa code calculates the fluid dynamics in these types of reactors and can simulate large-scale commercial reactors during their design phase. This allows engineers to prototype reactors with a high-fidelity model and diagnose issues before a plant is built.

The code “allows us to look inside the corrosive environment of these reactors and see how the process is behaving,” Musser said. An extension of the legacy MFIX code, which was primarily used for lab-scale devices, MFIX-Exa will allow a ramp-up of problem size, speed, and accuracy on exascale computers, like Frontier, over the next decade, Musser said.

Exascale Computing Can Help Reduce Risk

At ORNL, Frontier recently became the first supercomputer to break the exascale barrier, delivering 1.1 exaflops of performance—more than a quintillion calculations per second. The system will enable researchers to develop critically needed technologies for the country’s energy, economic, and national security missions, helping address problems of critical importance to the nation that lacked realistic solutions as recently as five years ago.

Carbon capture and storage technologies such as chemical looping reactors require high-performance computing to give designers data to make informed decisions before building and testing such a system.

“Exascale computing can model these technologies to optimize these systems, which will reduce the risk of failure,” Musser said.

MFIX-Exa is not limited to chemical looping reactors and carbon capture technologies. The code can also help analyze a wide variety of engineering devices that operate with a gas and solids mixture, including devices found in the pharmaceutical, steel, and cement industries, Musser said. As exascale computing ramps up, these industries could benefit from a code originally meant for carbon capture and storage.

This research is part of the DOE-led Exascale Computing Initiative (ECI), a partnership between DOE’s Office of Science and the National Nuclear Security Administration. The ECP, launched in 2016, brings together research, development, and deployment activities as part of a capable exascale computing ecosystem to ensure an enduring exascale computing capability for the nation.