Recent Success Snapshots: Application Highlight Roundup

The following narrative snapshots chronicle highlights from some of the Application Development efforts within the US Department of Energy's (DOE) Exascale Computing Project (ECP).

ExaStar: Exascale Models of Stellar Explosions: Quintessential Multiphysics Simulations

Through observation, astronomers have concluded that heavy elements were produced during the early history of the galaxy, but a fuller understanding of where those elements originated is still needed. They are also keenly interested in probing the behavior of matter and neutrinos at extreme densities and in determining the sources of gravitational waves. The research team of the ECP effort called ExaStar is delving into these three core questions of nuclear astrophysics.

In the pursuit of answers, the ExaStar researchers are developing a new computer code suite called Clash, a component-based multiphysics toolkit built on adaptive mesh refinement (AMR). Clash will accurately simulate coupled hydrodynamics, radiation transport, thermonuclear kinetics, and nuclear microphysics for stellar explosion simulations. The suite will reach exascale efficiency by integrating current multicore- and many-core-efficient local physics packages into a task-based asynchronous execution framework built on current AMR technology.
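Clash's meshing layer comes from the AMReX Co-Design Center (see the progress list below). As a rough illustration of the block-structured AMR data pattern such a toolkit builds on, the following C++ sketch uses the public AMReX API to create a distributed field and apply a local update to each box. The domain size and the update itself are placeholders, not ExaStar code, and a 3D AMReX build is assumed.

```cpp
#include <AMReX.H>
#include <AMReX_MultiFab.H>

int main(int argc, char* argv[]) {
    amrex::Initialize(argc, argv);
    {
        // A 128^3 domain chopped into 32^3 boxes that AMReX distributes
        // across MPI ranks (sizes here are arbitrary for the sketch).
        amrex::Box domain(amrex::IntVect(0, 0, 0), amrex::IntVect(127, 127, 127));
        amrex::BoxArray ba(domain);
        ba.maxSize(32);
        amrex::DistributionMapping dm(ba);

        // One field component (e.g., density) with one ghost cell.
        amrex::MultiFab density(ba, dm, 1, 1);
        density.setVal(1.0);

        // Loop over the boxes owned by this rank and apply a local update;
        // ParallelFor offloads the body to the GPU when AMReX is built for one.
        for (amrex::MFIter mfi(density, true); mfi.isValid(); ++mfi) {
            const amrex::Box& bx = mfi.tilebox();
            amrex::Array4<amrex::Real> const& a = density.array(mfi);
            amrex::ParallelFor(bx,
                [=] AMREX_GPU_DEVICE (int i, int j, int k) {
                    a(i, j, k) *= 2.0;  // placeholder local physics
                });
        }
    }
    amrex::Finalize();
    return 0;
}
```

In a real AMR code, a hierarchy of such box arrays at successively finer resolution is regenerated as the solution evolves; the sketch shows only a single level.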

Progress to Date

  • Verified a new neutrino transport model in the Clash computer code suite.
  • Completed a GPU implementation of nuclear kinetics and nonpolytropic equation of state.
  • Integrated adaptive meshing from the AMReX Co-Design Center into Clash and performed baseline calculations on the Titan supercomputer.

Principal Investigator

Daniel Kasen, Lawrence Berkeley National Laboratory

Collaborating Institutions

Lawrence Berkeley National Laboratory, Oak Ridge National Laboratory, Argonne National Laboratory, SUNY Stony Brook

Related Links

ExaStar project website

ExaStar project page on the ECP website

Article by Rachel Harken, Oak Ridge Leadership Computing Facility: OLCF Scientist Talks Early Summit Results at APS Meeting

ExaStar project simulation of the abundance of Nickel-56 (56Ni) in the late stages of a core-collapse supernova explosion

The ExaStar project simulated the abundance of Nickel-56 (56Ni) in the late stages of a core-collapse supernova explosion, just before the shock wave formed in the explosion reaches the surface of the star (i.e., more than 20 hours after the core initially collapsed at the center of the star). The turbulent mushrooms seen at various angles will produce characteristic structures in the supernova remnant, and the radioactive decay of the nickel will be visible in gamma rays. The combination of the Summit supercomputer and the ExaStar-enhanced FLASH5 code enabled the evolution to be followed significantly further than in typical core-collapse simulations. Simulations like this, which run from initiation to observable consequence, allow researchers to directly connect simulations to observations of astrophysical explosions. This ability to compare simulation with observation is necessary for understanding the origin of the elements in the universe, as well as a host of other fundamental physics, including the nature of dense matter and the generation of gravitational waves. Courtesy: the ExaStar project

E3SM-MMF: Cloud-Resolving Climate Modeling of Earth’s Water Cycle

Addressing the effects of climate change on the global and regional water cycles is one of the top objectives, and most difficult challenges, in climate change prediction. The reason is that current Earth system models are limited in their ability to represent the complex interactions between large-scale atmospheric motions and the motions within clouds and individual storms. The next-generation model that E3SM is employing will improve the scientific community's ability to assess the regional impacts of climate change on the water cycle, impacts that affect multiple sectors of the US and global economies.

E3SM is examining a multiscale modeling framework (MMF) approach to cloud-resolving modeling. Often referred to as superparameterization, this approach offers significant opportunities for model improvement but has yet to be fully explored because of limited computing resources, a situation that will change with exascale. The project is integrating a cloud-resolving convective parameterization, or superparameterization, into the DOE E3SM Earth system model using the MMF and exploring its full potential to scientifically and computationally advance climate simulation and prediction.

The E3SM team is designing the superparameterization to make full use of GPU-accelerated systems; the work also involves refactoring and porting other key components of the E3SM model for GPU systems. The acronym E3SM-MMF refers to the superparameterized version of the E3SM model being developed under this ECP effort.
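To make the coupling concrete, the simplified C++ sketch below shows the basic MMF pattern: each global grid column hosts its own embedded cloud-resolving model (CRM) that substeps at cloud-resolving timescales and hands an averaged state back to the global model in place of a conventional cloud parameterization. All names and the physics placeholder are illustrative assumptions, not E3SM-MMF code.

```cpp
#include <numeric>
#include <vector>

// Illustrative stand-ins for one global-model column and its embedded
// cloud-resolving model (CRM); these are not E3SM data structures.
struct GlobalColumn { double temperature = 300.0; };
struct CrmState { std::vector<double> temperature; };

// One MMF coupling step for a single column: the CRM takes many small
// substeps under large-scale forcing, then its horizontally averaged state
// replaces what a conventional cloud parameterization would have supplied.
void mmf_step(GlobalColumn& col, CrmState& crm,
              double dt_global, double dt_crm, double forcing) {
    const int nsub = static_cast<int>(dt_global / dt_crm);
    for (int n = 0; n < nsub; ++n) {
        for (double& t : crm.temperature) {
            t += dt_crm * forcing;  // placeholder for cloud-scale physics
        }
    }
    // Feed the averaged CRM state back to the global column.
    col.temperature = std::accumulate(crm.temperature.begin(),
                                      crm.temperature.end(), 0.0) /
                      crm.temperature.size();
}

int main() {
    GlobalColumn col;
    CrmState crm{std::vector<double>(64, col.temperature)};
    // Columns are independent of one another, so a loop over columns is
    // embarrassingly parallel and maps naturally onto GPUs, which is why
    // the superparameterization is a good target for the GPU port.
    mmf_step(col, crm, 1800.0, 10.0, 1.0e-4);
    return 0;
}
```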

Progress to Date

  • Completed an initial port of the superparameterization subcomponent to GPUs.
  • Completed an initial port of E3SM-MMF atmosphere/land “F” compsets (component sets) to the Summit supercomputer.
  • Completed in fiscal year 2019 the first port of the atmosphere model to Summit, running at 0.6 simulation years per day and achieving a speedup of 31x relative to the Titan supercomputer.

Principal Investigator

Mark Taylor, Sandia National Laboratories

Collaborating Institutions

Sandia National Laboratories, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Los Alamos National Laboratory, Argonne National Laboratory, Oak Ridge National Laboratory, University of California-Irvine, Colorado State University

Related Links

E3SM YouTube video

E3SM project page on the DOE Office of Science website

E3SM project page on the ECP website

Exascale Computing Project E3SM-MMF cloud-resolving model

Embedding a cloud-resolving model within a global model grid column allows better approximation of the effects of clouds on the larger-scale model. Courtesy: the E3SM project

EQSIM: High-Performance, Multidisciplinary Simulation for Regional-Scale Earthquake Hazard and Risk Assessments

Large earthquakes present a significant risk around the world and are a major issue across the DOE mission space, ranging from the safety of DOE’s own inventory of one-of-a-kind, mission-critical facilities to all major US energy systems. Beyond the DOE enterprise, addressing earthquake risk, in terms of both life safety and the effects of damage, is a major societal challenge for virtually every element of the built environment, including transportation, health, data, commerce, and all urban infrastructure.

ECP’s EQSIM project is tapping the tremendous developments occurring in high-performance computing, data collection, and data exploitation to help advance earthquake hazard and risk assessments. EQSIM application codes are removing the reliance on simplifying idealizations, approximations, and sparse empirical data; instead, they focus on resolving the uncertainties in earthquake processes with fundamental physics. Through EQSIM, regional-scale ground-motion simulations are becoming computationally feasible, and simulation models that connect the domains of seismology and geotechnical and structural engineering are drawing closer to being achieved.

The EQSIM application work is aimed at creating an unprecedented computational toolset and workflow for earthquake hazard and risk assessment. Starting with a set of existing codes, SW4 (a fourth-order, 3D seismic wave propagation model), NEVADA (a nonlinear, finite-displacement program for building earthquake response), and ESSI (a nonlinear finite-element program for coupled soil-structure interaction), EQSIM is building an end-to-end capability to simulate from the fault rupture to surface ground motions (earthquake hazard) and ultimately to infrastructure response (earthquake risk). The end goal of the EQSIM development effort is to remove computational limitations as a barrier to scientific exploration and understanding of earthquake phenomenology and to practical earthquake hazard and risk assessments.
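A minimal C++ sketch of that end-to-end hand-off appears below. Every type and function name is a hypothetical stand-in chosen for illustration; EQSIM's actual interfaces between SW4 and the engineering codes are not described in this document.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Hypothetical stand-ins for the stages named in the text above.
struct FaultRupture { double magnitude; };
struct GroundMotion { std::vector<double> surface_accel; };  // time history
struct InfrastructureResponse { double peak_response; };

// Stand-in for SW4's role: fault rupture -> regional surface ground
// motion (the earthquake hazard).
GroundMotion propagate(const FaultRupture& rupture) {
    return GroundMotion{std::vector<double>(100, 0.01 * rupture.magnitude)};
}

// Stand-in for NEVADA/ESSI's role: ground motion -> infrastructure
// response (the earthquake risk).
InfrastructureResponse respond(const GroundMotion& motion) {
    double peak = 0.0;
    for (double a : motion.surface_accel) peak = std::max(peak, a);
    return InfrastructureResponse{peak};
}

int main() {
    // Hazard feeds risk: the output of the wave-propagation stage is the
    // input to the structural-response stage.
    GroundMotion gm = propagate(FaultRupture{7.0});
    InfrastructureResponse r = respond(gm);
    std::cout << "peak response proxy: " << r.peak_response << "\n";
    return 0;
}
```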

Progress to Date

  • Ported SW4 to the Cori supercomputer, optimized key loops, and implemented a hybrid MPI/OpenMP execution model.
  • Successfully ported SW4 to GPU-based platforms, including Sierra and Summit, using the RAJA libraries to enable platform independence (a simplified illustration of this pattern follows this list); the resulting performance advances included an unprecedented earthquake ground-motion simulation resolved to 10 Hz in a regional-scale model of the San Francisco Bay Area.
  • Implemented a capability to overlay statistically representative stochastic geologic variations on a large SW4 geologic model to represent finite-scale geologic variability.
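The C++ sketch below illustrates the single-source RAJA pattern the SW4 port relies on: the loop body is written once, and an execution policy chosen at compile time decides where it runs (for example, RAJA::cuda_exec with device-resident arrays on Sierra or Summit). The sequential policy is used here so the sketch runs anywhere, and the one-dimensional second-difference update is a placeholder, not SW4's fourth-order wave-propagation stencil.

```cpp
#include <RAJA/RAJA.hpp>
#include <vector>

int main() {
    const int n = 1000;
    std::vector<double> u(n, 0.0), unew(n, 0.0);
    u[n / 2] = 1.0;  // a point disturbance to propagate
    double* up = u.data();
    double* np = unew.data();

    // Single-source kernel: swapping RAJA::seq_exec for a GPU policy such
    // as RAJA::cuda_exec<256> (with device-accessible memory) retargets
    // the same loop body to the GPU without rewriting it.
    RAJA::forall<RAJA::seq_exec>(RAJA::RangeSegment(1, n - 1),
        [=](int i) {
            // Placeholder second-difference update standing in for the
            // wave-propagation stencil.
            np[i] = up[i] + 0.5 * (up[i - 1] - 2.0 * up[i] + up[i + 1]);
        });
    return 0;
}
```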

Principal Investigator

David McCallen, Lawrence Berkeley National Laboratory

Collaborating Institutions

Lawrence Berkeley National Laboratory; Lawrence Livermore National Laboratory; University of Nevada, Reno; University of California, Davis

Related Links

EQSIM project page on the ECP website

Podcast interview with David McCallen

Exascale Computing Project EQSIM hazard and risk assessment

The EQSIM project is building an end-to-end simulation framework for earthquake hazard and risk assessment. Courtesy: the EQSIM project