LatticeQCD

Strong interactions between quarks and gluons account for 99% of the mass in the visible universe. Understanding these interactions and the phenomena that result from them is the central goal of nuclear physics. Over the past three decades, quantum chromodynamics (QCD) computations have been a driver of and have benefited from the spectacular advances in high-performance computing (HPC). Computing at the exascale is essential for meeting two decadal challenges of central importance to nuclear and high-energy physics. The LatticeQCD project is implementing scalable QCD algorithms to realistically simulate the atomic nucleus and reveal a deeper understanding of the fundamental organization of matter at the subatomic level. These calculations will help us understand the fundamental interactions and nature of matter beyond “elementary” particles.

Summary

Almost all the mass in the visible universe arises from interactions between quarks and gluons, which make up seven of the sixteen elementary particles in the Standard Model of particle physics. Exploring the ways in which these elementary particles interact is the central focus of modern particle physics and provides valuable information for nuclear physics and astrophysics researchers. High-performance computers have become an integral part of this process, helping researchers interpret data from high-energy physics experiments at particle accelerators and providing powerful simulations of quark-gluon interactions to guide future experiments.

Quantum chromodynamics (QCD) is the mathematical framework used to understand subatomic particle interactions within a nucleus, but applying the model to real-world interactions requires huge computational resources. The Exascale Computing Project’s Lattice Quantum Chromodynamics (LatticeQCD) team uses next-generation supercomputers to model subatomic interactions at unprecedented resolutions and timescales. The team’s applications are used by hundreds of researchers around the world to generate new experimental designs and to analyze and validate data more quickly than ever before.

Physicists have already used QCD to compute properties of particles such as protons and neutrons, including their masses and decay properties, but until recently they lacked the computational resources to study more fundamental particle interactions at sufficient spatial and temporal fidelity. Exascale computing now enables the study of these interactions, guiding experimental research on current and future particle accelerators such as the Large Hadron Collider (LHC) at CERN and the Electron-Ion Collider (EIC) under construction at Brookhaven National Laboratory.

While exascale computing is now a reality, it could not be fully exploited to solve QCD problems without major advances in algorithms tailored to exascale architectures. The LatticeQCD team has created and optimized software for these next-generation machines, improving simulation speeds by a factor of more than fifty since 2016 and enabling more extensive gauge-field simulations, a critical tool for understanding the dynamics of quarks and gluons. These improvements will enhance the return on investment in the multi-billion-dollar LHC and EIC, leading to a deeper understanding of how nature works at the tiniest length scales ever explored.

As the electric, atomic, and quantum revolutions have illustrated, advances in understanding the fundamental laws of nature precipitate dramatic technological innovation. By enabling physicists to explore how elementary particles interact, LatticeQCD will contribute to the development of the Standard Model of particle physics, refine the theoretical underpinnings of nuclear physics and astrophysics, and open the door to a new understanding of nature.

Technical Discussion

Atomic nuclei and most particles produced by high-energy accelerators are tightly bound composites of quarks and gluons. The fundamental interaction of these quarks and gluons is known as the strong (nuclear) force—one of the four fundamental forces of nature (i.e., strong, weak, electromagnetic, gravity). These nuclear interactions are explained with mathematical precision by QCD, and HPC is required to predict the consequences of this underlying theory. The properties of the resulting bound states and the nature of their strong, highly nonlinear interactions are the central focus of nuclear physics and an important context in which high-energy physics research must be conducted.

The couplings between the quarks and the W, Z, and Higgs bosons lie at the heart of the Standard Model of particle physics and can be studied, often with exquisite precision, by measuring the properties of the bound states formed from these quarks and gluons. Recent simulations of QCD on the previous generation of massively parallel computers have enabled a comparably precise theoretical understanding of these fundamental interactions of quarks and gluons.

The advances in exascale capability expected over the next decade promise to extend these exciting opportunities to even more groundbreaking discoveries in high-energy and nuclear physics. Exascale computing could realistically simulate the atomic nucleus and discover the first harbingers of new laws of nature, revealing a deeper theory that underlies the presently known elementary particles. These possibilities can be achieved only if emerging advances in computer science are harnessed to provide a software framework that allows lattice QCD code to exploit exascale architectures efficiently, enabling application scientists to create and refine that code as new challenges and ideas emerge.

The challenge problem consists of six computations representative of three fermion actions commonly used by the worldwide lattice-QCD community: highly improved staggered quarks (HISQ), domain-wall fermions (DWF), and Wilson-clover fermions. Each action has specific advantages for different physical problems in nuclear and high-energy physics.

HISQ: The benchmark problem measures two rates: first, the rate of generating a new gauge configuration using a molecular dynamics (MD) algorithm within a Markov chain Monte Carlo, and second, the rate of making a representative set of “measurements” on the gauge-field configuration. Both steps are required for any campaign to calculate quantities of scientific importance.
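
As a rough illustration of the update this first rate counts, the toy sketch below runs one leapfrog MD trajectory followed by a Metropolis accept/reject test, the core of the hybrid Monte Carlo algorithm used to generate gauge configurations. A free scalar field on a one-dimensional periodic lattice stands in for the SU(3) gauge field, and the action, trajectory length, and step size are illustrative assumptions, not the project’s HISQ setup.

```python
# Toy sketch of one Markov-chain update: a leapfrog molecular dynamics
# trajectory plus a Metropolis accept/reject. Stand-in action: a free
# scalar field on a 1-D periodic lattice (NOT the HISQ gauge action).
import numpy as np

rng = np.random.default_rng(0)

def action(phi):
    # S = sum_i [ 0.5*(phi[i+1] - phi[i])^2 + 0.5*phi[i]^2 ]
    return 0.5 * np.sum((np.roll(phi, -1) - phi) ** 2) + 0.5 * np.sum(phi ** 2)

def force(phi):
    # F = -dS/dphi for the toy action above
    return np.roll(phi, -1) + np.roll(phi, 1) - 3.0 * phi

def hmc_step(phi, n_md=20, dt=0.05):
    """One MD trajectory plus Metropolis test; returns (config, accepted)."""
    pi = rng.standard_normal(phi.shape)           # refresh momenta
    h_old = 0.5 * np.sum(pi ** 2) + action(phi)
    new = phi.copy()
    pi = pi + 0.5 * dt * force(new)               # leapfrog half step
    for _ in range(n_md - 1):
        new += dt * pi
        pi += dt * force(new)
    new += dt * pi
    pi += 0.5 * dt * force(new)                   # closing half step
    h_new = 0.5 * np.sum(pi ** 2) + action(new)
    if rng.random() < np.exp(h_old - h_new):      # accept/reject on energy
        return new, True
    return phi, False

phi, n_acc = np.zeros(64), 0
for _ in range(200):
    phi, acc = hmc_step(phi)
    n_acc += acc
print(f"acceptance rate: {n_acc / 200:.2f}")
```

In production codes the dominant cost hides inside the force evaluation: with dynamical fermions, each MD step requires solving large linear systems involving the fermion matrix, which is why this generation rate is a meaningful figure of merit.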

DWF: As with the HISQ action, two figures of merit have been adopted for the DWF component of the application. The first measures the rate at which a current state-of-the-art gauge-field ensemble can be generated, and the second calculates a suite of observables using this ensemble.
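
For a flavor of what a single “measurement” involves, the sketch below computes one of the simplest gauge-field observables, the average plaquette, on randomly generated SU(3) links. The lattice size and random links are placeholders for a configuration read from a real ensemble; a production measurement suite computes far more elaborate observables (correlation functions, matrix elements) on top of the same kind of link field.

```python
# Minimal sketch of one "measurement" on a gauge-field configuration:
# the average plaquette, Re Tr[U_mu(x) U_nu(x+mu) U_mu(x+nu)^† U_nu(x)^†]/3.
# Random SU(3) links stand in for a configuration read from an ensemble.
import numpy as np

rng = np.random.default_rng(1)
L, DIM = 4, 4                                  # illustrative 4^4 lattice

def random_su3():
    # QR-decompose a random complex matrix, then fix det = 1
    m = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    q, _ = np.linalg.qr(m)
    return q / np.linalg.det(q) ** (1.0 / 3.0)

# Link field U[mu, x, y, z, t]: one 3x3 matrix per site and direction
U = np.empty((DIM, L, L, L, L, 3, 3), dtype=complex)
for idx in np.ndindex(*U.shape[:-2]):
    U[idx] = random_su3()

def shift(field, mu):
    # field(x) -> field(x + mu_hat), periodic boundaries
    return np.roll(field, -1, axis=mu)

def dagger(field):
    return field.conj().swapaxes(-1, -2)

def avg_plaquette(U):
    total, count = 0.0, 0
    for mu in range(DIM):
        for nu in range(mu + 1, DIM):
            plaq = U[mu] @ shift(U[nu], mu) @ dagger(shift(U[mu], nu)) @ dagger(U[nu])
            total += np.einsum('...ii->...', plaq).real.mean() / 3.0
            count += 1
    return total / count

# Random links give a value near zero; a thermalized ensemble
# would give a physical value set by the gauge coupling.
print(f"average plaquette: {avg_plaquette(U):.4f}")
```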

Wilson-clover: The clover benchmark has two components. The first is the rate at which gauge configurations with dynamical clover fermions can be generated using an MD algorithm. In the second, several solutions of the Dirac equation are computed and contracted to construct observables.
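
The Dirac-equation solves in that second component are large sparse linear systems D x = b, which lattice codes typically handle with Krylov methods, often applied to the Hermitian normal equations D†D x = D†b. The sketch below shows a plain conjugate gradient solver on a stand-in Hermitian positive-definite operator (a one-dimensional lattice Laplacian plus a mass term); the actual Wilson-clover Dirac matrix, and the preconditioning and mixed-precision techniques used in production, are beyond this illustration.

```python
# Hedged sketch of a propagator solve: conjugate gradient on a
# Hermitian positive-definite stand-in for D^dagger D.
import numpy as np

def apply_op(x, m=0.1):
    # Stand-in operator: 1-D lattice Laplacian + mass term,
    # periodic boundaries (NOT the Wilson-clover Dirac matrix)
    return 2.0 * x - np.roll(x, 1) - np.roll(x, -1) + m * m * x

def conjugate_gradient(b, tol=1e-10, max_iter=1000):
    """Solve apply_op(x) = b with plain CG."""
    x = np.zeros_like(b)
    r = b - apply_op(x)              # initial residual (= b here)
    p = r.copy()                     # initial search direction
    rr = np.vdot(r, r).real
    bb = np.vdot(b, b).real
    for _ in range(max_iter):
        Ap = apply_op(p)
        alpha = rr / np.vdot(p, Ap).real
        x = x + alpha * p
        r = r - alpha * Ap
        rr_new = np.vdot(r, r).real
        if rr_new < tol ** 2 * bb:   # relative residual converged
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

b = np.zeros(128, dtype=complex)
b[0] = 1.0                           # point source, as in propagator solves
x = conjugate_gradient(b)
print("relative residual:", np.linalg.norm(b - apply_op(x)) / np.linalg.norm(b))
```

The contraction step then combines several such solutions (one per source position, spin, and color in a real calculation) into hadronic correlation functions.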

Principal Investigator(s)

Andreas Kronfeld, Fermilab

Collaborators

Fermilab, Brookhaven National Laboratory, Jefferson Lab, Argonne National Laboratory, Boston University, Columbia University, State University of New York at Stony Brook, College of William and Mary, Indiana University, University of Illinois at Urbana-Champaign, University of Utah
