These projects target crosscutting algorithmic methods that capture the most common patterns of computation and communication (known as motifs) in the ECP applications. The goal of the co-design activity is to integrate the rapidly developing software stack with emerging hardware technologies while developing software components that embody the most common application motifs.
Adaptive mesh refinement (AMR) is like a computational microscope; it allows scientists to “zoom in” on particular regions of space that are more interesting than others. Cosmologists might want to zoom in on detailed cosmic filaments; astrophysicists might focus on regions of nucleosynthesis; and combustion scientists may examine where the burning occurs.
Principal Investigators: John Bell, Lawrence Berkeley National Laboratory
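The "zoom in" idea can be made concrete with a toy sketch. The snippet below flags cells of a 1-D grid whose solution gradient exceeds a threshold and subdivides only those cells; the function names, the gradient-based criterion, and the refinement ratio are illustrative assumptions, not the API of a production AMR framework such as AMReX, which uses richer user-supplied error estimators and block-structured patch hierarchies.

```python
import numpy as np

def flag_cells(u, dx, threshold):
    # Flag cells where the finite-difference gradient |du/dx|
    # exceeds the threshold (a simple refinement criterion).
    grad = np.abs(np.gradient(u, dx))
    return grad > threshold

def refine_edges(edges, flags, ratio=2):
    # Split each flagged cell into `ratio` finer cells,
    # leaving unflagged cells coarse.
    new_edges = [edges[0]]
    for i, flagged in enumerate(flags):
        left, right = edges[i], edges[i + 1]
        n = ratio if flagged else 1
        for k in range(1, n + 1):
            new_edges.append(left + k * (right - left) / n)
    return np.array(new_edges)

# Coarse grid over [0, 1] with a sharp feature near x = 0.5.
edges = np.linspace(0.0, 1.0, 17)           # 16 coarse cells
centers = 0.5 * (edges[:-1] + edges[1:])
u = np.tanh((centers - 0.5) / 0.02)         # steep profile
flags = flag_cells(u, edges[1] - edges[0], threshold=5.0)
fine_edges = refine_edges(edges, flags)     # refined only near x = 0.5
```

Only the cells straddling the steep front are subdivided, so resolution is concentrated where the solution varies rapidly, which is the essence of the "computational microscope."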
Efficient exploitation of exascale architectures requires a rethink of the numerical algorithms used in large-scale applications of strategic interest to the DOE. Many large-scale applications employ unstructured finite element discretization methods—the process of dividing a large simulation into smaller components in preparation for computer analysis—where practical efficiency is measured by the accuracy achieved per unit computational time.
Principal Investigators: Tzanio Kolev, Lawrence Livermore National Laboratory
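As a minimal illustration of finite element discretization (a generic 1-D sketch, not the high-order methods or the MFEM/CEED software this center develops), the snippet below solves -u'' = f on (0, 1) with homogeneous boundary conditions using linear elements: the domain is divided into small elements, a tridiagonal stiffness matrix is assembled, and a linear system is solved.

```python
import numpy as np

def fem_poisson_1d(n, f=1.0):
    # Solve -u'' = f on (0, 1), u(0) = u(1) = 0, with n linear
    # finite elements on a uniform mesh.
    h = 1.0 / n
    m = n - 1                          # number of interior nodes
    A = np.zeros((m, m))               # stiffness matrix
    for i in range(m):
        A[i, i] = 2.0 / h
        if i > 0:
            A[i, i - 1] = A[i - 1, i] = -1.0 / h
    b = np.full(m, f * h)              # load vector for constant f
    u = np.zeros(n + 1)                # boundary values stay zero
    u[1:-1] = np.linalg.solve(A, b)
    return np.linspace(0.0, 1.0, n + 1), u

x, u = fem_poisson_1d(8)
# For constant f, linear elements reproduce the exact nodal
# values u(x) = f * x * (1 - x) / 2 on this 1-D problem.
```

The "accuracy per unit computational time" trade-off mentioned above shows up even here: refining the mesh (larger n) improves accuracy between the nodes at the cost of a larger linear solve.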
By 2024, computers are expected to compute at 10^18 operations per second but write to disk only at 10^12 bytes/sec: a compute-to-output ratio 200 times worse than on the first petascale systems. In this new world, applications must increasingly perform online data analysis and reduction—tasks that introduce algorithmic, implementation, and programming model challenges that are unfamiliar to many scientists and that have major implications for the design of various elements of exascale systems.
Principal Investigators: Ian Foster, Argonne National Laboratory
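One simple form of online data reduction is to accumulate summary statistics as values are produced instead of writing every raw value to disk. The sketch below uses Welford's one-pass algorithm for the running mean and variance; the class name and interface are illustrative, not part of any ECP software product.

```python
class RunningStats:
    """One-pass (streaming) mean and variance via Welford's
    algorithm: the simulation updates summaries in situ rather
    than storing the full data stream."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # Sample variance; defined once at least two values arrive.
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(x)
# stats.mean == 5.0; stats.variance == 32/7
```

Each update costs a handful of arithmetic operations and constant memory, so the compute-to-output gap is closed by shipping a few numbers to disk instead of the entire stream.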
Particle-based simulation approaches are ubiquitous in computational science and engineering. The “particles” may represent, for example, the atomic nuclei in quantum and classical molecular dynamics methods, or the gravitationally interacting bodies and tracer particles of N-body simulations. In each case, every particle interacts with its environment through direct particle-particle interactions at short range and/or through particle-mesh interactions between a particle and a local field that is set up by the long-range contributions of all the particles.
Principal Investigators: Susan Mniszewski, Los Alamos National Laboratory
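The direct particle-particle part of such methods can be sketched as an O(N^2) pairwise sum. The example below computes softened gravitational accelerations with G = 1 in code units; the function name and the Plummer softening parameter are illustrative assumptions, and production codes replace this brute-force loop with cutoffs, neighbor lists, or tree/mesh methods for the long-range part.

```python
import numpy as np

def direct_forces(pos, mass, eps=1e-3):
    # O(N^2) direct particle-particle gravitational accelerations
    # with Plummer softening eps to avoid the r -> 0 singularity.
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                     # displacements to all particles
        r2 = np.sum(d * d, axis=1) + eps**2  # softened squared distances
        r2[i] = np.inf                       # exclude self-interaction
        acc[i] = np.sum(mass[:, None] * d / r2[:, None] ** 1.5, axis=0)
    return acc

# Two unit-mass bodies a unit distance apart attract each other
# with equal and opposite accelerations.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
mass = np.ones(2)
acc = direct_forces(pos, mass)
```

The equal-and-opposite accelerations reflect momentum conservation, a useful sanity check for any pairwise interaction kernel.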
Fundamental changes in the way electric power is generated, transmitted, and consumed today have resulted in an unprecedented need for computation to solve problems related to the design, planning, and operation of the power grid. This endeavor and the need to computationally design and model chemicals, materials, and biosystems at a molecular level are fundamental to accomplishing DOE’s energy, environment, and national security missions. Recent advances in systems-based approaches coupled with
Principal Investigators: Mahantesh Halappanavar, Pacific Northwest National Laboratory
The ExaLearn Co-design Center leverages the revolution in what is variously termed machine learning, statistical learning, computational learning, and artificial intelligence. New machine learning technologies can have profound implications for computational and experimental science and engineering and thus for the exascale computing systems that DOE is developing to support those disciplines.
Principal Investigators: Francis Alexander, Brookhaven National Laboratory
Proxy applications (proxy apps) are small, simplified codes that allow application developers to share important features of larger production applications without forcing collaborators to assimilate large and complex code bases. Proxy apps are frequently used to represent performance-critical kernels, programming models, communication patterns, or essential numerical algorithms.
Principal Investigators: David Richards, Lawrence Livermore National Laboratory
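As an illustration of the kind of kernel a proxy app might isolate (a generic sketch, not one of the ECP proxy apps themselves), the snippet below extracts a Jacobi relaxation sweep, a performance-critical stencil pattern that stands in for a much larger production PDE code.

```python
import numpy as np

def jacobi_step(u):
    # One Jacobi relaxation sweep on a 2-D grid: each interior
    # point becomes the average of its four neighbors. Boundary
    # rows and columns are left untouched.
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

# Fixed boundary values; the interior relaxes toward equilibrium.
u = np.zeros((32, 32))
u[0, :] = 1.0                  # hot top edge (boundary condition)
for _ in range(100):
    u = jacobi_step(u)
```

A few dozen lines like these expose the memory-access pattern and arithmetic intensity of the full application, which is exactly what hardware and software collaborators need for co-design studies.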