Entering a new era in scientific discovery
Exascale computing will transform the ability to answer some of the world’s toughest and most important questions

Francis J. Alexander
Project Director, ExaLearn, Brookhaven National Laboratory
With the arrival of exascale computing, we will be able to make significant progress across a wide swath of important scientific and engineering problems. These challenges include the development of more accurate climate models for improved long-term prediction; higher-resolution tokamak simulations to enable practical magnetic fusion energy; and powerful machine learning methods to predict the response of complex biological systems, such as cancer cells, to specific drugs. As we envision the exascale future, the outlook for scientific progress and discovery is, indeed, exciting.

Amitava Bhattacharjee
Princeton Plasma Physics Laboratory
Fusion energy could one day be a transformative energy source, because it will be clean, cheap, and nearly unlimited, with sea water supplying its basic fuel. A whole-device computer model can offer insights about the plasma processes that go on in the fusion device and predictions regarding the performance and optimization of next-step experimental facilities. Using Frontier, we will be able to add new capabilities to the whole-device model, including the effects of the plasma boundary, the effects of fusion products, the influence of sources of heating, and the superimposed engineering structure that would make a fusion reactor operate as a unit.

David Bross
Argonne Computational Chemist and Aurora Early Science Program Principal Investigator
Exascale will allow us to solve yesterday’s insurmountable scientific problems. Until now, we have been forced to navigate tradeoffs between the fidelity of a simulation, the time necessary to derive it, and the number of simulations we can perform with a finite number of cycles. With a greatly increased computational resource like Aurora, we can perform vastly more high-fidelity simulations. Therefore, we can get much better quantitative descriptions to fuel our work.

Jacqueline Chen
Sandia National Laboratories
Combustion systems are projected to dominate the energy marketplace for decades to come. One engine concept—a low-temperature, reactivity-controlled, compression ignition engine—has the potential to deliver groundbreaking efficiencies of up to 60 percent while reducing emissions. On Frontier, we anticipate using high-fidelity simulations with machine learning and AI to model the underlying processes of this promising engine.

Steve Conway
Senior advisor for HPC market dynamics, Hyperion Research
U.S. Government investments in exascale technology will benefit not just exascale supercomputers themselves but high performance computers of all sizes, even one or two racks, as well as server systems used to support business operations in enterprise data centers. These investments will advance the overall state of the art in information technology.

Trish Damkroger
Vice President and General Manager, High Performance Computing, Intel
Supercomputing, or high-performance computing (HPC), has enabled scientists and engineers to push the edge of what is possible for science and innovation for many years. What excites me most about exascale is the convergence of advances in modeling and simulation, data analytics, machine learning, and artificial intelligence, coupled with exponential increases in hardware (such as memory, storage, and CPU and GPU compute power) and software technology. What fuels my passion is knowing that exascale computing capabilities will help scientists and engineers solve problems that previously were out of reach or impossible. Exascale computing's impacts will be far-reaching across national security, scientific discovery, economic security, energy security, and healthcare.

Tom Evans
Oak Ridge National Laboratory
We are approaching a revolution in how we can design and analyze materials. Right now, we can look at and carefully characterize the electronic structure of fairly simple atoms and very simple molecules. But with exascale computing on Frontier, we're trying to stretch that to molecules that consist of thousands of atoms. The more we understand about the electronic structure, the more we're able to actually manufacture and use exotic materials, for things like very small, high-tensile-strength materials and for buildings, to make them more energy efficient. At the end of the day, everything in some sense comes down to materials.

Salman Habib
Argonne National Laboratory
Exascale will enable cosmology simulations large enough to model the distribution of billions of galaxies but also fine-grained enough to compare to a range of ground- and satellite-based observations, such as cosmic microwave background measurements and radio, optical, and X-ray data sets. At the same time, Frontier’s AI-oriented technology will enable us to analyze data from simulations in ways we simply can’t today.

Mike Heroux
Sandia National Laboratories
ECP Software Technology is excited to be a part of preparing the software stack for Frontier. We are already on our way, using Summit and Sierra as launching pads. Working with OLCF, Cray, and AMD, we look forward to providing the programming environments and tools, along with the math, data, and visualization libraries, that will unlock the potential of Frontier for producing the countless scientific achievements we expect from such a powerful system. We are privileged to be part of the effort.

Vivek Kale
Computer Scientist, Brookhaven National Laboratory
Exascale supercomputers present many challenges, or opportunities, for maximizing the amount of simulated science per hour that can be done on a supercomputer. Compared with current pre-exascale supercomputers like Summit, the processing power per node on exascale supercomputers will increase as much as, if not more than, the number of nodes. Node-level computational performance therefore needs to be prioritized and optimized alongside the performance of inter-node communication. ECP software technologies like MPICH, LLVM’s OpenMP, and intelligent runtime systems for interoperability, communication optimization, and load balancing will be key to providing performance for scientific applications run on exascale supercomputers. Furthermore, ECP software technology developers’ engagements with application programmers and interactions with software vendors will ensure the readiness of ECP software technologies for exascale supercomputers.

Doug Kothe
ECP Director, Oak Ridge National Laboratory
Exascale injects an urgently needed step change in computing technologies, tools, and applications that will have a profound impact on accelerating scientific discovery and improving our quality of life. Exascale will enable a new level of understanding, opening many new paths of discovery and breakthroughs. This new era of exascale computing would not be upon us without a truly collaborative community effort and the development of an incredible workforce. ECP is honored to have played a role in developing our biggest asset—our people.

Andreas Kronfeld
Fermilab
Exascale computing will be essential to precisely illuminating phenomena that emerge from neutrino physics experiments and maintaining the superb cross talk that has existed between the quantitative and the qualitative sides of discoveries in particle and nuclear physics. We anticipate that Frontier will provide the compute power and, just as important, the architecture for computation we must have to do our complicated, difficult calculations.

Katie Lewis
Advanced Machine Learning Project Leader, Lawrence Livermore National Laboratory
Exascale represents a new paradigm in computing, with increased heterogeneity of systems as one of its hallmarks. This fundamentally changes how the scientific community thinks about solving some of our most challenging problems in computational science. In my work, this provides an exciting opportunity to use AI-enabled hardware and AI-inspired algorithms to advance analysis capabilities, enhance user efficiency, and better couple complex physics.

Sherry Li
Senior scientist, Lawrence Berkeley National Laboratory
As numerical linear algebra algorithm and software developers, exascale inspires us to redesign our algorithms around new capabilities far beyond what was available five years ago. These capabilities, accessible from our math libraries, enable scientists and engineers to vastly improve their understanding of all kinds of physical phenomena, with unprecedented high-fidelity simulations and thorough analysis of tremendous amounts of data.

Alex Norton
Principal Technology Analyst and Data Analysis Manager, Hyperion Research
Thinking about the future of “exascale computing,” it’s more relevant to think of exascale applications rather than simply the size and performance of the system. In this regard, public clouds are moving toward, if not already at, exascale-level performance on specific applications. This comes not just from the sheer amount of compute available but also from the composability of resources, which allows future exascale-level computing to run on efficient and well-matched hardware based on the various components of HPC applications. Further, as more applications become containerized and run on microservices, composability and flexibility within clouds can lend themselves to highly efficient and performant HPC applications at an exascale level.

Mark Nossokoff
Senior Research Analyst | Lead Storage Analyst, Hyperion Research
Even more important than capacity and performance, however, is data management. The ability to search, find, retrieve, process, and store data wherever and whenever it’s required in real time will determine the best-of-breed architectures. POSIX-compliant I/O has become the primary bottleneck in data-intensive workloads. POSIX will yield to innovative data-aware applications and file systems to achieve the step-function improvements that will be required post-exascale.

Amedeo Perazzo
SLAC National Accelerator Laboratory
Free-electron X-ray laser facilities, such as the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory, produce ultrafast pulses from which scientists take stop-action pictures of moving atoms and molecules for research in physics, chemistry, and biology. For example, LCLS will be able to reconstruct biological structures in unprecedented atomic detail under physiological conditions. We foresee that access to Frontier will enable the LCLS users to achieve not only higher resolution and significantly deeper scientific insight than are possible today but also a dramatically increased image reconstruction rate for the delivery of information in minutes rather than weeks.

Danny Perez
Los Alamos National Laboratory
Studying materials at exascale could have a significant impact on our world, because materials show up everywhere in the economy. Using a combination of advanced methods and scalable codes on Frontier, we’ll be able to perform simulations with potentially millionfold increases in our time scales. We’ll also be able to do one-to-one comparisons with experiments and make better predictions about the evolution of these systems.

Sanjiv Shah
Vice President of Developer Software, Intel
Programming models were highlighted as one of the six fundamental challenges for Exascale computing. oneAPI is an open, standards-based, cross-platform programming model which bridges that gap. oneAPI will help the Exascale community build applications for Aurora to address some of our greatest challenges, from understanding climate change to accelerating cures for disease.

Andrew Siegel
Argonne National Laboratory
At the inception of ECP, we asked researchers to imagine new frontiers in science and engineering enabled by exascale computing. With Frontier, we now have the opportunity to fully realize our original vision, solving grand-challenge problems that lead to breakthroughs in energy generation, materials design, earth and space sciences, and related fields of physics and engineering.

Rick Stevens
Associate Laboratory Director for Computing, Environment and Life Sciences, Argonne National Laboratory
What excites me most about exascale systems like Aurora is the fact that we now have, in one platform and one environment, the ability to mix simulation and artificial intelligence. This idea of mixing simulation and data-intensive science will give us an unprecedented capability and open doors to research that was inaccessible before, in fields like cancer research, materials science, climate science, and cosmology.

William Tang
Principal Research Physicist at the Princeton Plasma Physics Laboratory and Aurora Early Science Program Principal Investigator
Exascale computing’s ability to handle much larger volumes of data unlocks our ability to prove what was once unprovable. Plus, the incredible speed of supercomputers shortens our time-to-discovery by a huge margin. Work that used to take months or years now takes hours or days. Therefore, we finally have the means to validate theories statistically and prove their reality.

Gina Tourassi
Oak Ridge National Laboratory
My team and I have been training computers to read complex medical documents, understand the clinical context, and extract information about cancer patients. Running the application on Summit enabled us to train these networks in a much shorter time than before: fifty times faster on Summit than on Titan. Frontier will enable us to analyze these heterogeneous data sets in a fully integrated way. We are talking about massive volumes of data, and being able to do the analysis in this integrated manner is not currently possible with the computing resources we have available.

John Turner
Oak Ridge National Laboratory
The thing that’s really attractive about Frontier is the powerful nodes. Having fewer powerful nodes with a very tightly integrated set of CPUs and GPUs at the node-level gives us the ability to distribute hundreds or thousands of microstructure and property calculations on one or a few nodes across the machine. With Frontier, we’re going to be able to predict the microstructure and properties of an additively manufactured part at much higher fidelity and in many more locations within a part than we are able to even with the world’s current fastest supercomputers.

Jean-Luc Vay
Accelerator Modeling Program Head, Lawrence Berkeley National Laboratory
Particle accelerators have revolutionized science and have had a profound impact on many aspects of our society, from medical treatments to material processing to the manufacturing of semiconductors, to cite just a few. Exascale computing will enable us to dramatically speed up the development of a kind of particle accelerator based on the use of ionized gases (plasmas). This development promises to shrink particle accelerators by orders of magnitude, leading to applications that are only a dream with existing technologies.

Dr. Robert W. Wisniewski
Chief Architect HPC, Aurora Technical Lead and PI, Intel
Supercomputers of the Exascale Era are going to allow new discoveries in science and engineering and accelerate advances that will benefit and propel humanity forward. Achieving exascale and beyond will require leveraging heterogeneity. We at Intel are delighted to be partners with Argonne and the broader DOE as we deliver Aurora and drive the technology to meet their mission needs.