Application Development Q&A


Exascale Computing Project Q&A: Building science applications for a new era of supercomputers

Applications are the tools that direct a computer to work on specific tasks. Scientific applications for high-performance and data-analytic computing—such as those that model the dynamics of atoms or electrons—impact nearly every corner of research and development, from the physics of star explosions to squeezing the last percent of efficiency out of a jet engine. Over the next decade, as supercomputing moves from petascale (10¹⁵ calculations per second) to exascale (10¹⁸ calculations per second, a thousandfold increase), advances in technology will change what applications are capable of and how they interact with new software and hardware technologies.

The Exascale Computing Project (ECP), a 10-year collaboration between computing and science experts at US Department of Energy (DOE) science and national security laboratories, is kick-starting the development and deployment of exascale applications. In this Q&A, Doug Kothe, ECP Applications Development Focus Area director, based at Oak Ridge National Laboratory, and Bert Still, the focus area's deputy director, based at Lawrence Livermore National Laboratory, discuss the challenges ahead.

How will applications for exascale computing help solve real-world problems?

Kothe: As part of ECP, we’ve identified many critical areas in science, energy, national security, and economic competitiveness where exascale applications will be game changers. With the computational power at exascale, we will have greater predictive capability and, hence, higher confidence in our results. Researchers will be able to more realistically simulate processes related to important energy applications—for example, integrating renewable energy into the power grid; making batteries safer and longer lasting; designing experimental fusion reactors and advanced nuclear reactors; building adaptive wind energy plants; discovering, designing, and manufacturing new materials; designing catalysts for producing biofuels as alternatives to petroleum products; and ensuring the performance and security of our nuclear stockpile; and the list goes on.

Why is it important to start applications development before an exascale system has even been built?

Still: One of ECP’s goals is to leverage exascale computing to achieve the science missions of DOE, so the first step is to consider which applications we need to solve certain high-priority research problems. We then use those applications to help determine the characteristics of an exascale system.

Kothe: Complex multi-physics applications, like those we use to simulate materials behavior, climate change, or energy production (including energy from nuclear fusion and fission, wind, solar, biofuels, fossil fuels, and so on), take many years to develop and sometimes even longer to validate. Even in cases where we start from a mature simulation capability that is already in use today, exascale applications will still have to exploit imminent, yet exciting, changes in computer hardware and software technology.

How will requirements for applications inform development of hardware and software technologies?

Still: Once we identify the features we need in the applications, we will look at the software requirements, such as the compilers and libraries. Ultimately, we want a software stack with high fault (or failure) tolerance that does the underlying nuts-and-bolts work to run the application, while the application itself focuses on problem solving. Meeting these requirements also depends on how the software interacts with the hardware technology and how much “crunch,” or computational performance, the hardware can deliver. So we’re going back and forth—we call it “codesigning”—between the software and hardware to achieve the best computing architectures for the greatest number of applications. Interestingly, computer performance is becoming increasingly dependent on how efficiently data are used, and that’s the area where scientific computing and data analytics overlap.
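One concrete way to picture the fault tolerance Still mentions is application-level checkpoint/restart, in which a long-running job periodically saves its state so that a failure costs at most one checkpoint interval of work. The Python sketch below is a minimal illustration of that general pattern only; the file name, state layout, and interval are hypothetical, and the interview does not specify how the ECP software stack actually handles faults.

```python
# Minimal sketch of application-level checkpoint/restart (illustrative only;
# not ECP's actual software stack).
import os
import pickle

CHECKPOINT_FILE = "state.ckpt"  # hypothetical checkpoint location


def load_checkpoint():
    """Resume from the last saved state if a checkpoint exists."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "value": 0.0}  # fresh start


def save_checkpoint(state):
    """Write state to a temp file, then rename, so a crash mid-write
    cannot corrupt the previous checkpoint."""
    tmp = CHECKPOINT_FILE + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CHECKPOINT_FILE)  # atomic rename


def run(total_steps=1000, ckpt_interval=100):
    state = load_checkpoint()
    while state["step"] < total_steps:
        state["value"] += 1.0   # stand-in for one unit of simulation work
        state["step"] += 1
        if state["step"] % ckpt_interval == 0:
            save_checkpoint(state)  # bounds loss to ckpt_interval steps
    return state


if __name__ == "__main__":
    print(run())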

ECP will enable the design of two kinds of exascale computing architectures. Why are diverse architectures important for applications development?

Kothe: Each application has a unique footprint on the hardware: a set of pressure points, or ways in which the application stresses the machine. These stresses differ from application to application. If we were to create an application with only one hardware target, some of the other applications we want to develop would have difficulty executing efficiently on that one type of hardware. That’s why diverse architectures are important for application longevity and broad-scale adoption.

Are there scientific problems that will drive the development of new applications for exascale?

Kothe: Data science—which includes statistical analysis, image processing, machine learning, data mining, and much more—is definitely one area. We’re going to see data science applications in ECP that connect experimental and computational facilities in real time or analyze large amounts of simulation data in real time. One application already in development, in collaboration with the National Institutes of Health, supports the Vice President’s Cancer Moonshot initiative to advance precision medicine for cancer therapy. For this application, we are using supervised machine learning methods to capture the complex relationships between the properties of drugs and the properties of tumors to predict responses to treatment. Ultimately, we hope to develop a model that provides treatment recommendations based on specific information about the patient and tumor characteristics. We are really just beginning to scratch the surface of how exascale computing can impact large data problems—and society as a whole.
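To make the supervised-learning idea concrete, the sketch below trains a regression model to map combined drug and tumor features to a measured treatment response. Everything here is illustrative: the data are synthetic, the feature counts are arbitrary, and a random forest merely stands in for whatever models the ECP/NIH collaboration actually uses, which the interview does not detail.

```python
# Toy illustration of supervised learning for drug-response prediction:
# learn a mapping from (drug features, tumor features) to response.
# Data are synthetic; feature names and model choice are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

n_samples, n_drug_feats, n_tumor_feats = 500, 8, 12
drug = rng.normal(size=(n_samples, n_drug_feats))    # e.g., molecular descriptors
tumor = rng.normal(size=(n_samples, n_tumor_feats))  # e.g., gene-expression summaries
X = np.hstack([drug, tumor])

# Synthetic "response": an interaction between drug and tumor properties, plus noise.
response = (drug[:, 0] * tumor[:, 0]
            + 0.5 * tumor[:, 1]
            + 0.1 * rng.normal(size=n_samples))

X_train, X_test, y_train, y_test = train_test_split(X, response, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Held-out accuracy is the basic check before any model could inform treatment.
print("held-out R^2:", r2_score(y_test, model.predict(X_test)))
```

The key structural idea matches the description in the answer above: drug and tumor properties are concatenated into one feature vector, and the model learns the relationship between that combined description and the observed response.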

