Sunita Chandrasekaran Reflects on Teaching Supercomputing and Leading the ECP SOLLVE Project
As principal investigator of the ECP SOLLVE project, Sunita Chandrasekaran emphasizes the development of sustainable software.
The latest in the code-for-Aurora series explores an app aimed at high-fidelity whole device modeling of magnetically confined fusion plasmas.
ExaGraph aims to leave a lasting legacy of algorithms, implementations, and graph-enabled applications for scientific discovery.
The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory (Berkeley Lab) today formally unveiled the f…
This time, ECP's special podcast series on preparing code for the Aurora exascale supercomputer takes a look at the cosmological code HACC.
The Exascale Computing Project Software Deployment at Facilities project tests and verifies software functionality and efficiency.
ECP’s special podcast series on preparing code for the Aurora exascale system begins by focusing on an earthquake risk assessment application.
Simulating earthquake processes from end to end is key to being able to design bridges and buildings to be more resilient to earthquakes.
ECP's Data and Visualization portfolio is delivering data management software to store, save state, share, and facilitate the analysis of exascale data.
By exploiting the power and performance advantages of GPUs and exascale computing, researchers aim to better predict changes to Earth’s water cycles.
Spack (Supercomputer PACKage manager) is an R&D 100 Award winner because of its worldwide impact on high-performance computing.
The WarpX project is developing an exascale application for plasma accelerator research that will pave the way for new breeds of virtual experiments.
Machine learning, artificial intelligence, and data analytics are converging with high-performance computing to advance scientific discovery.
CODAR, an Exascale Computing Project co-design center, aims to produce an infrastructure for online data analysis and reduction.
Technologies known as the Data Transfer Kit (DTK) and ArborX enable researchers to focus more on their science rather than on low-level algorithms in simulations.
A collaborative team is working to get NWChem ready to run on exascale machines and to provide a starting point for future code development.
Gina Tourassi discusses the Oak Ridge National Laboratory effort within the Exascale Deep Learning–Enabled Precision Medicine for Cancer (CANDLE) project.
The ExaSky project conducts extreme-scale simulations to further our understanding of the makeup and evolution of the universe.
Ensuring speedup for exascale, managing dependencies and versions, and fostering great collaborative communication are key in deploying ECP software.
Researchers involved with an input/output software product describe how it will enable users of high-performance computing systems to roll with the rapid pace of change.