Online Data Analysis and Reduction at the Exascale

Project Details

By 2024, computers are expected to compute at 10¹⁸ operations per second but write to disk at only 10¹² bytes per second, a compute-to-output ratio 200 times worse than on the first petascale systems. In this new world, applications must increasingly perform online data analysis and reduction: tasks that introduce algorithmic, implementation, and programming model challenges unfamiliar to many scientists, with significant implications for the design of various exascale system elements.
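The arithmetic behind the gap described above can be made concrete. A brief sketch: the exascale rates are the figures stated in the text, while the petascale ratio is inferred here from the stated 200x factor rather than taken from the source.

```python
# Back-of-envelope arithmetic for the compute-to-output gap.
# The 10^18 op/s and 10^12 B/s figures come from the text; the
# petascale ratio is implied by the stated 200x factor.

exa_compute = 1e18   # projected compute rate (operations/second)
exa_write = 1e12     # projected disk write rate (bytes/second)

# Operations performed per byte that can be written out.
exa_ratio = exa_compute / exa_write
print(f"exascale ratio: {exa_ratio:.0e} ops/byte")

# A ratio 200 times worse than early petascale systems implies:
peta_ratio = exa_ratio / 200
print(f"implied petascale ratio: {peta_ratio:.0e} ops/byte")
```

In other words, an exascale application can afford to write out only one byte for every million operations it performs, which is why data must be analyzed and reduced online rather than written to disk for later processing.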

The goal of the Co-design Center for Online Data Analysis and Reduction at the Exascale (CODAR) is to produce infrastructure for online data analysis and reduction (Cheetah/Savanna); import, integrate, and develop essential libraries using this infrastructure (FTK, MGARD); incorporate these libraries into scientific applications and quantify their accuracy (Z-checker) and performance (Chimbuko); release software artifacts; construct application-oriented case studies; document success stories and the process used to obtain them; and report on co-design trade-off investigations.
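To illustrate the kind of task these libraries address, the sketch below performs error-bounded online reduction on a data stream: it decimates the data, reconstructs it by linear interpolation, and accepts the reduced form only if the maximum pointwise error stays within a user-set bound. This is a minimal illustration of the general idea, not the MGARD or Z-checker API; the function names, the decimation scheme, and the error metric are all assumptions made for this example.

```python
def decimate(data, stride):
    """Keep every stride-th sample, always including the last point."""
    idx = list(range(0, len(data), stride))
    if idx[-1] != len(data) - 1:
        idx.append(len(data) - 1)
    return [(i, data[i]) for i in idx]

def reconstruct(samples, n):
    """Rebuild n points by linear interpolation between kept samples."""
    out = [0.0] * n
    for (i0, v0), (i1, v1) in zip(samples, samples[1:]):
        for i in range(i0, i1 + 1):
            t = (i - i0) / (i1 - i0)
            out[i] = v0 + t * (v1 - v0)
    return out

def reduce_with_bound(data, bound, strides=(16, 8, 4, 2)):
    """Return the most aggressive reduction whose max pointwise
    reconstruction error is <= bound; fall back to the raw data."""
    for stride in strides:
        samples = decimate(data, stride)
        rebuilt = reconstruct(samples, len(data))
        err = max(abs(a - b) for a, b in zip(data, rebuilt))
        if err <= bound:
            return samples, err
    return list(enumerate(data)), 0.0

# Example: a smooth quadratic signal reduces well under a loose bound.
signal = [x * x / 100.0 for x in range(101)]
samples, err = reduce_with_bound(signal, bound=0.5)
print(len(samples), "of", len(signal), "points kept; max error", round(err, 3))
```

The key design point, mirrored in real reduction libraries, is that the accuracy check runs alongside the reduction itself, so the application learns immediately whether the reduced data still satisfies its error tolerance rather than discovering a problem after the full-resolution data has been discarded.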

Principal Investigator(s):

Ian Foster, Argonne National Laboratory


Collaborating Institutions:

Argonne National Laboratory, Brookhaven National Laboratory, Oak Ridge National Laboratory, Brown University, Rutgers University, State University of New York at Stony Brook, University of Oregon

Sponsors: National Nuclear Security Administration, Exascale Computing Project, U.S. Department of Energy Office of Science