Oak Ridge National Laboratory
Accelerating Delivery of a Capable Exascale Ecosystem
The Second Wave
You may know that the ECP has been cited numerous times by the US Department of Energy (DOE), by Secretary Perry, in fact, as one of DOE's highest priorities. This is not only incredibly exciting but also a tremendous responsibility for us. There are high expectations for the ECP, expectations that I believe we can not just meet but far exceed. All of us involved in this project are believers in the value and imperative of computer and computational science and engineering, and more recently of data science, especially within an exascale ecosystem. Meeting and exceeding our goals represents a tremendous return on investment for US taxpayers and potentially for the nation's science and technology base for decades to come. This is a career opportunity for everyone involved.
I would be remiss if I were not to thank, on behalf of all of us, Paul Messina, our inaugural ECP director. His experience and expertise have been invaluable in moving ECP through an admittedly difficult startup; the ECP is, after all, an extremely complicated endeavor. His steady hand, mentoring, and leadership, from which I benefited firsthand as the Application Development lead, have been vital to the project's early successes. We will miss Paul but will not let him "hide": we'll maintain a steady line of communication with him for advice, as a sounding board, and more. Thanks again, Paul!
As we focus our research teams on years 2 and 3 of the ECP, we must collectively and quickly move into a "steady state" mode of execution, i.e., delivering impactful milestones on a regular cadence, settling into a pattern of right-sized project management processes, and moving past exploration of technology integration opportunities and into commitments for new integrated products and deliverables. We are not there yet but will be soon. Part of this challenge has involved working with our DOE sponsors to strike the right balance: "projectizing" R&D so that tangible products and solutions are delivered on a resource-loaded schedule, while still accommodating the exploratory, high-risk/high-reward nature of the R&D activities so important for innovation.
Changes in the ECP Leadership
We are currently implementing several changes in the ECP, something that is typical of most large projects transitioning from “startup” to “steady state.” First, some ECP positions need to be filled. ECP is fortunate to have access to some of the best scientists in the world for leadership roles, but these positions take time away from personal research interests and projects, so some ECP leaders periodically may rotate back into full-time research. Fortunately, the six DOE labs responsible for leading the ECP provide plenty of “bench strength” of potential new leaders. Next, our third focus area, Hardware Technology, is being expanded in scope and renamed Hardware and Integration. It now includes an additional focus on engagement with DOE and National Nuclear Security Administration computing facilities and integrated product delivery. More information on both topics will follow.
Looking toward the horizon, we must refine our resource-loaded schedule to ensure delivery on short-term goals, prepare for our next project review by DOE (an Independent Project Review, or IPR) in January 2018, and work more closely with US PathForward vendors and DOE HPC facilities to better understand architecture requirements and greatly improve overall software and application readiness. ECP leadership is focused on preparing for the IPR, which we must pass with flying colors. Therefore, we must collectively execute on all research milestones with a sense of urgency—in about a year, we will all know the details of the first two exascale systems!
We’ve recently spent some time refining our strategic goals to ensure a clear message for our advocates, stakeholders, and project team members. ECP’s principal goals are threefold, and they align directly with our focus areas, as follows:
- Applications are the foundational element of the ECP and the vehicle for delivery of results from the exascale systems enabled by the ECP. Each application addresses an exascale challenge problem—a problem of strategic importance and national interest that is intractable without at least 50 times the computational power of today’s systems.
- Software Technologies are the underlying technologies on which applications are built and are essential for application performance, portability, integrity, and resilience. Software technologies span low-level system software to high-level application development environments, including infrastructure for large-scale data science and an expanded and vertically integrated software stack with advanced mathematical libraries and frameworks, extreme-scale programming environments, tools, and visualization libraries.
- Hardware and Integration encompasses key ECP-enabled partnerships between US vendors and ECP (and community-wide) application and software developers to develop a new generation of commodity computing components. These partnerships must ensure at least two diverse and viable exascale computing technology pathways that meet the nation's identified mission needs.
- The expected ECP outcome is the accelerated delivery of a capable exascale computing ecosystem to provide breakthrough solutions addressing our most critical challenges in scientific discovery, energy assurance, economic competitiveness, and national security. Capable implies a wide range of applications able to effectively use the systems developed through the ECP, thereby ensuring that both science and security needs will be addressed because the systems are affordable, usable, and useful. Exascale, of course, refers to the ability to perform more than 10^18 operations per second, and ecosystem implies not just more powerful systems but all of the methods and tools needed for effective use of the ECP-enabled exascale systems to be acquired by DOE labs.
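To put the "exa" prefix in perspective, a back-of-envelope comparison against a petascale machine can be sketched in a few lines of Python. The figures below are illustrative round numbers, not the specifications of any particular system:

```python
# Illustrative scale comparison: exascale vs. petascale (round numbers only).
EXASCALE_OPS = 1e18   # operations per second: the exascale threshold
PETASCALE_OPS = 1e15  # operations per second: a 1-petaflop system

# An exascale system is 1000x faster than a 1-petaflop system.
speedup = EXASCALE_OPS / PETASCALE_OPS

# Total work a 1-petaflop machine performs in a full year...
SECONDS_PER_YEAR = 365 * 24 * 3600
ops_per_year = PETASCALE_OPS * SECONDS_PER_YEAR

# ...finishes on an exascale machine in under nine hours.
hours_at_exascale = ops_per_year / EXASCALE_OPS / 3600

print(speedup)            # 1000x
print(hours_at_exascale)  # roughly 8.76 hours
```

In other words, a year of petascale computing collapses into an afternoon at exascale, which is why a 1000-fold jump in capability changes what problems are even worth posing.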
To close, I’m very excited and honored to be working with the most talented computer and computational scientists in the world as we collectively pursue an incredibly important and compelling national mission. I think taking the journey will be just as fun as arriving at our destination, and to get there we will need everyone’s support, talent, and hard work. Please contact me personally if you ever have any questions, comments, or concerns.
In the meantime, as former University of Tennessee Lady Vols basketball coach Pat Summitt said, “Keep on keeping on.”
Featured Lab Partner: Los Alamos
The synergy of national security science and multidisciplinary research experience at Los Alamos National Laboratory is a pillar of strength among the DOE lab ecosystem and a significant contributor to the Exascale Computing Project.
U.S. Industries Look to Exascale
“This is much bigger than any one company or any one industry. If you consider any industry, exascale is truly going to have a sizeable impact, and if a country like ours is going to be a leader in industrial design, engineering and manufacturing, we need exascale to keep the innovation edge.”
PathForward Vendor Hewlett Packard Enterprise Presents Eighteen Zeros Movie
Watch a video by Exascale Computing Project PathForward vendor Hewlett Packard Enterprise on how exascale computing will touch every aspect of our lives.
Leaning into the Supercomputing Learning Curve
The Argonne Training Program on Extreme-Scale Computing, part of the Exascale Computing Project, aims to touch on all of the key skills and approaches a researcher needs to take advantage of the world’s most powerful computing systems.
Berkeley Lab Researchers Lead Development of Workflow to Predict Ground Movement
With emerging exascale supercomputers, researchers will soon be able to accurately simulate the ground motions of regional earthquakes quickly and in unprecedented detail, as well as predict how these movements will impact energy infrastructure.
Exascale Computing to Accelerate Clean Fusion Energy
Using supercomputers to model and simulate plasma behavior, scientists have made great strides toward building a working fusion reactor. Exascale systems, the next generation of supercomputers on the horizon, will bring the promise of fusion energy closer. An article by Jon Bashor of Lawrence Berkeley National Laboratory, published in the insideHPC blog, has the story.
ECP Co-Design Center Achieves Orders of Magnitude Speed-Up in Latest Software Release
Just one year after the US Department of Energy began funding projects to prepare scientific applications for exascale supercomputers, the Block-Structured Adaptive Mesh Refinement Co-Design Center has released a new version of its software that solves a benchmark problem hundreds of times faster than the original baseline.
IDEAS Goes Big: Fostering Better Software Development for Exascale
Scalability of scientific applications is a major focus of the Department of Energy's Exascale Computing Project (ECP), and in that vein, a project known as IDEAS-ECP (Interoperable Design of Extreme-scale Application Software) is also being scaled up to deliver insight on software development to the research community.
Cartography of the Cosmos
A recent feature article tells the story of the “Computing the Sky at Extreme Scales” project, or “ExaSky,” one of the first efforts funded by the Exascale Computing Project.
Department of Energy’s Exascale Computing Project Names Doug Kothe as Director
Doug Kothe becomes new director of the Exascale Computing Project effective October 1. He replaces Paul Messina, who is stepping down after two years to return to Argonne National Laboratory.
Slate Writes that Exascale Will Foster Competitiveness, Sustainability, and Security
Achieving exascale computing will result in more realistic models for predicting weather and earthquakes and for making many other essential scientific and technological advances, including better industrial designs, cost reduction of sustainable energy systems, and enhanced abilities to analyze surveillance data.
Berkeley Lab’s Kathy Yelick on the Promise and Progress of Exascale Science
On Friday, September 8, Kathy Yelick of Lawrence Berkeley National Laboratory and the University of California, Berkeley, delivered the keynote address on “Breakthrough Science at the Exascale” at the ACM Europe Conference in Barcelona. In conjunction with her presentation, Yelick agreed to a short Q&A discussion with HPCwire.
Wired Magazine Takes a Look at ORNL’s Next Supercomputer
When Summit at Oak Ridge National Laboratory comes online next year, it will be the most powerful supercomputer in the country and perhaps in the world.
Argonne Steps Up to the Exascale Computing Project
A special guest feature in the insideHPC blog describes how Argonne National Laboratory, working with its lab colleagues and partners, is using its experience and perspective to help frame and support the Exascale Computing Project.
NREL Report: Wind Could Supply 20 Percent of US Energy by 2030
HPCwire summarizes a recent report from the US Department of Energy’s National Renewable Energy Laboratory that explains how supercomputing-led scientific advances could make wind a greater supplier of the country’s energy needs.
Oak Ridge Researchers Turn to Deep Learning to Solve Sciences’ Big Data Problems
A team of researchers from Oak Ridge National Laboratory has been awarded nearly $2 million over three years from the Department of Energy to explore the potential of machine learning in revolutionizing scientific data analysis.
Webinar on Managing Defects in HPC Software Development
As part of the IDEAS Productivity project, Tom Evans of Oak Ridge National Laboratory will present a webinar on Managing Defects in HPC Software Development on November 1 at 1pm Eastern Time. Evans is principal investigator of the Exascale Computing Project Application Development focus area effort called Coupled Monte Carlo Neutronics and Fluid Flow Simulation of Small Modular Reactors, also known as the ExaSMR project.
Webinar on the TAU Performance System
On November 8 at 1pm Eastern Time, the Exascale Computing Project will sponsor a webinar presented by Sameer Shende of the University of Oregon on the TAU Performance System. TAU is a powerful profiling and tracing toolkit that covers multiple aspects of performance instrumentation, measurement, and analysis.
Birds-of-a-Feather Session at SC17 to Address Exascale Challenges and Opportunities
Jim Ang, former director of the Exascale Computing Project (ECP) Hardware Technology focus area, and Doug Kothe, ECP director, will lead a Birds-of-a-Feather (BOF) session on November 14 at SC17, the International Conference for High Performance Computing, Networking, Storage, and Analysis, in Denver. The session, titled Exascale Challenges and Opportunities, will run from 12:15pm to 1:15pm Mountain Time in rooms 301, 302, and 303. BOF sessions offer attendees opportunities to connect and interact with others who share a mutual interest. SC17 is the 30th consecutive SC conference; the event has grown steadily since the first one, then known as the "Supercomputing" conference, was held in Orlando, FL, in 1988. SC17 is expected to draw more than 11,000 attendees.
Paul Messina to Present Invited Talk at SC17
Former Exascale Computing Project Director Paul Messina will present an invited talk on “The U.S. D.O.E. Exascale Computing Project—Goals and Challenges” at SC17 in Denver.
MPI Forum: Standardization Meeting
The next meeting of the MPI Forum, the open group that defines and maintains the Message Passing Interface standard, is December 4–7 in San Jose, CA.