From harnessing the power of the atom to sequencing the human genome, the Department of Energy (DOE) has a long history of developing cutting-edge science and technology. One of the fresh challenges DOE has taken on is the use of supercomputers to accelerate cancer research.
Computer simulation has become an essential and core component of earthquake design for major infrastructure, but researchers need to better understand and quantify future earthquakes. Exascale computing could allow for the realization of such advances. Learn more on Let's Talk Exascale.
Plenty of people around the world got new gadgets Friday, but one in Eastern Tennessee stands out. Summit, a new supercomputer unveiled at Oak Ridge National Laboratory, is, unofficially for now, the most powerful calculating machine on the planet.
Several important events at the end of May 2018 advanced the US Department of Energy's Exascale Computing Initiative. Among them was the passage of exascale budgets by both the full House and Senate Appropriations Committees.
ECP Industry Council Chair Dave Kepczynski, CIO, GE Global Research, General Electric, recently talked with the publication EnterpriseTech about the council's mission and the eagerness of industry to have exascale computing capability.
Sandia National Laboratories reports on the work of the Energy Exascale Earth System Model (E3SM) project. The system E3SM has developed over the last four years is expected to offer one of the highest resolutions ever achieved by supercomputers simulating aspects of the planet's climate.
The ADIOS project—led by Scott Klasky of Oak Ridge National Laboratory—is optimizing I/O on exascale architectures and making itself easily maintainable, sustainable, and extensible, while ensuring its performance and scalability.
ECP’s Training and Productivity program delivers robust developer training for ECP project members and the broader HPC community, including industry’s computing staff members, so they will be able to take full advantage of exascale hardware and software.
ECP’s Software Technology (ST) research focus area is working on several elements of its Software Development Kit initiative and preparing for the coordinated release of ST products for Q1FY19. Those are two of the topics ST Director Michael Heroux discusses in an audio update.
The status of the testing of ECP applications on the Summit system at Oak Ridge National Laboratory and the success and impact of the co-design centers are among the subjects of an audio update from Andrew Siegel, director of ECP’s Application Development research focus area.
Despite exponential advances in computing power, many physical processes are still problematic relative to modeling and simulation. Fusion energy is one of those processes. But coupled computer codes and the promising power of exascale computing could bring commercial fusion power more within reach.
The US Department of Energy's high-performance computing laboratories will acquire, install, and operate exascale-class systems. ECP's Hardware and Integration focus area helps the labs and ECP applications and software teams prepare for exascale through mutually beneficial collaborations.
Organizing the Exascale Computing Project's efforts in software development kits, or SDKs, makes the process of developing a capable exascale ecosystem more effective and more efficient. Listen to the Let's Talk Exascale podcast to learn more.
ECP's Combustion-Pele project involves predictive simulation of in-cylinder combustion processes to explore the potential for groundbreaking efficiencies while limiting the formation of pollutants. Combustion-Pele's principal investigator, Jackie Chen, is a guest on the Let's Talk Exascale podcast.
Source: Investors Business Daily, 4/30/2018. Commentary by Paul Dabbar and Thomas Zacharia: When scientists discovered the immense power contained in an atom, the United States government provided the laboratories,
Rich Brueckner of insideHPC talked with researcher David McCallen of Lawrence Berkeley National Laboratory and the Exascale Computing Project recently at the HPC User Forum in Tucson about how exascale computing will enhance earthquake simulation for improved structural safety.
Many of the phenomena in our daily lives are controlled by molecular processes. An example is the performance of automobile engines. Software called NWChem, or Northwest Chem, can tell researchers a lot about fuel combustion and many other molecular systems. Hear about NWChemEx for exascale on Let's Talk Exascale.
A new earth modeling system unveiled today will have weather-scale resolution and use advanced computers to simulate aspects of Earth’s variability and anticipate decadal changes that will critically impact the U.S. energy sector in coming years.
Lawrence Berkeley National Laboratory's David McCallen, a researcher from the Exascale Computing Project, on April 17 spoke at the 69th HPC User Forum in Tucson, Arizona, about exascale simulations for regional-scale earthquake hazard and risk.
Brookhaven Lab was the setting February 26–March 2 for a hackathon for research and computational scientists, code developers, and computing hardware experts to optimize scientific application codes for high-performance computing. ECP's SOLLVE Software Technology project collaborated to assist the users.
Andrew Siegel, Exascale Computing Project Application Development focus area director, on April 5 provided an overview of and insight into ECP in a keynote address at the Ohio Supercomputer Center Statewide Users Group spring conference in Columbus, Ohio.
Source: https://www.energy.gov/articles/secretary-energy-rick-perry-announces-18-billion-initiative-new-supercomputers. Systems Will Solidify U.S. Leadership in the “Exascale” Computing Era. WASHINGTON, D.C. – U.S. Secretary of Energy Rick Perry today announced a Request for Proposals (RFP), potentially worth up
Scott Baden of Lawrence Berkeley National Laboratory leads a project directed at providing lightweight communication and global address space support for exascale applications. Learn more about it on the Let's Talk Exascale podcast.
ECP's co-design process is essential to ensuring that future exascale applications adequately reflect the complex interactions and tradeoffs associated with the many new, and sometimes conflicting, design options.
The Power Steering project within the ECP provides a job-level power management system that can optimize performance under power and/or energy constraints. Tapasya Patki, the project's principal investigator, is a guest on the latest Let's Talk Exascale podcast.
Exascale Computing Project Director Doug Kothe is among HPCwire's 2018 People to Watch. The list aims to foster a dialogue about the high-performance computing industry and highlight some of its best and brightest minds.
The online newspaper the Los Alamos Daily Post features an article about ECP's effort to create a capable exascale ecosystem and profiles two of its researchers, Jim Ahrens and Danny Perez, of Los Alamos National Laboratory.
The chief science officer for Mars, Incorporated, said exascale computing will be a radical enabler for helping the food, nutrition, and agriculture sectors evolve and possibly revolutionize themselves to address grand challenges.
Hayward Fault Earthquake Simulations Increase Fidelity of Ground Motions. Released: 8-Feb-2018, 5:05 AM EST. Source Newsroom: Lawrence Livermore National Laboratory. Lawrence Livermore and Lawrence Berkeley national laboratory scientists have used
The Proxy Apps project is curating a collection of proxy apps that will represent the real applications of importance to the ECP. The project's principal investigator, David Richards of Lawrence Livermore National Laboratory, is in the latest ECP podcast episode.
The Data and Visualization project is responsible for the storage and visualization aspects of the ECP and for helping its researchers understand, store, and curate scientific data. The Let's Talk Exascale podcast features a conversation with the project's principal investigator, Jim Ahrens of Los Alamos National Laboratory.
Mark Gordon, Ames Laboratory Associate and Distinguished Professor at Iowa State University, spoke with ECP Communications at SC17 in Denver about the ECP project he leads, called General Atomic and Molecular Electronic Structure System (GAMESS).
Danny Perez of Los Alamos National Laboratory (LANL) spoke with ECP Communications at SC17 in Denver. Perez is a member of the Exascale Atomistic Capability for Accuracy, Length, and Time (EXAALT) project team, led by Principal Investigator Arthur Voter, also of LANL.
The CANcer Distributed Learning Environment (CANDLE), an effort of the Exascale Computing Project, was honored in the HPCwire Readers’ & Editors’ Choice Awards, presented at the 2017 International Conference for High Performance Computing, Networking, Storage, and Analysis (SC17).
The Center for Efficient Exascale Discretizations provides Exascale Computing Project applications with leading-edge simulation algorithms that can extract greater performance from exascale hardware than what is currently available.
February 6–8, 2018, at the Knoxville Convention Center, Knoxville, TN, the Exascale Computing Project (ECP) Second Annual Meeting will convene to highlight technical accomplishments that are being enabled by interactions and collaborations within the ECP community.
The US Department of Energy's booth (#613) at SC17 will be the setting for the Oak Ridge Leadership Computing Facility's 25th Anniversary celebration on November 13 at 7:15pm MST. The celebration, which will take place during the Opening Gala of the conference, will feature a short video, brief remarks, and a toast.
U.S. industries aim for faster, more efficient product design through exascale computing. By Jeremy Thomas, Lawrence Livermore National Laboratory Public Affairs. Computer-aided design and engineering have come a long
HPCwire covers the Exascale Computing Project's ATPESC (Argonne Training Program on Extreme-Scale Computing): https://www.hpcwire.com/off-the-wire/argonne-training-program-leaning-supercomputing-learning-curve/
From Science Daily: Assessing regional earthquake risk and hazards in the age of exascale. With emerging exascale supercomputers, researchers will soon be able to accurately simulate the ground motions of
Exascale Computing to Help Accelerate Drive for Clean Fusion Energy. By Jon Bashor, Lawrence Berkeley National Laboratory Computing Sciences. For decades, scientists have struggled to create a clean, unlimited energy
An article by Jon Bashor of Lawrence Berkeley National Laboratory Computing Sciences examines the need for exascale computing power to advance research on fusion reactor design and highlights the potential for collaboration with industry partners who will require this kind of power.
Using supercomputers to model and simulate plasma behavior, scientists have made great strides toward building a working fusion reactor. Exascale systems will bring the promise of fusion energy closer.
For Immediate Release: September 20, 2017. News Media Contact: Mike Bernhardt, bernhardtme@ORNL.gov, 503.804.1714. Department of Energy’s Exascale Computing Project (ECP) Names Doug Kothe as Director. OAK RIDGE, Tenn. –
HPCwire summarizes a recent report from the US Department of Energy's National Renewable Energy Laboratory that explains how supercomputing-led scientific advances could make wind a greater supplier of the country's energy needs.
Scientists developing applications for exascale systems depend on an intricate set of software that makes the computing system usable and the job of the application developer easier. The broad services this software provides are often collectively referred to as the software stack.
A team of researchers from Oak Ridge National Laboratory has been awarded nearly $2 million over three years from the Department of Energy to explore the potential of machine learning in revolutionizing scientific data analysis.
Posted on Federal News Radio by David Thornton, August 21, 2017, 3:33 pm: https://federalnewsradio.com/agency-of-the-month-shows/2017/08/energy-dept-shoots-for-exascale-computer-in-a-national-lab-by-2021/ “The goal of the Path Forward program is for the six companies who got contracts to develop
Oak Ridge National Laboratory has begun to install Summit. Besides providing unprecedented amounts of computational capacity for traditional high-performance computing applications, it will offer the largest platform in the world for deep learning workloads.
A recent EE Times blog post underscores the importance of supercomputing to society and the strength that public-private research partnerships such as the PathForward program of the Exascale Computing Project contribute to America's competitiveness and innovation.
The Trinity supercomputer’s two partitions sited at Los Alamos National Laboratory were merged, making both Xeon Haswell and Xeon Phi Knights Landing processors available for production computing on the laboratory’s classified network.
Scientists have developed new computer models to explore what happens when a black hole joins with a neutron star, the superdense remnant of an exploded star. A participant in the research from Lawrence Berkeley National Laboratory is working to build increasingly sophisticated models of neutron star mergers and supernovae through his involvement in the US Department of Energy's Exascale Computing Project.
A team of researchers from the Lawrence Berkeley National Laboratory has developed a new algorithmic framework called multi-tiered iterative phasing that utilizes advanced mathematical techniques to determine 3D molecular structure from very sparse sets of noisy, single-particle data.
A mathematical, or numerical, software library is a collection of algorithms and software that can make the development of scientific applications better, faster, and cheaper. The Extreme-Scale Scientific Software Development Kit project coordinates and enables dependable mathematical software libraries designed and developed for exascale platforms.
A new collaboration initiative brings together SLAC National Accelerator Laboratory, CAMERA, the National Energy Research Scientific Computing Center (NERSC), and Los Alamos National Laboratory as part of DOE’s Exascale Computing Project (ECP).
Wind is an abundant and secure energy resource and could one day supply up to 30 percent of electrical power in the United States. Scientists have made great improvements in wind turbine efficiency, but to advance wind energy and harness its full potential, they need exascale computing.
(This is an introductory-level article targeted at readers somewhat new to HPC and exascale.) Without the invisible infrastructure called the software stack, even the world’s fastest computer wouldn’t compute much
The website for the ECP's Center for Efficient Exascale Discretizations (CEED) co-design center is now live. CEED is one of the ECP’s five co-design centers established to overcome performance barriers.
Source: Tri-City Herald, Kennewick, WA. A contributed article from Steven Ashby, Director, Pacific Northwest National Laboratory: We often hear how supercomputers are used to tackle incredibly complex problems. They have
Department of Energy Awards Six Research Contracts Totaling $258 Million to Accelerate U.S. Supercomputing Technology. Quotes from vendors receiving awards: Advanced Micro Devices (AMD), Sunnyvale, California: “AMD is excited
HPCwire's John Russell interviewed Doug Kothe, the Exascale Computing Project's Director of Application Development, for the feature article "The Race to Build Exascale Applications": https://www.hpcwire.com/2017/05/29/doug-kothe-race-build-exascale-applications/
HPCwire covers the Exascale Computing Project: a presentation by ECP Director Paul Messina. John Russell provided a thorough recap of Messina's opening presentation at the Hyperion Research
Tom Evans from Oak Ridge National Laboratory discusses “Coupled Monte Carlo Neutronics and Fluid Flow Simulation of Small Modular Reactors (ExaSMR)” – research being conducted as part of the Exascale Computing Project.
LBNL, PNNL Researchers Make NWChem’s Planewave “Purr” on Intel’s Knights Landing Architectures. A team of researchers at the Lawrence Berkeley National Laboratory (Berkeley Lab) and Pacific Northwest National Laboratory (PNNL)
ECP has selected its fifth co-design center. It will focus on graph analytics: combinatorial kernels that play a crucial enabling role in many data analytic computing application areas as well as in several ECP applications.
A federally funded effort to deliver HPC systems 50 times faster than today's supercomputers has added a business perspective to the multiyear development project, which one industry observer said could reduce
Today the Exascale Computing Project announced the formation of the ECP Industry Council, an external advisory group of executives from some of the nation’s most prominent companies with a collaborative interest
Oak Ridge National Laboratory experts are playing leading roles in the recently established Department of Energy’s (DOE’s) Exascale Computing Project (ECP), a multi-lab initiative responsible for developing the strategy, aligning
Exascale Computing Project Q&A: Building science applications for a new era of supercomputers Applications are the tools that direct a computer to work on specific tasks. Scientific applications for high-performance