APS News

October 2015 (Volume 24, Number 9)

Physicists Set Course for the Exascale

By Emily Conover

Image: The Titan supercomputer. Photo: Oak Ridge National Laboratory
The 17-petaflop Titan holds second place among supercomputers. Researchers now want to reach exaflop speeds.

The next frontier in supercomputing is the exascale — computers that can perform 10^18 floating-point operations per second, or exaflops. Such computers tantalize scientists from across a significant technology gap. But with President Obama’s recent announcement of a National Strategic Computing Initiative (NSCI), the machines are beginning to feel within reach. The much-anticipated supercomputers could be as few as ten years away, and physicists are already hatching schemes to take advantage of them.

“Exascale computing in many ways is a game-changer,” says Robert Roser, chief information officer at Fermilab. The NSCI calls for supercomputers that are at least a hundred times as powerful as the current generation and capable of working with exabytes of data.

Created by an executive order on July 29, the NSCI directs government agencies to work together to achieve exascale computing, citing the Department of Energy (DOE), Department of Defense, and National Science Foundation as major players in the effort.

Supercomputers are already essential tools in many fields of physics, from nuclear physics and fluid mechanics to particle physics, astrophysics, and cosmology. And the impact is just as large in other scientific disciplines — including climate science, neuroscience, and materials science — and in national security problems, such as maintaining the U.S. nuclear weapons stockpile now that treaties ban test detonations.

“It’s an impressive array of possibilities. I think the categories are going to grow in depth and become deeper and deeper,” says Douglas Kothe, deputy associate laboratory director of the Computing and Computational Sciences Directorate at Oak Ridge National Laboratory, who is spearheading applications development for DOE’s exascale initiative.

The top U.S. supercomputer is Titan, located at Oak Ridge. With over 17 petaflops, it is the second-most-powerful computer on the planet, according to the TOP500 ranking of the world’s supercomputers. As of June 2015, China holds the global top spot with Tianhe-2, which boasts a performance of over 33 petaflops — a meteoric ascent given that a dozen years ago China failed to break the top 50. Japan has also garnered first place in recent years.
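
For a rough sense of the gap, the peak figures quoted here can be compared directly. The short Python sketch below uses only the numbers in this article, with 1 petaflop taken as 10^15 operations per second and 1 exaflop as 10^18.

    # Rough comparison of today's leading machines with an exaflop target,
    # using the peak figures quoted in this article.
    PETAFLOP = 1e15   # floating-point operations per second
    EXAFLOP = 1e18

    machines = {"Titan": 17 * PETAFLOP, "Tianhe-2": 33 * PETAFLOP}
    for name, speed in machines.items():
        print(f"{name}: an exaflop machine would be about {EXAFLOP / speed:.0f}x faster")
    # Titan: an exaflop machine would be about 59x faster
    # Tianhe-2: an exaflop machine would be about 30x faster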

Titan’s second-place finish reveals one reason for the exascale push that goes beyond just the usefulness of the tool: “Internationally, this is a huge deal, because it’s very competitive,” says plasma physicist William Tang of Princeton University.

A number of challenges stand between us and exascale computing, says Steve Binkley, head of DOE’s Office of Advanced Scientific Computing Research. For one, the new supercomputers can’t be made just by beefing up current machines. Without new technology, an exascale computer would have a power consumption of hundreds of megawatts — consuming much of the output of a small nuclear reactor. Getting that number down to 20 MW is one goal of exascale pioneers.
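
The power target translates directly into an energy budget per operation. As a back-of-the-envelope illustration (taking 200 MW as a stand-in for “hundreds of megawatts,” which is an assumption rather than a figure from the initiative), the arithmetic works out as follows:

    # Energy per floating-point operation = power / (operations per second).
    EXAFLOPS = 1e18   # operations per second for an exascale machine

    def joules_per_flop(power_megawatts):
        """Convert a machine-level power draw into energy per operation."""
        return power_megawatts * 1e6 / EXAFLOPS

    print(joules_per_flop(200))   # 2e-10 J, i.e. ~200 picojoules per operation
    print(joules_per_flop(20))    # 2e-11 J, i.e. ~20 picojoules per operation

Under that assumption, reaching the 20 MW goal means cutting the energy cost of each operation by roughly an order of magnitude.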

Another sticking point is simply programming the machines. They will be massively parallel, with a billion calculations taking place at once. “Getting software that can do that effectively is a major challenge,” says Binkley. And ensuring the resiliency of exascale machines will also demand care. “Anytime you have a large system made up of many, many individual parts, getting reliable operation is hard,” Binkley says.

Overcoming these obstacles, Binkley says, will require roughly four years of R&D before DOE begins collaborating with computer vendors to work toward production of an exascale computer by the mid-2020s.

Applications of extreme-scale computing intersect nearly every area of physics. In nuclear physics, for instance, lattice quantum chromodynamics calculations are “an incredibly nonlinear problem; we can’t do it with pen and paper,” says David Richards of Thomas Jefferson National Accelerator Facility. The new machines will make possible higher-precision predictions of the interactions of quarks and gluons, as well as first-principles calculations of the properties of nuclei, says Martin Savage of the University of Washington. Three-dimensional simulations of core-collapse supernovae will likewise become more manageable. “These are things that are just a dream at the moment,” says Savage. “I’m looking forward to the machines hitting the floor.”

Supercomputers allow plasma physicists to make simulations of fusion reactors on the variety of distance scales relevant to the ultra-hot plasma within — from a tenth of a millimeter to meters in size. Better computers allow for more detailed simulations that more closely reproduce the physics, says Choong-Seock Chang of Princeton University. “With bigger and bigger computers, we can do more and more science, put more and more physics into the soup.” Plus, the computers allow scientists to reach their solution faster, Chang says. Otherwise, “somebody with a bigger computer already found the answer.”

High energy physicists are beginning to jump on the supercomputing bandwagon as well. The LHC currently relies on grid computing instead, harnessing a collaboration of computing centers across the globe. But as the LHC looks forward to a high-luminosity upgrade planned for 2020 that will boost its collision rate by a factor of 10 beyond the original design, the grid may not be able to keep up.

But it’s not just a problem of capacity, it’s also a problem of complexity, says Tom LeCompte of Argonne National Laboratory, who is working to make LHC code run on supercomputers. As LHC scientists simulate more complicated events, increasingly powerful supercomputers will be essential tools.

The NSCI doesn’t stop at the exascale. The end of Moore’s Law — which holds that the number of transistors on a chip, and with it computing power, doubles roughly every two years thanks to improvements in semiconductor technology — is looming. The NSCI tasks scientists with going beyond Moore’s Law to future computing technologies, possibly including quantum computing or neuromorphic computing, which attempts to mimic the nervous system. “A practical quantum computer is still years away, but it’s time to start investing in the necessary research now,” Binkley says.
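
For context, the doubling rule is easy to write down. The snippet below simply assumes an uninterrupted two-year doubling period, the trend that can no longer be taken for granted.

    # Moore's-law-style growth: performance doubles every two years.
    def growth_factor(years, doubling_period=2.0):
        return 2 ** (years / doubling_period)

    print(growth_factor(10))   # 32.0 -- a decade of doublings gives ~32x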

Exascale computers will invigorate a variety of fields, researchers say. “It enables the young people to do things that haven’t been done before and that brings a different level of excitement to the table,” Tang says. And exascale is only the beginning, he adds: “To me an exascale supercomputer is just a signpost along the way. Human creativity will drive you further.”
