Note to U.S.: Uphold High-Performance Computing Leadership
May 19, 2016
Much is made about the U.S. imperative to maintain the strongest military, but in today’s technology-driven world, American exceptionalism in high-performance computing (HPC) may be just as important to the country’s long-term future.
HPC is the centerpiece of modern competitiveness, as essential to U.S. national security as it is to scientific discovery and commercial innovation. In particular, HPC is a critical enabler of new product design and development, serving as the platform for robust simulation studies that help manufacturers virtually validate and test ideas to come up with optimized solutions that aren’t feasible or cost-effective with traditional physical prototyping methods. America’s ability to maintain its leadership position developing HPC technology, including making it readily accessible to both large corporations and small-to-mid-size businesses, is a critical objective, and one that should not be taken lightly.
A recent report released by the Information Technology & Innovation Foundation (ITIF) raised concerns about an eroding U.S. HPC position as the race intensifies and countries like China, Japan and Russia amplify their own HPC development efforts. While the ranking of the world’s top five fastest supercomputers has remained static for the last few years, the deck was last shuffled with the June 2013 debut of China’s Tianhe-2, which boasts a peak theoretical performance speed of 54.9 petaflops. The Tianhe-2 is twice as fast as the second fastest supercomputer — the United States’ Titan — which operates at a maximum speed of 27.1 petaflops from its home base at the Oak Ridge National Laboratory in Tennessee.
As if that wasn’t alarming enough, China is said to be furiously at work to best its own record, planning to release a pair of 100-petaflop-capable supercomputers some time this year. While the United States is by no means standing still (the Department of Energy (DOE) has contracted with IBM and NVIDIA to launch two 150-petaflop supercomputers), those systems are not expected until the 2017-2018 timeframe. And the real long-term race, according to the ITIF report, will come down to which country develops the first “exaflop” HPC platform, a milestone being tackled not just by the United States, but by the European Union, China and Japan, all with a 2020 target date.
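To put those speeds in perspective, here is a minimal back-of-the-envelope sketch, using only the peak-performance figures cited above, of how the current machines compare and how far the exaflop milestone sits beyond them (an exaflop is 1,000 petaflops):

```python
# Illustrative arithmetic only, based on the peak theoretical figures
# quoted in the article (not measured Linpack benchmark results).
PETAFLOP = 1e15  # 10^15 floating-point operations per second
EXAFLOP = 1e18   # 10^18 FLOPS = 1,000 petaflops

tianhe_2 = 54.9 * PETAFLOP  # China's Tianhe-2, peak theoretical speed
titan = 27.1 * PETAFLOP     # U.S. Titan at Oak Ridge, peak theoretical speed

# Tianhe-2 is roughly twice as fast as Titan
ratio = tianhe_2 / titan
print(f"Tianhe-2 vs. Titan: {ratio:.1f}x")

# An exaflop system would be more than 18x faster than even Tianhe-2
exaflop_gap = EXAFLOP / tianhe_2
print(f"Exaflop vs. Tianhe-2: {exaflop_gap:.1f}x")
```

The gap illustrates why the report treats exascale as a long-term race rather than an incremental upgrade: the 2020 target machines would need an order-of-magnitude leap over today’s fastest system.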
There are other indications that the U.S. HPC standing could be in jeopardy. The Top 500 list of supercomputers, ranked every six months, has the United States at the top of the list with 199 supercomputers as of November 2015. However, that 199 figure represents the smallest number of supercomputers the country has placed on the list since the ranking came into being in 1993, and a 14% decline from November 2014, when it placed 231 supercomputers, according to research detailed in the report. China, flexing its HPC muscle, is sneaking up with 109 supercomputers listed in 2015.
Stephen Ezell, vice president of global innovation policy at ITIF and one of the lead authors of the report, contends we’re at an inflection point with HPC. “This is a technology where global leadership is fiercely contested,” he notes. “While the U.S. position remains strong, if we don’t commit ourselves to making continuing investments, we will rapidly continue to lose ground.”
Maintaining our leadership in both producing and using HPC technology is equally important. The manufacture of HPC equipment provides a robust source of employment, exports and economic growth for the United States. In addition, there are no guarantees that U.S. companies and research entities could get access to state-of-the-art HPC technology if it were primarily manufactured in another country, Ezell says. Finally, HPC systems are not produced in a vacuum — they are often the result of strong co-design partnerships between vendors and customers, and this symbiotic relationship is critical for pushing the frontier of HPC systems forward, he contends.
While the situation is by no means dire, the ITIF has made a credible case, both for the importance of HPC for competitive advantage and for why the United States needs to do everything in its power not to cede its lead. The Obama administration seems to get the message and has taken some pretty productive steps to make sure that doesn’t happen. The National Strategic Computing Initiative, launched in July 2015 by way of executive order, is a federally funded R&D effort spanning multiple agencies that is tasked with keeping the United States at the forefront of HPC development, including building exaflop supercomputers, making HPC resources more available to the public and private sector, and helping HPC application developers be more productive.
There are also numerous efforts underway to bring HPC resources to smaller companies. The National Center for Manufacturing Sciences (NCMS) has created a dozen centers across the United States to help connect manufacturers with HPC resources, and the Ohio Supercomputer Center’s AweSim program, a result of a partnership between the OSC and simulation experts, also assists SMEs with simulation-driven design.
Beyond those initiatives, the ITIF report offers a number of other recommendations to ensure that America has a bright HPC future, including:
- Hold hearings on the NSCI and the intensifying race for global HPC leadership to keep the issue front and center.
- Authorize and appropriate NSCI funding levels.
- Reform export control regulations to match the reality of current HPC systems.
- Continue work on technology transfer and commercialization activities at the country’s national labs.
- Emphasize HPC in federal worker training and retraining programs and in relevant Manufacturing Extension Partnerships.
About the Author
Beth Stackpole is a contributing editor to Digital Engineering. Send e-mail about this article to DE-Editors@digitaleng.news.