October 23, 2013
Illinois Senator Dick Durbin likes supercomputers, or more to the point, he likes what they can do to boost innovation and competitiveness. At the July 1st dedication ceremony for Mira, the IBM BlueGene/Q supercomputer installed last year at Argonne National Laboratory in Illinois, Senator Durbin said:
"They know the cost [of supercomputers] but they don't know the value. We really need to educate members of Congress. This supercomputing competition is really key to America's competitiveness, and to a lot of breakthroughs that will benefit the whole world."
More recently, Senator Durbin penned an article on the importance of HPC along with Dr. Eric Isaacs, director of Argonne National Laboratory. The esteemed duo recalled some of the many wins for supercomputing: helping the GE Global Research team develop quieter, more fuel-efficient jet engines, optimizing Illinois' power grid, among other difficult challenges.
They note that "across the country, researchers and private companies are using supercomputers built by the U.S. Department of Energy to conduct cutting-edge research and solve tough problems on a scale never before seen." Still, despite decades of supercomputing prominence, the US has ceded the number-one spot in the world rankings and put its broader leadership position in jeopardy.
Senator Durbin and Dr. Isaacs write: "Currently, our nation is home to 252 of the 500 fastest computer systems worldwide. But China, Japan, India, Russia and the European Union are making investments that challenge U.S. leadership in this vital field. In June, the new Tianhe-2 supercomputer at the National University of Defense Technology in China vaulted to the top of the world rankings. At its peak speed, this new Chinese computer is almost four times faster than its nearest rival.
"South Korea, Japan and China are doubling their investments in supercomputing because it saves time, money and energy, strengthening a country's economy and national security. The United States can't afford to fall behind."
The coming exascale era poses a difficult challenge, but there is no better opportunity, one teeming with both real-world and symbolic significance, than coming together in support of this goal. As President Obama has stated, this is our generation's Sputnik moment.
There is already legislation pending that sets a course toward fielding an exascale-class supercomputer. The US Senate's 2014 Energy Appropriations bill would provide $150 million for the Exascale Computing Initiative at DOE. Also up for consideration by the Senate is the ExaSCALE Computing Leadership Act of 2013, a mechanism for public-private partnerships to research and develop next-generation exascale-class systems by addressing the major technological barriers in power, hardware, reliability, memory and software.
"As we look ahead," the authors write, "it's clear that the road to exascale computing will require substantial public investment. But that expense must be measured against the enormous long-term costs our nation will face if we abandon our quest for leadership in high-performance computing. The country that wins the race to achieve exascale computing capabilities will gain significant intellectual, technological and economic advantages over other nations."