April 01, 2011
The Department of Energy's Office of Science website currently offers as one of its feature articles a detailed look at how advances in high-performance computing have brought the power of simulation to bear on almost every facet of the scientific landscape. Dr. Steven E. Koonin, Under Secretary for Science, examines the link between computer simulation and scientific progress, citing a variety of real-world disciplines that have been enhanced by significant, sustained progress in the computational domain.
Koonin explains how the DOE makes supercomputing resources available for both scientific and industrial simulation endeavors. Last fall, Koonin's office held a Simulations Summit in Washington, which brought together more than 70 leaders from academia, industry, government, and national research laboratories to discuss how science and technology policies affect the nation's ability to compete on a global playing field. Keynote speaker Secretary Chu emphasized that "the DOE strategy should be to make simulation part of everyone's toolbox."
The Department of Energy's Office of Science (SC) is addressing that need by pushing the boundaries of computing and simulation to advance key science, math, and engineering challenges facing the nation. SC makes advanced supercomputers available and supports high-fidelity simulations that give scientists the power to analyze theories and validate experiments that are dangerous, expensive or impossible to conduct. Scientific simulations are used to understand everything from stellar explosion mechanisms to the quarks and gluons that make up a proton. They can tell us how blood flows through the body and how to make a more efficient combustion engine. And they can do much more.
Koonin goes on to list some of the merits of a fully supported national supercomputing strategy:
Improvements in high-performance computing benefit all computer users, not just those who use these world-class machines. Hardware innovation to drive down the energy consumption of processors and memory for exascale machines will be directly applicable to commodity electronics, making portable computers and smart phones much more powerful. Private sector consumers of high-performance computing use simulation to accelerate and reduce the cost of innovation in the design and manufacturing of their products, in applications stretching from advanced materials for engines and airplane wings to advanced chemicals for household products to the design of newer and faster consumer electronics.
More and more, scientific breakthroughs are predicated on continued, steady progress in computing. As Koonin notes, the US still leads the world in computing. Today's supercomputers are one trillion times faster than their 1950s counterparts, and more than half of the TOP500 systems originate in the US. Koonin credits the actions of the Department of Energy for much of this progress, but warns that continued government support is necessary to sustain the current trajectory: "A golden moment has presented itself to continue U.S. leadership in simulations," Koonin remarks, "but concerted action and continued DOE leadership are necessary to turn this opportunity into reality."
Full story at Department of Energy's Office of Science