|The Leading Source for Global News and Information Covering the Ecosystem of High Productivity Computing / December 15, 2006|
Like movie makers who dazzle audiences with virtual imagery created with supercomputers, scientists rely on massive computational power to build digital worlds that simulate major earthquakes to help design better buildings, predict the path of plumes from smokestacks, and even recreate the birth of stars. And just like in the movies, bigger supercomputers help scientists move virtual worlds closer to reality.
Toward that end, the San Diego Supercomputer Center (SDSC) at UC San Diego announced today that its IBM eServer Blue Gene supercomputer has been tripled in size, giving a peak performance of 17.2 teraflops. It would take a person operating a hand-held calculator more than 500,000 years to do the calculations this supercomputer completes every second.
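The calculator comparison is easy to sanity-check. A minimal back-of-envelope sketch, assuming a person with a hand-held calculator completes one calculation per second (a rate the article does not state):

```python
# Rough check of the "500,000 years" claim, assuming one calculation
# per second by hand (an assumed rate, not from the article).
PEAK_FLOPS = 17.2e12            # 17.2 teraflops, per the article
CALCS_PER_SECOND_BY_HAND = 1.0  # assumed human-with-calculator rate
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = PEAK_FLOPS / CALCS_PER_SECOND_BY_HAND / SECONDS_PER_YEAR
print(f"{years:,.0f} years")    # roughly 545,000 years
```

At one calculation per second, matching a single second of the machine's peak output would indeed take on the order of 500,000 years.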
SDSC was the first academic institution to deploy a Blue Gene supercomputer. Since then, the Blue Gene architecture has been adopted more widely.
"As more scientists have used SDSC's Blue Gene Data system, we've had increasing demand for time on it, which led us to expand it," said Richard Moore, Director of Production Systems at SDSC. "With the expansion, we will have tripled the capacity, and scientists will routinely be able to get access to up to 6,144 processors with excellent overall performance, a rare opportunity that will enable new scientific breakthroughs."

SDSC's Blue Gene Data system has proven well-suited to applications in a range of disciplines, from physics and molecular dynamics to fluid dynamics and other fields. For example, scientists in a major particle physics collaboration are probing the ultimate building blocks of matter using the newly expanded machine. Scientist David Baker of the University of Washington can run his Rosetta protein structure prediction code on the Blue Gene system to design proteins more complex than previously possible, opening the way for new life-saving drugs. Researcher P. K. Yeung of Georgia Tech runs simulations of how substances mix in the chaos of turbulent flows, yielding important insights for such engineering applications as improving combustion efficiency. And Southern California Earthquake Center (SCEC) scientists create virtual earthquakes to guide preparations for the "big one" that threatens California.
"The most powerful supercomputers in the world, Blue Gene systems like the one at SDSC are at the forefront of enabling the next generation of computational science and engineering," said Dave Jursik, vice president of Deep Computing sales at IBM. "The architecture, optimized for bandwidth, scalability, and handling large amounts of data, supports the most advanced applications while consuming only a fraction of the power and floor space required by other systems."
SDSC's powerful IBM eServer Blue Gene system is housed in only three computer racks. Each rack holds 1,024 compute nodes and 128 I/O nodes, the maximum I/O-to-compute ratio the architecture supports, which is needed for data-intensive computing. Each node consists of two PowerPC processors that run at 700 megahertz and share 512 megabytes of memory, giving an aggregate peak speed of 17.2 teraflops.
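The quoted peak speed follows directly from the hardware counts. A short sketch of the arithmetic, assuming (as on Blue Gene/L generally) that each PowerPC core's dual-pipe floating-point unit can issue two fused multiply-adds, i.e. four floating-point operations, per clock cycle:

```python
# Reconstructing the 17.2-teraflop peak from the per-rack figures.
RACKS = 3
NODES_PER_RACK = 1024
CORES_PER_NODE = 2        # two PowerPC processors per node
CLOCK_HZ = 700e6          # 700 megahertz
FLOPS_PER_CYCLE = 4       # assumed: dual-pipe FPU, 2 fused multiply-adds/cycle

peak = RACKS * NODES_PER_RACK * CORES_PER_NODE * CLOCK_HZ * FLOPS_PER_CYCLE
print(f"{peak / 1e12:.1f} teraflops")  # 17.2 teraflops
```

Multiplying out gives 17.2 trillion floating-point operations per second, matching the figure in the article; the four-flops-per-cycle factor is an assumption about the processor, not something the article states.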
Blue Gene's efficiencies in power consumption, cooling, and space requirements are vital for institutions hosting large-scale computing resources. As the person responsible for managing SDSC's supercomputers, Moore is enthusiastic about this efficiency.
"We're very pleased at how cost-effective this upgrade is," said Moore. "We're adding more than 11 teraflops of computing power for scientists and engineers, with very little incremental system administration time or operations costs.
"It's also impressive that we were able to get the entire system into full production less than a week after the new racks were delivered. IBM has produced an easily extensible machine, and both IBM's personnel and our staff deserve a lot of credit for making this happen."
SDSC's Blue Gene Data machine is also playing an important role in the march toward "petascale" supercomputers -- systems that can run at the blinding speed of one thousand trillion calculations per second -- hundreds of thousands of times faster than a typical PC. SDSC staff has worked with users to scale three important science codes to run on up to 40,960 processors of the largest open system in the world, IBM's Blue Gene Watson system, which has a peak speed of 114 teraflops. The Rosetta protein structure prediction code, P. K. Yeung's turbulence simulations, and the SCEC earthquake simulations have all achieved unprecedented scaling, setting the stage for more accurate "virtual realities" that can illuminate the next generation of scientific progress.
Source: San Diego Supercomputer Center