December 13, 2010
Dec. 13 -- When you spot laborers pouring concrete for a new building or bridge, it may not occur to you that they are working with a substance so complex that it requires the world's most powerful computers to understand its behavior. But the Department of Energy (DOE) has just awarded a team of computer scientists, physicists and mathematicians at the National Institute of Standards and Technology (NIST) millions of hours of supercomputing time to analyze concrete flow, in the hope that this common material can be improved.
The DOE's INCITE award provides the NIST team with up to 75 million "processor hours" over three years on the IBM Blue Gene/P computer at Argonne National Laboratory near Chicago; the first year provides 25 million hours. One processor hour is equivalent to a single processor crunching numbers for 60 minutes, and the Argonne machine contains many thousands of processors that can run in parallel.
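To get a feel for the scale of the allocation, here is a back-of-the-envelope calculation. The processor counts below are illustrative assumptions, not the actual configuration of the Argonne machine:

```python
# Hypothetical illustration: how long a 25-million-processor-hour budget
# lasts at different levels of parallelism. Processor counts are
# assumptions for illustration only.

def wall_clock_days(processor_hours, num_processors):
    """Wall-clock days needed to consume a processor-hour budget
    when num_processors run in parallel around the clock."""
    return processor_hours / num_processors / 24

budget = 25_000_000  # first-year allocation in processor hours

# A single processor would need nearly three millennia of nonstop work;
# 10,000 processors running in parallel would burn through the same
# budget in roughly 104 days.
print(wall_clock_days(budget, 1) / 365)   # years on one processor
print(wall_clock_days(budget, 10_000))    # days on 10,000 processors
```

The point of the exercise: only a massively parallel machine makes a budget of this size usable within a three-year award period.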
The team's aim is to simulate and analyze the movement and interactions among the many particles and fluids that make up unset concrete -- by no means an easy task. The particles vary widely in size -- from large stones to grains of sand -- and the fluid in which they are suspended flows in ways that change depending on the pressure during stirring and mixing.
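The article does not say which constitutive model the NIST team uses, but one common simplified way to capture flow behavior that changes with stirring is a power-law (shear-thinning) fluid, where apparent viscosity drops as the shear rate rises. The sketch below, with purely illustrative parameter values, shows the idea:

```python
# A minimal sketch of shear-dependent flow using the power-law
# (Ostwald-de Waele) fluid model. The parameters k and n below are
# illustrative assumptions, not measured properties of any real mix.

def apparent_viscosity(shear_rate, k=10.0, n=0.5):
    """Apparent viscosity (Pa*s) of a power-law fluid:
    eta = k * shear_rate**(n - 1). With n < 1 the fluid is
    shear-thinning: it flows more easily the faster it is stirred."""
    return k * shear_rate ** (n - 1)

# Faster stirring (higher shear rate) makes the suspending fluid
# effectively thinner:
for rate in (0.1, 1.0, 10.0):
    print(f"shear rate {rate:5.1f} 1/s -> viscosity "
          f"{apparent_viscosity(rate):.2f} Pa*s")
```

A full simulation must couple a model like this to tens of millions of interacting suspended particles, which is what pushes the problem onto a supercomputer.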
All in all, it's a heap of mathematical trouble to model such a material, so why bother? Concrete's use stretches back millennia, but modern additives make it possible to create concrete that is not only strong but also translucent, flexible, or endowed with other qualities that make it as valuable to today's architects as it was to ancient Rome's. Concrete is a $100 billion industry in the United States alone, making it critical to stir up the best concrete possible to improve our infrastructure.
"You can mix in materials to make concrete better, but right now choosing these additives is based largely on guesswork. We'd like to be able to improve it intelligently," says NIST's William George. "This is by far the largest supercomputing project that NIST has ever taken on in terms of time and processing power, and we hope it will provide the world with more long-lasting, sustainable concrete."
Source: National Institute of Standards and Technology (NIST)