September 24, 2012
BOISE, Idaho, Sept. 24 — Boise State researchers are generating astonishing amounts of data, but they often face obstacles when it comes to effectively accessing and analyzing that data. That's about to change, now that a collaborative group of Boise State University faculty from engineering, biological sciences, geosciences and computer science has received a grant from the National Science Foundation (NSF) to build a new high-performance computing and visualization instrument.
When the project is completed, researchers across multiple disciplines will have access to vastly improved capabilities for tackling large computational problems.
The $555,384 grant was awarded under NSF's Major Research Instrumentation (MRI) Program. The funding will be used to build a 32-node GPU/CPU cluster with a data storage array and a 5-by-8-foot tiled display housed in a visualization theater setting. GPU computing uses graphics processing units (GPUs) together with conventional central processing units (CPUs) for faster processing of computational science and engineering problems. The GPU/CPU cluster will support parallel computing and rendering, data storage and high-resolution imaging.
“Without supercomputing resources, computational analysis and massive data stores can become more of a burden than a help,” said Inanc Senocak, associate professor of mechanical and biomedical engineering and the principal investigator on the project. “This new computing cluster will extend our range of exploration in science and engineering projects and substantially accelerate the time it takes to get results.”
Senocak and co-investigators Peter Müllner (materials science and engineering), Hans-Peter Marshall (geosciences), Julie Oxford (biology) and Tim Andersen (computer science) have proposed using the computing cluster to support research projects as diverse as wind energy forecasting, modeling for threat reduction in chemical and biological defense, materials characterization and modeling, snow hydrology and remote sensing, and mechanisms of skeleton development in living systems.
“The benefits of this grant will be felt far beyond Boise State University,” said Amy Moll, dean of the College of Engineering. “The plan is for the parallel computing and visualization cluster to be housed at a facility open to university researchers, as well as local technology companies and partners. This advanced cyber-infrastructure resource has the potential for a huge impact on our regional economy.”
Senocak said that the researchers involved in the grant also plan to make this cyber-infrastructure accessible to high school students in science, technology, engineering and math through outreach activities such as hands-on modeling and simulation exercises, visualization of earth and space science data, and high-resolution imagery.
“If we look beyond the obvious benefits to the researchers, we can only imagine the profound impact this kind of experience might have on the next generation of scientists and engineers,” he added.
Source: Boise State University