November 08, 2011
WEST LAFAYETTE, IN, Nov. 8 -- For Tyler Reid, a junior in computer science from Zionsville, being a member of Purdue’s student supercomputing team means a chance to get his hands on some of the latest hardware, an opportunity too good to pass up.
The six-member team, which built its own supercomputer this semester, will be competing Nov. 14-16 in the 2011 Cluster Challenge, the student competition at SC11, the world’s largest supercomputing conference. The conference is being held in Seattle Nov. 12-18. ITaP is sponsoring the Cluster Challenge team with Intel.
“I love to solve problems on the fly,” says Reid, who’s on Purdue’s Cluster Challenge team for the first time. “I also am looking forward to competing against teams of students from all over the world.”
The Purdue team was one of eight qualifiers for the 2011 competition and one of just four from the U.S. It will compete against teams from China, Russia and other countries. Purdue’s is the lone team from the Big Ten.
In addition to Reid, the Cluster Challenge team members are Alex Bartol, a senior in computer science from Fort Wayne; John Blaas, a senior in computer and information technology from Lafayette; Joad Fattah, a junior in computer science from Carmel; Michael Heffernan, a senior in computer science from Kokomo; and Andrew Huff, a junior in computer science from Cary, N.C.
“This year's team has a nice mix of experience, ingenuity, and skills, with three of the members returning from last year's competition,” says Mike Baldwin, a Purdue atmospheric scientist serving as faculty advisor.
Intel is Purdue’s partner in the Cluster Challenge. With 160 of the company’s processors inside, a hundred times more than a typical personal computer, the 2011 entry is akin to a mini version of the Hansen cluster supercomputer Purdue installed over the summer and of the three other clusters ITaP has built in partnership with Purdue faculty since 2008. Researchers in earth and atmospheric sciences, chemistry, physics, computer science, aeronautics and astronautics, electrical and computer engineering and materials engineering, among other fields, use Hansen.
Likewise, the Cluster Challenge team — limited to undergraduates — has to prepare its machine to run an assigned selection of real research software crunching voluminous sets of sample data as quickly and efficiently as possible. The 2011 applications are used for studying the actions of chemical molecules, the colliding and merging of galaxies, the basic workings of biological life and the motion of the oceans. Most of this year’s Purdue team helped build Hansen as student workers at the Rosen Center for Advanced Computing, ITaP’s research computing unit.
“We have fewer nodes, more processing power per node, and we have solid state drives for each node,” Fattah said of this year’s Cluster Challenge entry. “So overall, everything is more power efficient and faster, with the bottlenecks in particular made both fewer and faster.”
The students share the load monitoring and adjusting their supercomputer around the clock as they work to process as much data as possible and to meet a 26-amp power usage limitation that’s a nod to what are becoming problematic energy demands from high-performance computing systems.
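Staying under the competition's 26-amp cap while maximizing throughput means continually translating per-node power readings into total current draw. As a rough illustration only, the check might look like the following sketch; the node names, wattages, and line voltage here are invented for the example, not taken from the Purdue team's setup.

```python
# Hypothetical sketch of the kind of check a team might run while
# monitoring a cluster's draw against a 26-amp power cap.
# Node names, wattages, and voltage are assumptions for illustration.

POWER_LIMIT_AMPS = 26.0   # competition limit cited in the article
LINE_VOLTAGE = 120.0      # assumed standard US line voltage

def total_amps(node_watts):
    """Convert per-node wattage readings to total current draw in amps."""
    return sum(node_watts.values()) / LINE_VOLTAGE

# Example readings (watts) from a hypothetical four-node cluster
readings = {"node1": 650.0, "node2": 640.0, "node3": 655.0, "node4": 630.0}

draw = total_amps(readings)
headroom = POWER_LIMIT_AMPS - draw
print(f"Drawing {draw:.1f} A of {POWER_LIMIT_AMPS} A limit "
      f"({headroom:.1f} A headroom)")
```

In practice the readings would come from metered power distribution units rather than a hard-coded dictionary, but the arithmetic is the same: total watts divided by line voltage must stay below the cap.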
The Purdue team began working on its machine even before the semester started and is tailoring the research software to run to best advantage on the cluster’s hardware.
“We are all much more familiar with the applications compared to last year,” Bartol says.
As part of their involvement in the Cluster Challenge, the students on the team are in a high-performance computing class taught by Baldwin, an earth and atmospheric sciences professor. But they spend hours outside class getting ready and have to juggle other classes, homework and exams to attend SC11 for the competition.
“It’s been a real eye-opening experience as far as how hard it is to get certain applications to run,” Blaas says. “But it’s also been pretty fun, actually going from the ground up.”
“I am graduating in December so being able to get this chance to delve deeper into high-performance computing has been very beneficial in developing skills,” Blaas adds. “I would like to stick to an academic computing environment, but I can see my skills being applicable to a lot of the bigger companies that are now offering cloud computing as a service.”
Like Bartol and Fattah, Heffernan is participating in his second Cluster Challenge. All three were team members in 2010.
“I decided to join again because not only did I have a great experience last year, I also learned a lot,” Heffernan says. “Participating in the Cluster Challenge has exposed me to areas of computer science that aren't taught in school. This makes me a more dynamic and well-rounded individual and potential employee.”
Huff, who hopes to pursue a doctorate in computer science, echoed those sentiments.
“Designing and building a small cluster sounded like an interesting endeavor,” Huff says. “It’s allowed me to dive into areas of high-performance computing that I normally don't get to touch so it's a great learning opportunity.”
Source: Purdue University