April 11, 2013
Now that the science teams working around the Large Hadron Collider (LHC) have confirmed the existence of the Higgs boson, there are questions about what will be researched next as the facility is shut down for two years to perform hardware upgrades.
Helping chart the course of the LHC’s next research effort is Gordon, a supercomputer at the San Diego Supercomputer Center (SDSC) on the campus of the University of California, San Diego (UCSD).
Through a partnership between the UCSD physics department and the Open Science Grid (OSG), a multidisciplinary effort funded by the US Department of Energy and the National Science Foundation, Gordon has been processing the massive datasets generated by one of the LHC’s particle detectors, the Compact Muon Solenoid (CMS).
Gordon demonstrated its scientific computing prowess by processing 125 terabytes of data over four weeks, reportedly making it available for analysis months ahead of schedule. According to UCSD, the work consumed about 1.7 million core hours on Gordon, roughly 15 percent of the supercomputer’s compute capacity.
“With only a few weeks’ notice,” said Frank Wuerthwein, physics professor at UCSD and member of the CMS project, “we were able to gain access to Gordon and complete the runs, making the data available for analysis in time to provide crucial input toward international planning meetings on the future of particle physics.”
That future might include the search for dark matter, one that could prove orders of magnitude more difficult and complex than the search for the Higgs boson. The properties of the Higgs were predicted by the Standard Model of particle physics, so physicists had a clearer idea of what to look for.
With dark matter, a substance estimated to account for as much as 90 percent of the universe’s mass, the properties are far less clear. A principle called supersymmetry could serve as the theoretical guide. However, according to Wuerthwein, no physical evidence yet exists to support the hypothesis that all fundamental forces were unified at the moment of the Big Bang.
Whichever hypothesis or principle ends up driving physics forward, Gordon is likely to play a part in that research as scientists convene to plan the future of particle physics. “Giving us access to the Gordon supercomputer effectively doubled the data processing compute power available to us,” adds Lothar Bauerdick, OSG’s executive director and the U.S. software and computing manager for the CMS project. “This gives CMS scientists precious months to get to their science analysis of the data reconstructed at SDSC.”