July 20, 2011
A group of researchers at Fermi National Accelerator Laboratory has been granted 80 million processor hours at the Oak Ridge Leadership Computing Facility (OLCF) and the Argonne Leadership Computing Facility (ALCF) to better understand how subatomic particles interact through the fundamental forces of nature.
Collectively, this research falls under the banner of quantum chromodynamics (QCD), which seeks to understand how the basic interactions among quarks and gluons can shed light on the dominant theories in physics.
Protons and neutrons contain smaller particles called quarks and gluons, which have been the subject of numerous studies as scientists seek to understand how these subatomic particles interact.
The scientists behind the project will rely on the power of the Cray XT4/XT5 at the OLCF and the IBM Blue Gene/P at the ALCF, which will allow them to produce “and validate the high-precision lattice QCD calculations that are essential to the analysis of recently completed experiments in high-energy and nuclear physics and other studies currently in progress.” As a release from OLCF stated, “Simulations are used to relate the fundamental theoretical equations governing quarks and gluons to predictions of physical phenomena made in laboratories.”
The research teams will use Monte Carlo techniques to model the random movement of particles. As OLCF explained:
“Employing fundamentals from the Standard Model of subatomic physics, team members are exploring quark properties and dynamics. They are trying to determine the mass spectrum and coupling of strongly interacting particles and the electromagnetic properties of particles made up of interacting quarks (baryons and mesons) to create an understanding of a nucleon’s internal structure.”
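The full lattice QCD machinery is far beyond the scope of a news item, but the core Monte Carlo idea the teams rely on can be sketched: configurations are sampled with probability weighted by the exponential of an “action,” typically via Metropolis-style accept/reject updates. A minimal Python illustration, using a hypothetical toy one-variable action S(x) = x²/2 rather than a real lattice field theory:

```python
import math
import random

def metropolis(action, x0=0.0, step=1.0, n_samples=100_000, seed=42):
    """Sample x with probability proportional to exp(-action(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.uniform(-step, step)      # propose a random move
        # Accept with probability min(1, exp(-(S_new - S_old)))
        if rng.random() < math.exp(-(action(x_new) - action(x))):
            x = x_new
        samples.append(x)
    return samples

# Toy "action" S(x) = x^2 / 2, so samples follow a standard Gaussian
samples = metropolis(lambda x: 0.5 * x * x)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
print(f"<x> ~ {mean:.3f}, <x^2> ~ {var:.3f}")   # expect roughly 0 and 1
```

In real lattice QCD the single variable x becomes an entire field configuration on a four-dimensional spacetime grid, which is why these calculations demand leadership-class supercomputers.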
The work was funded by the National Science Foundation and physics-related offices within the U.S. Department of Energy. According to Kathryn Jandeska, the use of the two supercomputing facilities has led to major advances in the field by yielding data precise enough to test current theories in high-energy and nuclear physics.
More on the theories and source of the research can be found in this detailed writeup from Oak Ridge National Laboratory.