December 02, 2005
In honor of the 25th anniversary of a scientific paper describing the first use of Monte Carlo methods and lattice gauge calculations in the study of quantum chromodynamics (QCD) -- the theory that describes the interactions of subatomic particles -- scientists gathered at the U.S. Department of Energy's Brookhaven National Laboratory for a morning of talks to dedicate the newest supercomputer devoted to these studies. Known as the US QCDOC (quantum chromodynamics on a chip), the 10-teraflop computer is a sister to a similar machine dedicated at Brookhaven Lab in May and to another located at the University of Edinburgh in Scotland.
Like those machines, the US QCDOC was designed and built by a collaboration of Brookhaven Lab, Columbia University, IBM, the UKQCD collaboration in the United Kingdom, and RIKEN (the Institute of Physical and Chemical Research in Japan). Capable of performing 10 trillion arithmetic calculations per second, the $5-million machine is already being used by U.S. scientists for calculations in quantum chromodynamics.
Quantum chromodynamics is largely concerned with the interactions of quarks, the particles that make up the more familiar protons and neutrons in the nuclei of ordinary atoms. In 1974, Nobel laureate Kenneth Wilson, now at Ohio State University, proposed "placing" the quarks at discrete points on a geometric, crystal-like lattice to simplify the study of these interactions.
Brookhaven theoretical physicist Michael Creutz -- along with colleagues Claudio Rebbi, now at Boston University, and Laurence Jacobs, now at kdlabs AG in Zurich -- then demonstrated that one could compute the properties of these interactions using random numbers, or Monte Carlo techniques, to explore the vast range of possible values for the fields binding the quarks on the lattice. Creutz's seminal paper, showing that the force between quarks did not decrease even as they moved farther apart, was published in the journal Physical Review in 1980.
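The approach described above -- exploring the possible values of the fields on the lattice with random numbers -- can be illustrated with a toy model. The sketch below is a minimal Metropolis Monte Carlo simulation of a two-dimensional Z2 lattice gauge theory, a deliberately simplified stand-in for the SU(2) theory of Creutz's 1980 paper; the lattice size, coupling, and all function names are illustrative choices, not taken from the original work.

```python
import math
import random

# Illustrative sketch: Metropolis Monte Carlo for 2D Z2 lattice gauge theory.
# Links take values +1 or -1; the action is S = -beta * sum of plaquettes,
# where a plaquette is the product of the four links around a unit square.

L = 8        # lattice extent (assumed; small for speed)
BETA = 1.0   # inverse coupling (assumed)

def init_links(l):
    # links[x][y][mu]: mu = 0 points in x, mu = 1 points in y; "cold" start
    return [[[1, 1] for _ in range(l)] for _ in range(l)]

def plaquette(links, x, y, l):
    # product of the four links bounding the plaquette at (x, y),
    # with periodic boundary conditions
    return (links[x][y][0]
            * links[(x + 1) % l][y][1]
            * links[x][(y + 1) % l][0]
            * links[x][y][1])

def staple_sum(links, x, y, mu, l):
    # sum of the plaquettes containing link (x, y, mu);
    # in 2D every link borders exactly two plaquettes
    if mu == 0:
        return plaquette(links, x, y, l) + plaquette(links, x, (y - 1) % l, l)
    return plaquette(links, x, y, l) + plaquette(links, (x - 1) % l, y, l)

def metropolis_sweep(links, beta, l, rng):
    # visit every link once; flipping a link negates its two plaquettes,
    # so the change in action is 2 * beta * (sum of those plaquettes)
    for x in range(l):
        for y in range(l):
            for mu in (0, 1):
                d_action = 2.0 * beta * staple_sum(links, x, y, mu, l)
                if d_action <= 0 or rng.random() < math.exp(-d_action):
                    links[x][y][mu] *= -1

def avg_plaquette(links, l):
    # the basic observable: average plaquette over the lattice
    total = sum(plaquette(links, x, y, l) for x in range(l) for y in range(l))
    return total / (l * l)
```

Repeating `metropolis_sweep` many times generates field configurations distributed according to the lattice action, and averaging `avg_plaquette` over those configurations estimates a physical observable -- the same sampling strategy, in miniature, that the QCDOC machines apply to full QCD.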
"This paper became the most cited of the year, and Creutz's computational methods have since been applied to many other QCD calculations and a variety of theoretical problems in physics," said Sam Aronson, Brookhaven Lab's Associated Laboratory Director for High Energy and Nuclear Physics.
Creutz and fellow theoretical physicists from Brookhaven and other U.S. institutions will use the new machine to continue to explore the complex interactions of quarks, and to predict and better understand some of the properties now being observed in a new form of matter under study at Brookhaven's Relativistic Heavy Ion Collider (RHIC). Recent analysis has revealed that the "soup" of quarks created in this machine behaves more like a liquid than the gas that had been predicted. So-called "finite temperature QCD" calculations, currently being performed on the QCDOC machines, are aimed at determining the apparently complex structure of this liquid.
Though the QCDOC architecture was designed for efficient calculation of strong interactions, it turns out to have much broader applicability. The machines are currently being applied to problems in computational biology and chemistry as well as physics.
The US QCDOC machine was built with funding from the offices of High Energy Physics, Nuclear Physics, and Advanced Scientific Computing Research within DOE's Office of Science and receives operational funding from the offices of Nuclear Physics and High Energy Physics. It was built as part of the United States Lattice Gauge Theory Computational Program (USQCD) with the understanding that Brookhaven Lab would be one of three labs providing the primary computational resources for USQCD, the others being Thomas Jefferson National Accelerator Facility and Fermi National Accelerator Laboratory.