December 02, 2005
In a paper featured on the cover of the July 28, 2005 issue of Nature, an international group of researchers reported the first observation of geologically produced anti-neutrinos. The observation gives scientists new insight into the interior of our planet.
While the "geo-neutrinos" were detected at the KamLAND facility in Japan, most of the data were stored on the High Performance Storage System (HPSS) at the U.S. Department of Energy's National Energy Research Scientific Computing Center (NERSC) and analyzed using the PDSF cluster at NERSC. Together, these systems allowed scientists to find the scientific equivalent of a needle in a very large haystack.
KamLAND records data 24 hours a day, seven days a week. These data are shipped on tapes from the experimental site to LBNL, where they are read off the tapes and stored in the HPSS at NERSC. KamLAND records about 200 GB of data each day, and HPSS currently holds more than 250 TB of KamLAND data, making KamLAND the second-largest user of NERSC's HPSS system.
The KamLAND experiment, located in a mine in Japan, is a 1-kiloton liquid scintillator detector built to study anti-neutrinos coming from Japanese nuclear reactors, which are about 200 km from the detector. KamLAND is the first reactor experiment to observe the disappearance of electron anti-neutrinos between the reactors and the detector. Last year, the experiment also showed that the energy spectrum has a distortion typical of neutrino oscillation and measured the so-called mass splitting, a key parameter of neutrino oscillation.
During dedicated production periods at NERSC, the KamLAND data are read out of HPSS and run through the reconstruction software to convert the waveforms (essentially oscilloscope traces) of about 2,000 photo-multiplier tubes (PMTs) to physically meaningful quantities such as energy and position of the event inside the detector. This reduces the data volume by a factor of 60-100 and the reconstructed events are stored on disk for further analysis.
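The idea behind this reduction can be sketched in a few lines. This is only a toy illustration of the principle, not KamLAND's actual reconstruction code: every name, number, and waveform here is invented for the example. The point is that an oscilloscope-like trace of many samples per PMT collapses to a handful of physically meaningful numbers, which is where a volume reduction on the order of 60-100x comes from.

```python
# Toy sketch: collapse one PMT waveform (a list of digitized samples)
# to a pair of numbers, an integrated charge and a peak time.
# All names and values are illustrative, not from the experiment.

def reconstruct_hit(waveform, pedestal=0.0):
    """Collapse one PMT waveform to (charge, peak sample index)."""
    samples = [s - pedestal for s in waveform]
    charge = sum(samples)                    # integrated pulse area ~ charge
    peak_time = samples.index(max(samples))  # sample index of the pulse peak
    return charge, peak_time

# A fake 100-sample trace becomes just 2 numbers for this PMT:
trace = [0.0] * 40 + [1.0, 4.0, 9.0, 4.0, 1.0] + [0.0] * 55
charge, t = reconstruct_hit(trace)
print(charge, t)  # prints: 19.0 42
```

Doing this for each of the roughly 2,000 PMTs, and then combining the per-tube charges and times into one energy and position per event, is what shrinks the stored data so dramatically.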
"The event reconstruction requires a lot of computing power, and with over 600 CPUs, PDSF is a great facility to run these kinds of analyses," said Patrick Decowski, an LBNL physicist who works with NERSC staff on the project. "PDSF has been essential for our measurements."
With the data on disk, specialized analysis programs run over the reconstructed events to extract the geo-neutrinos and perform the final analysis. PDSF is also used for various simulation tasks in order to better understand the background signals in the detector.
"The whole analysis is like looking for a needle in a haystack - out of more than 2 billion events, only 152 candidates were found," Decowski said. "And of these, 128 - plus or minus 13 - are background events."
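Decowski's numbers can be turned into a rough signal estimate. The sketch below is only a simple quadrature calculation using the counts from the quote; it is not the full likelihood analysis the collaboration actually performed for the Nature paper.

```python
import math

# Event counts from the quote above:
candidates = 152        # candidate events found
background = 128        # expected background events
background_err = 13     # uncertainty on the background estimate

signal = candidates - background  # central estimate of geo-neutrino events

# Combine the Poisson fluctuation on the observed count with the
# background uncertainty in quadrature (a rough, illustrative error).
signal_err = math.sqrt(candidates + background_err**2)

print(f"geo-neutrino signal: {signal} +/- {signal_err:.0f} events")
# prints: geo-neutrino signal: 24 +/- 18 events
```

Even this back-of-the-envelope version shows why the measurement is hard: the signal is comparable to its own uncertainty, which is why the sophisticated statistical treatment and the large simulated background samples run on PDSF matter.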
Despite the poor signal-to-background ratio of these early measurements, they are exciting: the data open up a completely new way to study the Earth's interior. Forty years ago, the late John Bahcall proposed studying neutrinos coming from the sun to understand the fusion processes inside it. The measurement of a persistent deficit in the observed neutrino flux relative to Bahcall's calculations led to the 2002 Nobel Prize for Ray Davis and to the discovery of neutrino oscillation.
Today, anti-neutrinos are being used to study the interior of the Earth, which remains largely unknown. The deepest borehole ever drilled reaches less than 20 km, while the radius of the Earth is more than 6,000 km. While seismic waves have been used to deduce the makeup of the Earth's three basic regions - the core, the mantle and the crust - there are no direct measurements of the chemical composition of the deeper regions.
An important measurement for understanding the Earth is that of the heat flux coming from within. These measurements show that the Earth produces somewhere between 30 and 45 TW of heat. Two important sources of this heat are the primordial energy released during planetary accretion and the latent heat released as the core solidifies. However, radiogenic heat (heat from radioactive decay) is also believed to play an important role in the Earth's heat balance, contributing perhaps half of the total.
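As a quick illustration of what those figures imply, the sketch below combines the article's numbers; the one-half fraction is the article's rough estimate, not a measured value.

```python
# Rough radiogenic contribution implied by the numbers above:
# total heat flux of 30-45 TW, with radioactivity contributing perhaps half.
radiogenic_fraction = 0.5  # the article's rough "perhaps half" estimate

for total_tw in (30, 45):
    radiogenic_tw = total_tw * radiogenic_fraction
    print(f"total {total_tw} TW -> roughly {radiogenic_tw} TW radiogenic")
```

So under these assumptions, radioactive decay would account for somewhere around 15-22 TW, which is the quantity geo-neutrino measurements ultimately aim to pin down.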
Neutrinos can help in understanding the Earth's internal structure and heat generation. Three important isotopes in current Earth models - potassium, uranium and thorium - produce electron anti-neutrinos in their radioactive decays. These anti-neutrinos (so-called geo-neutrinos) interact only very weakly with the surrounding Earth material, and almost all of them reach the surface of the Earth.
However, occasionally they do interact with ordinary matter, and by building a large enough detector, scientists can learn something about the abundance of these isotopes. This allows them to study part of the composition of the Earth and, most importantly, to estimate the amount of heat produced through radioactive decay. The research is a multinational effort: the Nature article represents the work of 87 authors from 14 institutions across four nations.
This is a reprint of an article originally published by Berkeley Lab Computing Sciences.