December 09, 2005
The National Center for Supercomputing Applications (NCSA) is leading the development of data-management and data-processing tools for the Dark Energy Survey (DES), a multi-institution collaboration aimed at discovering the nature of "dark energy," a mysterious substance discovered in 1998 that is causing the expansion of the universe to accelerate rather than slow down. The data pipeline is currently undergoing its first phase of testing at NCSA.
To try to understand dark energy, a new large (0.5 gigapixel) camera will be installed on a National Optical Astronomy Observatory (NOAO) telescope in Chile. Beginning in 2009, this telescope will deeply image 10 percent of the sky for 500 nights, producing a precise picture of the largest-scale structures in the universe and a detailed measurement of how those structures have evolved over cosmic time. Scientists believe this data will lead to insights about the nature of dark energy.
The telescope will generate a flood of data to be stored, managed, shared, mined, and visualized. In collaboration with scientists from the University of Illinois, Fermilab, and NOAO, NCSA is leading the development of cyberinfrastructure for data management and processing. The DES camera will send data to NCSA for archiving and processing using robust, automated grid-based tools, and the processed data will be served to the research community through an archive system deployed at NCSA.
This data-processing pipeline is undergoing its first phase of testing at NCSA. Simulated images of about 1 percent of the sky, generated at Fermilab, are being routed through the data-processing pipeline to test its performance and scalability. Raw data is moved from the archive at NCSA to the TeraGrid, the National Science Foundation's network of distributed computational systems, for processing. The processed data is then moved back to the NCSA archive, where the measured positions and brightnesses of about 50 million objects are added to a database that is used to test the accuracy of the pipeline. The results of these tests will be used to refine the development model leading to the next Data Challenge in October 2006.
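The workflow described above — staging raw images from the archive to a compute site, reducing them, returning the products, and cataloging measured objects in a database — can be sketched in miniature. This is purely an illustrative sketch; the function names, site names, and SQLite schema below are hypothetical stand-ins, not the actual DES middleware or its data model.

```python
# Illustrative sketch of an archive -> compute -> archive -> catalog pipeline,
# loosely following the DES data flow described above. All names here
# (transfer, reduce_image, site labels, the `objects` schema) are invented
# for illustration and do not reflect the real DES software.
import sqlite3

def transfer(image, source, destination):
    """Stand-in for a grid file transfer between sites."""
    print(f"moving {image}: {source} -> {destination}")
    return image

def reduce_image(image):
    """Stand-in for image reduction: returns measured objects as
    (object_id, ra_deg, dec_deg, magnitude) tuples."""
    return [(f"{image}-obj{i}", 54.6 + i * 0.01, -35.4, 22.5 - i * 0.1)
            for i in range(3)]

def run_pipeline(raw_images, db_path=":memory:"):
    """Drive each exposure through staging, reduction, and catalog ingest."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS objects
                   (object_id TEXT PRIMARY KEY,
                    ra_deg REAL, dec_deg REAL, magnitude REAL)""")
    for image in raw_images:
        staged = transfer(image, "archive", "compute-site")   # stage out
        objects = reduce_image(staged)                        # process
        transfer(image, "compute-site", "archive")            # stage back
        con.executemany("INSERT INTO objects VALUES (?, ?, ?, ?)", objects)
    con.commit()
    return con

con = run_pipeline(["exposure-001.fits", "exposure-002.fits"])
count = con.execute("SELECT COUNT(*) FROM objects").fetchone()[0]
print(count)  # 6 objects cataloged
```

In the real system the per-object catalog is what enables the accuracy tests mentioned above: measured positions and brightnesses can be compared directly against the known inputs of the simulated images.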
The DES data management project is led by University of Illinois at Urbana-Champaign astronomy and physics professor Joe Mohr and NCSA project manager Cristina Beldica. The DES project is adopting and testing prototyped middleware developed by the Large Synoptic Survey Telescope (LSST) group at NCSA, which is led by Ray Plante.