August 19, 2010
OAK RIDGE, Tenn., Aug. 19 -- Buried in mountains of meteorological and hydrological data are likely clues that could help in predicting floods, hurricanes and other extreme weather events.
Through a new multi-institution project that includes the University of Tennessee and the Department of Energy's Oak Ridge National Laboratory, Auroop Ganguly and colleagues plan to use data mining techniques to enhance the accuracy of climate and earth system models. The $10 million project is funded by the National Science Foundation and led by the University of Minnesota. This project is one of the largest investments made by NSF's Directorate for Computer and Information Science & Engineering.
"We want to be able to predict large shifts in regional climate patterns or statistical attributes of severe meteorological and hydrological events with greater accuracy to assist decision makers," said Ganguly, a senior staff member in ORNL's Computational Sciences and Engineering Division and faculty member at UT.
Ganguly noted that valuable information may be hidden within massive volumes of model data, such as temperature and humidity profiles or sea surface temperatures. If data mining algorithms can be made sophisticated enough to capture the complexity of climate data, the extracted patterns could help improve predictions of variables crucial to impacts, such as extreme rainfall or hurricanes.
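As a rough illustration of the kind of mining described above (a minimal sketch, not the project's actual method — the grid, data, and thresholds here are all hypothetical), one simple extreme-event statistic is the empirical high-percentile cutoff of rainfall at each grid cell, and how often model output exceeds it:

```python
# Hypothetical sketch: flag "extreme rainfall" days in simulated gridded
# model output using a simple empirical-percentile cutoff per grid cell.
import random

random.seed(42)

# Simulated daily rainfall (mm) for a small 4x4 grid over 1000 days;
# real climate model output would be far larger and higher-dimensional.
days = 1000
grid = [(i, j) for i in range(4) for j in range(4)]
rainfall = {cell: [random.expovariate(0.1) for _ in range(days)]
            for cell in grid}

def extreme_threshold(series, q=0.99):
    """Empirical q-quantile of the series, used as the 'extreme' cutoff."""
    ordered = sorted(series)
    return ordered[int(q * (len(ordered) - 1))]

def exceedance_rate(series, threshold):
    """Fraction of days strictly above the cutoff: a basic extremes statistic."""
    return sum(x > threshold for x in series) / len(series)

# Per-cell 99th-percentile cutoff and its exceedance rate.
stats = {}
for cell, series in rainfall.items():
    t = extreme_threshold(series)
    stats[cell] = (t, exceedance_rate(series, t))
```

Comparing such per-cell extreme-event statistics between model runs and observations is one simple way mined patterns could feed back into model evaluation; the project's actual algorithms would be far more sophisticated.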
"Recent research appears to suggest this may be possible, but a systematic approach has been lacking," Ganguly said. "This is precisely what this NSF proposal aspires to achieve through innovative approaches in computational data sciences."
Drawing on the traditional strengths of UT and ORNL in climate modeling and impacts assessment, as well as newer work in areas such as knowledge discovery, Ganguly expects the collaboration to push these boundaries and develop a fuller understanding of the variables driving climate change.
ORNL's Oak Ridge Climate Change Science Institute, or CCSI, is helping organize the lab's contributions to the project as part of its goal to integrate scientific projects in modeling, observations, and experimentation with ORNL's powerful computational and informatics capabilities to answer some of the most pressing global change science questions.
The researchers propose to leverage the Leadership Computing Facilities at ORNL, which are funded by the Department of Energy's Office of Science.
The interdisciplinary project team features 13 researchers from seven institutions, who hope to build a bridge between credible projections from physics-based climate models and crucial requirements for impacts and integrated assessment to inform stakeholder and policy needs.
"A data-driven approach to informing societal decisions can be a significant step forward for policy making as well as for sustaining the nation's critical infrastructures and key resources," Ganguly said.
ORNL is managed by UT-Battelle for the Department of Energy's Office of Science.
Source: Oak Ridge National Laboratory