August 12, 2010
Dr. Tim Killeen, representing the National Science Foundation (NSF), last week addressed the annual TeraGrid '10 conference in Pittsburgh, Pa. His keynote emphasized the urgent need for sustainable cyberinfrastructure in the geosciences and across all domains of science.
"The geosciences is a domain in which cyberinfrastructure is incredibly important," Killeen, the NSF assistant director for geosciences, said. "There is need for end-to-end cyberinfrastructure that is accessible to brilliant young career professionals across the country. We need the capabilities now."
The NSF and funding agencies from countries including Brazil, Australia, Russia, Canada, France, Germany, Great Britain, and Japan one year ago declared that they would work collectively "to deliver knowledge to support human action and adaptation to regional environmental change" for global issues such as climate change and the availability of fresh water on the planet.
"We don't have a century to get this right," Killeen said. "We need the resources, sustained investments, smooth transitions, and accessibility of these resources to the brain trust of the nation and internationally. It's amazing when you look at the strategic plans of other countries and see how parallel they are to our own."
The crux of the challenge is developing an earth-human knowledge management system (aka "Earth-Cubed") to support a more complete understanding of the earth system and the human interactions with that system. "It's a scientific and technical challenge for the 21st century."
Killeen asked each conference participant to consider their role and the TeraGrid's role with regard to "Earth-Cubed" and cited the geosciences as an example of the domain requirements and cyberinfrastructure needs put on the TeraGrid community.
"I have yet to see a high-performance computing center that doesn't use the geosciences as a driver or rationale as to why we need this type of capability," he said.
Yet, focusing on sustainability is going to stretch the NSF and other agencies to do this in a robust way, Killeen said. "It's going to place demands on the types of products and services that come out of TeraGrid. Cyberinfrastructure's interface with science and society is going to be challenging -- no question about it."
Currently, the NSF's Geosciences Directorate invests 10 percent of its overall budget in cyberinfrastructure, in addition to the investments made by the NSF Office of Cyberinfrastructure (OCI). "We like to invest in multiple approaches with multiple outcomes...things that enhance productivity and capability. We look to the community for direction and priority. We like to understand the full life cycle costs and process. We want to anticipate increases in demand, make new investments, and address workforce issues."
It's a very exciting time for the geosciences, according to Killeen. There are new ways of looking at the Earth system and new methods for addressing much tougher problems that require advanced computing simulations. In addition, the data volumes are large and the data return is in the upper 90th percentile. "It's about the data and the value that data brings to understanding. Data-intensive computing is a very high priority," he said.
According to Killeen, on-demand, global experiments in the geosciences are becoming the norm. But, the changing context as it relates to human interactions with the Earth system is much more complex. It requires another level of integrated assessment models that can only be achieved through advanced computing simulation.
The next generation of models will be able to resolve detailed processes in the oceans, atmosphere and land. "We've reached this threshold, in part, by TeraGrid's efforts in building the appropriate cyberinfrastructure and computational capability. We're coming to a point where the sensor arrays and the development of the earth models are maturing at the same time. Models have become substantially more complete; they drive the most capable computation."
Killeen said the common architecture for cyberinfrastructure includes hardware, software, data provision, networking, sensor deployment, model assimilation, middleware, tools, Web services, cloud computing, free global access to information, and single-password authentication, as well as integrated services (data and instrument services, governance activities, computational expertise).
"Overall, it's a sustained investment and a balanced approach to drive transformation in the scientific disciplines. This is what NSF wants to get to...it involves training people to address incredibly important societal challenges with all the tools at our command. It's going to be the cyberinfrastructure that transforms the geosciences and takes it to the next level. We're now poised to do it in the next 10 years."