September 23, 2005
From Sept. 11 to Sept. 15, 80 meteorologists and HPC experts from 12 countries on five continents attended the biennial CAS 2005 workshop on the use of HPC in meteorology, held at the idyllic Imperial Palace Hotel in Annecy, France, and organized by the National Center for Atmospheric Research (NCAR).
This excellent, relatively small and friendly workshop provided a tour de force in meteorological and computing techniques by active practitioners striving to exploit the latest HPC technology to refine and improve their climate prediction models. It was augmented by talks from broader scientific centers of excellence, such as NERSC and ORNL from the U.S. and CCLRC from the UK, presenting the e-science program. Most presenters came from sites in the U.S. with large IBM P3/4/5 systems, while the European contingent included a strong representation from sites with large NEC SX-6 and SX-8 systems. This article highlights a few of the many climate issues raised by presentations given at this workshop.
There were 39 presentations in four and a half days, some describing the Grid's enabling potential for international collaboration within the climate system modeling community. The talks were crammed with technical information on how to use parallel supercomputers for computation with mathematical models that describe climate and weather patterns over time. They were interspersed with weather maps and video footage from simulations, which were compared with satellite pictures of actual weather events.
Why are meteorologists doing all this Earth System Modeling and what is the urgency? Dramatic reports of flooding and other climate change events now appear frequently in the press and on television. Climate simulations show that intensely hot summers and increased rainfall, with consequent flooding, are likely to become more common. These images are injecting a political dimension into the proceedings.
The destruction of New Orleans by hurricane Katrina provides a cautionary tale. The Earth System Models correctly predicted the path of Katrina days before it hit the historic city. Predictions were communicated to the authorities, yet the emergency response infrastructure failed in its mission to minimize damage to property and enable the prompt evacuation of citizens with minimal loss of life and suffering. It is easy to scapegoat individuals for this failure, but the reality goes much deeper. It is a systemic failure in how quickly scientific discovery is inserted into public infrastructure for the benefit of society at large. Reporting new knowledge in scientific forums is not enough. It requires effort to engage the political class so it can take ownership of this knowledge and let it inform policy for risk management infrastructure, so that it becomes part of the fabric of the emergency response process. In the case of Katrina, the fact that the Bush administration is in denial about global warming exacerbated this lack of preparation. Such a denial mindset inevitably downgrades the fiscal requirements for dealing with potential risk. For the people of New Orleans, the outcome was a catastrophic tragedy.
Experiments coordinated by the Intergovernmental Panel on Climate Change (IPCC) were completed this summer, and the fourth assessment report, to be published in 2007, is being prepared. The scientific results from these experiments predict a grim future. The trend is clear: more extreme weather, flooding, droughts, stronger and more frequent hurricanes, and climate changes involving desertification and higher sea levels are inevitable during this century. The experiments show that human activity is contributing to global warming. The reduction of snow cover in the north, the melting of glaciers, the projection of no snow in the north by the year 2100 (even at the North Pole) and the implied rise in sea level raise questions about the state of the atmosphere, oceans, sea ice, land surfaces and humankind. In short, there is a perceived pending catastrophe, because of global warming exacerbated by greenhouse gases and other pollutants from human activities.
Some scenarios show that sea level rise alone could deprive a billion people of food in the next 100 years. Insurance companies cannot protect against consequences of this magnitude. Thus the stakes are high, and finding answers to the socio-economic effects of climate change has climbed to the top of the political agenda, though sadly not in the U.S., which generates more than a quarter of the world's pollutants and greenhouse gases, as noted in the communiqué from July's G8 meeting at Gleneagles.
The key goal of the climate change efforts is to develop and enhance our capability to monitor and predict how the Earth System is evolving. On seasonal and inter-annual timescales, weather forecasting and climate change predictions are dominated by the initial conditions of the atmosphere and oceans, and by forcing factors, both naturally occurring and human-induced.
Warren Washington, chairman of the NSF's National Science Board, presented the findings of his team at NCAR in a talk called "IPCC climate change simulations of the 20th and 21st century: Present and Future."
"The Community Climate System Model (CCSM), has produced one of the largest data sets for the IPCC fourth assessment," he said. "As a result of this and other assessments, most of the climate research science community now believes that humankind is changing the earth's system and that global warming is taking place."
CCSM is a comprehensive system for simulating the past, present and future climates of the Earth. It grew out of a collaborative development effort involving NCAR, university investigators and scientists from several U.S. federal agencies. One of CCSM's distinguishing features is that the complete source code, documentation and simulation data sets are freely distributed to the international climate research community. It initially consisted of four major components representing the atmosphere, ocean, sea ice and land surface. The exchange of energy, water and other constituents at the interfaces among these components is simulated using a flux coupler.
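The coupler architecture described above can be sketched in miniature. The class and field names below are purely illustrative, not CCSM's actual code or API, and a real coupler also handles regridding between the components' different grids, which is omitted here; this is just the pattern of components exchanging fluxes only through a central coupler.

```python
# Minimal sketch of the flux-coupler pattern (illustrative; NOT CCSM's API).
# Each component advances independently; the coupler mediates every
# exchange at the interfaces, so components never call each other directly.

class Component:
    def __init__(self, name):
        self.name = name
        self.exports = {}  # fields this component publishes at its interface

    def step(self):
        # A real model would integrate its governing equations here;
        # this stub just publishes a placeholder surface field.
        self.exports["surface_temp"] = 288.0  # Kelvin, placeholder value

class FluxCoupler:
    def __init__(self, components):
        self.components = components

    def exchange(self):
        # Gather exported fields from all components so each one can be
        # handed the interface fields it needs from the others.
        return {c.name: dict(c.exports) for c in self.components}

# Four-component layout analogous to the one described in the article:
parts = [Component(n) for n in ("atm", "ocn", "ice", "lnd")]
coupler = FluxCoupler(parts)

for c in parts:
    c.step()
state = coupler.exchange()
```

The design point is the hub-and-spoke topology: adding a fifth component (say, land ice) means registering it with the coupler, not modifying the other four models.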
The current version of the model, called CCSM3, has been developed to facilitate work on a variety of scientific problems. These include the interactions between aerosols and climate, the relative importance of natural and anthropogenic forcing from the last millennium, and the nature of abrupt climate change. Results from CCSM3 form the basis for NCAR's contribution to forthcoming international (IPCC and WMO) climate fourth assessments. This talk chronicled the major new features and improvements in CCSM3 relative to its predecessors.
These include new radiation and cloud parameterizations in the atmosphere, heating of the ocean surface by chlorophyll, and detailed vegetation ecology. The improvements in simulations of present-day climate produced by the new model physics were illustrated with recent coupled experiments. Global and regional climate aspects investigated using the model include El Niño, La Niña, the monsoons, the North Atlantic Oscillation and the Arctic Oscillation.
The controversy over global warming was settled in 2005. With more greenhouse gases, climate models project a troposphere temperature increase, a stratosphere temperature decrease, a surface temperature increase, and a troposphere that warms more than the Earth's surface. Observations show that since 1960 the surface and troposphere have warmed at about the same rate, and there have been strong decreases in stratosphere temperature and increases in tropopause height since 1979 (T. Karl, NOAA).
Climate change scenarios show that: "At any point in time, we are committed to additional warming and sea level rise from the radiative forcing already in the system. Warming stabilizes after several decades, but sea level from thermal expansion continues to rise for centuries. Each emission scenario has a warming impact."
Climate models can be used to provide information on changes in extreme events such as heat waves. Heat wave severity is defined as the mean annual warmest three-day nighttime-minima event. The model's output compares favorably with observed present-day heat wave severity. In a future, warmer climate, heat waves become more severe in southern and western North America, and in the western European and Mediterranean regions.
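The severity metric just defined can be computed directly from a daily series of nighttime minimum temperatures: for each year, take the warmest three-day running mean of the minima, then average those annual maxima across years. A rough sketch (the function name and toy data are illustrative, not from the talk):

```python
# Heat wave severity as defined above: mean over years of the warmest
# 3-day running mean of nighttime minimum temperatures in each year.

def heat_wave_severity(tmin_by_year):
    """tmin_by_year: dict mapping year -> list of daily minimum temps (deg C)."""
    annual_maxima = []
    for year, tmin in tmin_by_year.items():
        # All 3-day running means of nightly minima within this year
        windows = [sum(tmin[i:i + 3]) / 3.0 for i in range(len(tmin) - 2)]
        annual_maxima.append(max(windows))  # warmest 3-day event of the year
    return sum(annual_maxima) / len(annual_maxima)

# Toy example: two short "years" of nightly minima
example = {
    2004: [18.0, 19.0, 22.0, 25.0, 24.0, 20.0],
    2005: [17.0, 21.0, 26.0, 27.0, 23.0, 19.0],
}
severity = heat_wave_severity(example)  # mean of the two annual maxima: 24.5
```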
In the next few years, CCSM will be further expanded to include reactive troposphere chemistry, detailed aerosol physics and microphysics, comprehensive biogeochemistry, ecosystem dynamics and the effects of urbanization and land use change. These new capabilities will considerably expand the scope of earth system science that can be studied with CCSM and other climate models of similar complexity. Higher resolution is especially important for resolving mountains, river flow and coastlines. Full hydrological coupling, including ice sheets, is important for sea level changes. The model will include better vegetation and land surface treatments with ecological interactions, as well as carbon and other biogeochemical cycles.
For example, one of the carbon cycle methods being tested is based on microbe activity. There is a strong feedback between decomposition and plant growth: soil mineral nitrogen is the primary source of nitrogen for plant growth. Nitrogen-fixing bacteria and algae are very important; however, there are limited field and laboratory data on their role. It has been suggested that their nitrogen fixing can result in a shift from "carbon source" to "carbon sink" under a warming scenario.
The proposed DOE climate science Computational End Station (CES), to be set up at ORNL, will address Grand Challenge-scale problems in predicting future climate change under various energy options. It will use the CCSM for studies of model biases, climate variability, abrupt climate change, and global carbon and other chemical cycles, and will pursue high-resolution atmosphere and ocean studies.
The computer requirements for the next generation of comprehensive climate models can only be satisfied by major advances in computer hardware, software and storage. The classic problems climate models face on supercomputer systems are that the computers (with the exception of vector systems) are not balanced between processor speed, memory bandwidth and inter-processor communication bandwidth, including for global communication patterns. They are also more difficult to program and optimize; it is hard to get I/O out of the machines efficiently; and computer facilities need to expand archival data capability into the petabyte range. In addition, there is only a weak relationship between peak performance and performance on actual working climate model codes.
The major atmospheric research centers now have systems consisting of several thousand IBM P3/4/5 processors, up to a thousand Cray X1E vector processors, or several hundred NEC SX-6 and SX-8 vector processors. In each case, they can achieve about half a teraflops sustained, and sometimes one teraflops on certain application codes. The exception is the Earth Simulator in Japan, based on NEC SX-6 technology (5,120 processors), which delivers over 12 teraflops of sustained performance.
Thus, with sustained teraflops computing on the horizon and occasionally already on stream, meteorologists are moving from climate modeling to Earth System Modeling (ESM). This is because the feedback loops between the climate system and other relevant systems, such as ecology and the socio-economy, are not negligible; climate modeling is not possible without proper representation of these systems, hence ESM. Earth System Modeling is multi-scale (in time and space), multi-process and multi-topical (physics, chemistry, biology, geology, economics). It is both very compute- and data-intensive. Some practitioners claim it requires several orders of magnitude more computing power to tackle the problem, so petaflops and ultimately exaflops systems are eagerly awaited.
According to Tom Bettge, deputy director of the scientific computing division at NCAR, "A factor of 25 times the present NCAR computing resources is needed to accommodate CCSM requirements over the next two years, to prepare for the next IPCC assessment starting in 2007. How this deficiency is to be remedied is a great challenge. Although special architectures, like the IBM Blue Gene R&D system for protein folding, are delivering good results in their niche areas, this architecture is not suited to ESM, which needs a small number of fat nodes rather than the thousands of processors in the Blue Gene."
It was noted that despite many computing centers having IBM systems with 15-to-25 teraflops of peak performance, these sites are only delivering a few hundred gigaflops of sustained performance to user applications. Presently, CCSM is in the hundreds-of-gigaflops era; only Earth System Models running on the Japanese Earth Simulator have graduated to teraflops. This was aptly illustrated by the talk from Michel Desgagne of Environment Canada on the study of hurricane behavior. His simulations were performed on the ES and achieved 13 teraflops of sustained performance, using 495 compute nodes and 7 TB of memory. Each run took seven to eight days of wall clock time.
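The gap between peak and sustained performance quoted above is easy to quantify from the article's own figures: a few hundred gigaflops delivered from a 15-to-25 teraflops peak machine is an efficiency of only one to a few percent. The Earth Simulator figure used below, 64 gigaflops peak per node (8 vector CPUs at 8 gigaflops each), is an outside assumption rather than a number from the talk:

```python
# Peak-vs-sustained efficiency, mostly using figures quoted in the article.
# ASSUMPTION: 64 GF peak per Earth Simulator node (not stated in the talk).

def efficiency_pct(sustained_tflops, peak_tflops):
    """Sustained performance as a percentage of peak."""
    return 100.0 * sustained_tflops / peak_tflops

# A 20 TF-peak scalar system delivering 0.3 TF sustained:
cluster_eff = efficiency_pct(0.3, 20.0)   # 1.5 percent

# Desgagne's run: 13 TF sustained on 495 ES vector nodes.
es_peak = 495 * 0.064                     # ~31.7 TF peak, under the assumption
es_eff = efficiency_pct(13.0, es_peak)    # roughly 41 percent
```

Under these assumptions, the vector machine delivers well over an order of magnitude more of its peak to the application, which is exactly the balance argument made earlier in the article.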
Added Desgagne: "Recent studies have shown that very high resolution is essential to properly resolve waves that have direct impact on the intensification of hurricanes. In particular, innovative potential vorticity diagnostic tools were applied to diagnose inner spiral bands formed in explicitly simulated hurricanes. It was shown that wave-number one and two anomalies are in fact vortex Rossby waves that explain 40 percent to 50 percent of the wave activity in a period of 24 hours. These meso-vortices within the inner core of a hurricane are responsible for the dynamical processes controlling the redistribution of angular momentum, and numerical resolution of these vortices could help to more accurately predict the intensification of hurricanes."
For the vortex Rossby waves (VRWs), it was found that 6 km resolution was not good enough; 1 km resolution had to be used to get useful results. What has recently been identified is a Rossby wave train starting from the Indonesian area and moving across the oceans, causing bad weather. Using the ES, one achieves one hour of simulation for one hour of computation. To follow and analyze VRWs across the globe, one needs to attain a day's simulation for 10 minutes of computation. This translates to approximately 150 times more computing than is achieved on the ES, i.e., roughly 2 petaflops of sustained performance.
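The scaling estimate above follows directly from the quoted ratios, and the arithmetic is worth spelling out:

```python
# From the talk: the ES delivers one hour of simulation per hour of
# computation at 13 TF sustained. The target is one day of simulation
# per 10 minutes of computation.

es_sustained_tf = 13.0                 # teraflops sustained on the ES run
current_ratio = 1.0                    # simulated hours per compute hour today
target_ratio = 24.0 / (10.0 / 60.0)    # 24 sim-hours per 1/6 compute-hour = 144

speedup_needed = target_ratio / current_ratio           # ~144x, quoted as ~150x
required_pf = es_sustained_tf * speedup_needed / 1000.0  # ~1.9, i.e. ~2 petaflops
```

So the "approximately 150 times" and "~2 petaflops sustained" figures are consistent: 144 times 13 teraflops is about 1.9 petaflops.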
Several talks concentrated on projects implementing Earth System Modeling frameworks: ESMF in the U.S. and PRISM in Europe. PRISM has now moved from an R&D project to the status of a PRISM Support Initiative delivering a service. For example, PRISM will be the modeling environment at DKRZ for the IPCC fifth assessment.
During the last workshop in 2003, a strong emphasis was placed on data management and the challenges it entails. This time the emphasis was more on the power consumed by supercomputers, along with their footprint and facility space requirements, which are now of greatest concern.
Christopher Lazou, a frequent contributor to HPCwire, is managing director at HiPerCom Consultants Ltd.
Brands and names are the property of their respective owners.
Copyright: Christopher Lazou, HiPerCom Consultants Ltd., UK. September 2005.