November 25, 2005
This spring, for the first time, real-time forecasts running daily on PSC's LeMieux, a lead resource of the TeraGrid, correctly predicted the details of thunderstorms 24 hours in advance.
If anything is certain in 2005, not counting death and taxes, it's that we're at the mercy of forces we don't control. Despite incredible advances in understanding nature, leading to amazing technologies our forebears couldn't imagine, our planet still unleashes furious energies that devastate communities and lives.
Even before Katrina, U.S. losses from extreme weather such as hurricanes, floods, winter storms and tornadoes averaged $13 billion annually. The human cost of nearly 1,000 lives each year is incalculable. Would better forecasting make a difference? No doubt. More to the point, is better forecasting possible?
You bet, according to Kelvin Droegemeier, who directs the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma, Norman. Take thunderstorms, the nasty ones, with rotating updrafts -- called supercells. They surge across the Great Plains each spring with the potential to spawn deadly tornadoes. How much would it be worth to know six hours in advance -- instead of, as with current forecasting, a half-hour to an hour -- that one of these storms is headed your way? And not to have just an ambiguous "storm warning" but precise information about when and where it will hit, how severe it is, how long it will last?
"We want to be able to say that over Pittsburgh this afternoon at 3:30 there'll be a thunderstorm with 30 mile-per-hour wind, golfball-sized hail, two and a half inches of rain, and it will last ten minutes, and to give you that forecast six hours in advance," said Droegemeier.
Since the 1990s, CAPS has taken several strides toward demonstrating that, with sufficient data-gathering and computing resources, it's possible to do this. This spring, they took another stride. In a major, one-of-a-kind collaboration with NOAA (the National Oceanic and Atmospheric Administration), CAPS used resources of the NSF TeraGrid, in particular LeMieux, PSC's terascale system, to produce the highest-resolution storm forecasts yet attempted. On several occasions, CAPS predicted the occurrence of storms within 20 miles and 30 minutes of where and when they actually happened, and they did it 24 hours in advance.
"That type of result pretty much sets conventional thinking on its ear," said Droegemeier.
Are Thunderstorms Predictable?
In contrast to the daily weather reports on TV, which are generated from large-scale models that predict atmospheric structure over the continental United States, storm-scale forecasting involves a tighter focus -- at the scale of a county or city. It requires observational data such as temperature, pressure, humidity, wind speed and direction, and other variables -- at a corresponding finer spatial resolution, and it demands the most powerful computing available, and then some, to run the models.
When CAPS began in 1988, the prevailing view of storm-scale forecasting was skeptical. Numerical weather prediction itself was not in question. Since the 1970s, computers programmed with equations that represent the atmosphere and initialized with observational data had proven to be by far the best way to forecast weather. The question was more fundamental: Are thunderstorms predictable?
"The challenge we set ourselves to was, if you take the concept of computer forecast technology and apply it at this smaller scale, does the atmosphere possess any intrinsic fundamental predictability, or is at all turbulence," asked Droegemeier? "We had hopes, but we didn't know. With big help from the Pittsburgh Supercomputing Center, we resolved that question."
CAPS developed groundbreaking new techniques to gather atmospheric data from Doppler radar and to assimilate this data with other meteorological information. And they developed a computational model that uses this data to predict weather at thunderstorm scale.
"It all starts with observations, because to predict we need to know what's going on right now," said Droegemeier. Data to feed weather models comes from many sources including upper air balloons, the national Doppler radar network, satellites, and sensing systems on commercial airplanes. From these sources, a huge amount of information, computationally processed and spread across a 3D grid representing the atmosphere, becomes the initial conditions for National Weather Service forecasts. With grid spacing at 10 to 30 kilometers, the NWS operational models do well at showing high and low pressure areas and storm fronts that develop from them -- weather that happens, roughly speaking, on the scale of states. Individual thunderstorms originate at smaller scales and to forecast them, requires much finer spacing, down to at least one to two kilometers.
The foundation of a storm-forecast model is 15 to 20 non-linear differential equations. They represent the physical phenomena of the atmosphere and how it interacts with the surface of the Earth. To make a forecast involves feeding the 3D grid with initializing data, solving these equations at each position on the grid, and then doing it over again for the next time step, every five to ten seconds for 24 hours of weather. For a single forecast, this means solving trillions of equations. Each doubling of the number of grid points in 3D requires eight times more computing. If you also halve the time step, to capture corresponding finer detail in time, it's a 16-fold computing increase. For this reason and others, storm-scale forecasting poses an enormous computational challenge.
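The scaling arithmetic above -- cube the refinement factor for the 3D grid, then multiply by the same factor again for the shorter time step -- can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope calculation, not code from the CAPS model, and the function name is ours:

```python
def compute_scaling(refinement):
    """Relative computing cost when grid spacing is divided by
    `refinement` in all three spatial dimensions and the time step
    is shortened by the same factor."""
    spatial = refinement ** 3   # grid points grow as the cube in 3D
    temporal = refinement       # proportionally more time steps
    return spatial * temporal

# Doubling resolution in space alone: 2^3 = 8 times more computing.
# Doubling in space while also halving the time step: 2^4 = 16 times.
print(compute_scaling(2))  # 16
```

By this rough measure, refining a model's grid from 10-kilometer to 2-kilometer spacing (a factor of five) would multiply the computational work several hundredfold, which is why storm-scale forecasting demands terascale systems like LeMieux.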
Since 1993, CAPS has run forecasting experiments during spring storm season. In 1995 and '96, using PSC's CRAY T3D, a leading-edge system at the time, for a limited region of the Great Plains, they successfully forecast location, structure and timing of individual storms six hours in advance -- a forecasting milestone. For this accomplishment, CAPS and PSC won the 1997 Computerworld-Smithsonian award for science, and CAPS garnered a 1997 Discover Magazine award for technological innovation.
If there were lingering doubts that the question of storm forecasting has shifted from scientific and technological feasibility to national policy -- whether sufficient resources can be made available, and when -- this spring's storm forecast experiment should erase them.
As they have during many storm seasons over the past dozen years, CAPS and PSC this spring collaborated to produce real-time storm forecasts. The difference this year was that the forecasts covered two-thirds of the continental United States, from the Rockies east to the Appalachians. Using LeMieux, they successfully produced an on-time, daily forecast from mid-April through early June. "This was an unprecedented experiment that meteorologists could only dream of several years ago," said Droegemeier.
Conducted in collaboration with NOAA, the program included about 60 weather researchers and forecasters from several NOAA organizations -- the Storm Prediction Center and the National Severe Storms Laboratory, both in Norman, and the Environmental Modeling Center in Maryland -- and the NSF-sponsored National Center for Atmospheric Research in Boulder, Colorado along with CAPS.
This experiment offered an unprecedented chance for forecasters, as well as researchers, to work with advanced technology on a daily basis, technology that, according to Droegemeier, may be five years from being incorporated into daily forecast operations at the resolutions used. Each evening, meteorologists in Norman transmitted new atmospheric conditions to Pittsburgh. By the next morning, LeMieux had produced a forecast covering the next 30 hours and transmitted it back to SPC and NSSL in Norman, where researchers turned the model output data into images corresponding to what they see on radar. These model runs were conducted daily with virtually no problems.
Using several different versions of the Weather Research and Forecasting Model, an advanced model designed for research as well as operational use, the partners generated forecasts three times daily. EMC and NCAR used grid spacing of four to 4.5 kilometers. With LeMieux at its disposal, running on 1,228 processors, CAPS went a step further. With grid spacing of two kilometers, more than five times finer than the most sophisticated NWS operational model -- and requiring 300 times more raw computing power -- their forecasts are the highest-resolution storm forecasts to date.
"Our daily WRF-model forecasts had twice the horizontal resolution and nearly 50-percent greater vertical resolution than the other two experimental products," said Droegemeier. This higher resolution meant that the forecasts were able to capture individual thunderstorms, including their rotation. On several occasions, when the 24-hour forecast showed development of thunderstorms, it proved to be accurate within 20 miles and 30 minutes.
Just as importantly, the computer model produced images that matched well in structure with what forecasters saw later on radar. "The computer forecasts looked very similar to what we see on radar," said Steven Weiss, SPC science and operations officer. "The structure you see on the screen is important in judging whether the storm is likely to produce tornadoes, hail or dangerous wind. These results were an eye-opener in many respects."
"Real time daily forecasts over such a large area and with such high spatial resolution, have never been attempted before, and these results suggest that the atmosphere may be fundamentally more predictable at the scale of individual storms and especially organized storm systems than previously thought," said Droegemeier. Such results could potentially lead to a revision of classical predictability theory put forth by Edward Lorenz, the now retired MIT professor, whose pioneering research led to chaos theory. The forecasting community is still absorbing the findings, but it may mark a watershed in the understanding of atmospheric predictability.
For more information, including graphics, visit http://www.psc.edu/science/2005/storms/.