June 15, 2007
The French weather service Météo-France has announced that a new high performance computer has been installed at the French national center for weather forecasts in Toulouse. Five times more powerful than its predecessor, it will allow Météo-France to run a new forecast model and to conduct new research on climate change beginning in 2008.
On May 31, 2007, the NEC vector computer SX-8R officially entered operation. The NEC SX-8R installed at Météo-France has 32 nodes, each with eight processors. Each processor (2.2 GHz) can perform 35.2 billion computing operations (gigaflops) per second, which in total gives the Météo-France supercomputer a peak performance of roughly nine teraflops. Météo-France now has one of the fastest supercomputers in France.
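The peak figure follows directly from the numbers in the article; a minimal sketch of the arithmetic (32 nodes × 8 processors per node × 35.2 gigaflops per processor):

```python
# Peak performance implied by the article's own figures.
nodes = 32
procs_per_node = 8
gflops_per_proc = 35.2  # 2.2 GHz vector processor, per the article

peak_gflops = nodes * procs_per_node * gflops_per_proc
peak_tflops = peak_gflops / 1000

print(f"{peak_gflops:.1f} gigaflops = {peak_tflops:.2f} teraflops")
# 256 processors in total, giving about 9 teraflops of peak performance
```

Note that this is theoretical peak; sustained performance on real weather codes is lower.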
This new supercomputer will be indispensable for running the new AROME model as early as 2008. This model will allow more detailed forecasts and more precise predictions of hazardous meteorological events. One of the reasons Météo-France chose a vector computer from NEC is the reliable technology roadmap for scaling the system's performance over the next few years. It will allow Météo-France to raise its computing capacity as planned in the contract of objectives signed with the French government for the years 2005 to 2008. The increase in performance over the Fujitsu VPP5000, used by Météo-France since 2000, is a factor of 5 today and will rise to a factor of 21 in 2009.
Extreme weather causes higher expenses
Severe weather causes costs that better forecasts could help lower. Moreover, climate change threatens to drive those costs even higher. In June 2005, the Allianz Group, Germany's largest insurance company, and the ABI (Association of British Insurers) conferred on the issue of climate change. They estimated the annual property damage inflicted by extreme weather phenomena at 10 billion Euros, a sum expected to rise to approximately 27 billion Euros by the year 2080.
To anticipate these challenges, Météo-France designed a new weather model able to take into account additional observation data from critical local areas such as mountains, cities or particular vegetation zones. "The new computer will allow us to use operationally an advanced non-hydrostatic model with a mesh of 2.5 km," explained Pierre-Étienne Bisch, President-Director General of Météo-France. "This capability is strategic for us because it is necessary to improve our understanding of high-impact weather."
The accuracy of weather forecasting has benefited substantially from improvements in observation methods, weather models and computing capacity over the past years. Currently, a medium-range forecast of 8 to 10 days is regarded as reliable; about 2 days have been gained through the advancements of the last 20 years. Major quality improvements have also been achieved in short-term predictions. A review of Météo-France's forecasts concluded that the prediction for the next 24 hours is accurate with a probability of 90 percent, and local average temperatures are predicted with a deviation of only 1.5 degrees Celsius. But a detailed and exact weather forecast over a longer period is still impossible -- forecast accuracy drops to zero at about 15 days.
However, further improvements are expected in the coming years. With the help of new satellites like MetOp and refinements of the forecast models, an increase in medium-range forecasting accuracy of one day is within reach during the next decade. Equally important is the further improvement of short-term weather forecasts. The new AROME model of Météo-France will be supported by additional data from radar surveillance, in-situ observation networks and geostationary satellites such as the second-generation Meteosat.
Improving climate models and local forecasts by international cooperation
The researchers at Météo-France will also benefit from this supercomputer, using it to create new climate change simulations and to develop forecast models for operational applications from 2012 onward. In his speech at the inauguration event, Bisch emphasized the importance of cooperation between individual weather services and other organisations. He referred to contacts with the biggest weather services and research facilities in the world, such as the Earth Simulator (Japan), DWD (Germany) and the UK Met Office (United Kingdom).
"The requirements for computing power of weather services will keep growing," said Philippe Gire, Operations Director of NEC HPC Europe, during the inauguration at Météo-France. "Weather services cannot afford to spend more time to get a more detailed prediction. They have to get the calculations within minutes, not hours, so that meteorologists can interpret the results and issue their reports. With our supercomputers we help weather services across Europe to improve their models. Our fast vector SX-8R CPU makes it easier to exploit their real potential and thus to run even more complex models faster."
So reading tea leaves is a thing of the past. Already, the new computer at Météo-France raises one expectation: growing international cooperation and rising supercomputing performance will not only improve weather forecasts; climate research as a whole will take a big step forward internationally.