Supercomputer Predicts Flow of Toxic Water from Katrina


In the wake of Hurricane Katrina, scientists and research centers from around the country have come together to generate information on the contaminated floodwaters and offer it to hazardous materials experts and public health officials.

The University of North Carolina at Chapel Hill's Marine Sciences Program, the Renaissance Computing Institute (RENCI) and the National Center for Supercomputing Applications (NCSA) have played starring roles in the effort by providing rapid-response computing and modeling capability.

Floodwaters containing organic and chemical pollutants, such as sewage and oil, still cover swaths of Mississippi and Louisiana. To aid cleanup, researchers at the National Oceanic and Atmospheric Administration's Coast Survey Development Laboratory, along with UNC faculty, have been developing forecasts to predict the circulation of those foul waters.

Researchers, including marine scientists Richard Luettich and Brian Blanton from UNC's College of Arts and Sciences, have developed ADCIRC, a three-dimensional hydrodynamic code that models water levels and flow. Previously, the code was used largely for after-the-fact analyses of coastal circulation, but researchers now believe it can help produce answers during a crisis.
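
A hydrodynamic code like ADCIRC solves shallow-water-type equations for water surface elevation and currents over coastal grids. As a rough, hypothetical illustration of the kind of calculation involved (ADCIRC's actual finite-element formulation on unstructured coastal meshes is far more elaborate), the Python sketch below steps a toy one-dimensional linearized shallow-water model forward in time; the depth, domain size and initial surface hump are made-up values.

```python
# Toy 1D linearized shallow-water model: illustrative only, not ADCIRC.
# Tracks water surface elevation (eta) and depth-averaged velocity (u).
import numpy as np

g = 9.81          # gravitational acceleration (m/s^2)
H = 10.0          # assumed uniform water depth (m)
L = 100_000.0     # hypothetical domain length (m)
nx = 500          # number of grid cells
dx = L / nx
dt = 0.5 * dx / np.sqrt(g * H)   # CFL-limited time step for stability

x = np.linspace(0.0, L, nx)
eta = np.exp(-((x - L / 2) / 5000.0) ** 2)   # initial surface "hump" (m)
u = np.zeros(nx + 1)                          # velocities at cell faces

for step in range(2000):
    # momentum: accelerate water down the surface-elevation gradient
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity: raise or lower the surface where flow converges or diverges
    eta -= H * dt / dx * (u[1:] - u[:-1])

print("max surface elevation after run: %.3f m" % eta.max())
```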

Marine science professor Luettich and assistant research professor Blanton knew that simulating the required 60 days of water velocity and water surface elevation would demand more computational power than the university possessed. So, building on their NOAA-funded collaboration with RENCI, they asked UNC's Daniel A. Reed for help in establishing a web-accessible computational system for rapid-response forecasting during severe weather.

"If we had a month to do these runs, we could do them on our desktop computers or on a small cluster," Blanton said. "But to do it literally overnight requires some horsepower."

Reed, former director of NCSA, connected Blanton and Luettich with the National Science Foundation-supported supercomputing center at the University of Illinois at Urbana-Champaign. Using Tungsten, NCSA's Xeon-based parallel computer, the researchers completed their computational runs in about 15 hours.

"This is a prelude to the capabilities RENCI and the University of North Carolina at Chapel Hill will provide to North Carolina, as we deploy our own large-scale computing infrastructure and continue to build disaster-response collaborations with North Carolina experts," Reed said. "With state support, we are now building world-class capability for interdisciplinary research, technology transfer, economic development and engagement across North Carolina."

CSDL researchers, with assistance from Luettich and Blanton, are integrating information provided by the computational calculations with NOAA's North American Mesoscale Model, the primary weather forecasting model used by the National Weather Service to simulate wind speed, direction and other weather factors. Their goal is to provide daily forecasts of coastal circulation and pollutant concentrations in the Katrina-affected region. This information is vital as cleanup and recovery efforts continue.
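
One common way a coastal circulation model takes in winds from an atmospheric forecast model is to convert forecast 10-meter wind speed and direction into a surface stress using a bulk drag formula. The sketch below illustrates that general idea only; the drag law, constants and function shown are illustrative assumptions, not the actual NAM-to-ADCIRC coupling used by CSDL.

```python
# Hedged sketch: turn forecast winds into surface wind stress for a
# circulation model.  Constants are typical textbook values.
import numpy as np

RHO_AIR = 1.225    # air density (kg/m^3)

def wind_stress(speed_ms, direction_deg):
    """Return (tau_x, tau_y) surface wind stress in N/m^2.

    speed_ms:       10-m wind speed from the weather model (m/s)
    direction_deg:  meteorological direction the wind blows FROM (degrees)
    """
    # wind-speed-dependent drag coefficient (Garratt 1977 style), capped
    cd = np.clip(0.001 * (0.75 + 0.067 * speed_ms), 0.001, 0.003)
    # convert "blowing from" to the math angle the air moves toward
    theta = np.deg2rad(270.0 - direction_deg)
    wx, wy = speed_ms * np.cos(theta), speed_ms * np.sin(theta)
    tau_x = RHO_AIR * cd * speed_ms * wx
    tau_y = RHO_AIR * cd * speed_ms * wy
    return tau_x, tau_y

# example: a 20 m/s wind blowing from the southeast (135 degrees)
print(wind_stress(20.0, 135.0))
```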

The two professors also have extended their work with RENCI, Reed and colleagues to analyze various aspects of Hurricane Rita and its effects on Texas and Louisiana.

"We are trying to be prepared and generate reliable information that the hazardous materials experts will need to have," said CSDL scientist Jesse Feynen. "We're doing that, and we're doing it quickly."
